🌐
CGDirector
cgdirector.com › home › benchmarks › redshift benchmark results (updated)
Redshift Benchmark Results (Updated) - CG Director
April 11, 2024 - Extensive Redshift Benchmark Results List with all modern GPUs, Operating Systems, CPUs and Multi-GPU-Setups. Find the best performing GPU for Redshift.
🌐
Maxon
help.maxon.net › r3d › maya › en-us › Content › html › The+redshiftBenchmark+tool.html
The redshiftBenchmark tool
Redshift at that point will attempt ... print something like this: Result: Redshift 3.0.30 (Windows) CPU: 32 threads, 3.39 GHz, 63.88 GB GPU(s): [GeForce RTX 2080 Ti 11 GB 0.027ms (RTX ON)], [GeForce RTX 2080 Ti 11 GB 0.017ms (RTX ON)] Time: 00h:02m:28s...
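The result block quoted above is flattened into a single line here, but the fields are easy to pull apart if you want to compare many runs. A minimal sketch in Python, assuming the "Time: 00h:02m:28s" field appears as quoted (the exact output layout can vary between Redshift versions):

```python
import re

# Sample output as quoted in the Maxon snippet above (flattened to one line).
sample = (
    "Result: Redshift 3.0.30 (Windows) "
    "CPU: 32 threads, 3.39 GHz, 63.88 GB "
    "GPU(s): [GeForce RTX 2080 Ti 11 GB 0.027ms (RTX ON)], "
    "[GeForce RTX 2080 Ti 11 GB 0.017ms (RTX ON)] "
    "Time: 00h:02m:28s"
)

def benchmark_seconds(text: str) -> int:
    """Extract the total render time from a redshiftBenchmark result block."""
    m = re.search(r"Time:\s*(\d+)h:(\d+)m:(\d+)s", text)
    if m is None:
        raise ValueError("no 'Time:' field found in benchmark output")
    hours, minutes, seconds = map(int, m.groups())
    return hours * 3600 + minutes * 60 + seconds

print(benchmark_seconds(sample))  # 148 seconds for the quoted run
```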
Discussions

New Nvidia generation for rendering
CUDA cores – RTX 3090: 10,496 · RTX 4090: 16,384 · RTX 5090: 21,760 · RTX 3080: 8,960 · RTX 4080: 9,728 · RTX 5080: 10,752. This should be a pretty good indicator of what to expect (see the quick ratio check after this entry). More on reddit.com
🌐 r/RedshiftRenderer
14
13
January 12, 2025
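Taken at face value, the core counts quoted in that post imply a much smaller generational uplift than the roughly 2x gaming numbers discussed further down. A quick back-of-the-envelope check (Python, using only the counts quoted in the post; real Redshift scaling also depends on clocks, memory bandwidth and RT-core changes):

```python
# CUDA core counts as quoted in the Reddit post above.
cuda_cores = {
    "RTX 3090": 10496, "RTX 4090": 16384, "RTX 5090": 21760,
    "RTX 3080": 8960,  "RTX 4080": 9728,  "RTX 5080": 10752,
}

# First-order hint only: ratio of core counts between successive generations.
for older, newer in [("RTX 4090", "RTX 5090"), ("RTX 4080", "RTX 5080")]:
    gain = cuda_cores[newer] / cuda_cores[older] - 1
    print(f"{newer}: {gain:.0%} more CUDA cores than the {older}")

# RTX 5090: 33% more CUDA cores than the RTX 4090
# RTX 5080: 11% more CUDA cores than the RTX 4080
```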
Redshift on new M4's?
I have been a Mac user for 20+ years and love working on Redshift. There’s a bit of false info in some of these comments. Apple Silicon does do GPU rendering, not CPU rendering. More on reddit.com
🌐 r/RedshiftRenderer
46
6
November 2, 2024
🌐
CG Channel
cgchannel.com › 2017 › 06 › redshift-gets-new-gpu-benchmarking-utility
Redshift gets new GPU benchmarking utility | CG Channel
It’s also interesting to compare results from workstation and gaming cards: at the minute, the fastest single and dual-GPU scores are from Nvidia’s top-of-the-range workstation card, the Quadro GP100. However, the much less expensive Titan X Pascal and GeForce GTX 1080 Ti gaming cards aren’t far behind: the difference in performance looks to be in the range of 5-15%. And obviously, given that Redshift is a CUDA renderer, there are no scores for AMD cards. Availability: the benchmark utility is available as part of the latest Redshift 2.5.19 alpha builds.
🌐
iRender
irendering.net › redshift-benchmark-results-2022-irender
Redshift Benchmark Results (2022) - iRender
June 14, 2022 - According to the CGDirector test, we got a surprising result: four RTX 3090 GPUs (Ampere architecture) outperforming eight RTX 2080 Ti GPUs – the previous Turing generation of NVIDIA GeForce graphics cards.
🌐
Legit Reviews
legitreviews.com › redshift benchmark gpu render times with geforce rtx 2070, 2080 & 2080 ti
Redshift Benchmark GPU Render Times with GeForce RTX 2070, 2080 & 2080 Ti - Legit Reviews
October 26, 2018 - Without further ado, the Redshift 2.6.23 benchmark render times on our test system: performance scaling on the GPUs that we tested looks great.
🌐
Legit Reviews
legitreviews.com › redshift v3.0.22 benchmarks with hardware-accelerated gpu scheduling
Redshift v3.0.22 Benchmarks With Hardware-Accelerated GPU Scheduling - Legit Reviews
July 10, 2020 - Today we are armed with a Redshift 3.0 license and will be using the built-in benchmark scene in Redshift v3.0.22 to test nearly all of the current GeForce GTX and RTX offerings from NVIDIA.
🌐
Techgage
techgage.com › article › maxon-redshift-3-5-gpu-cpu-rendering-performance
Maxon Redshift 3.5 GPU & CPU Rendering Performance – Techgage
July 7, 2022 - If a tested NVIDIA GPU includes RT cores, then the Redshift benchmark will automatically opt to use them. This is a good thing, of course, as those cores help greatly accelerate ray tracing workloads, which includes 3D rendering.
🌐
iRender
irendering.net › tag › redshift-gpu-benchmark
redshift gpu benchmark Archives | iRender Cloud Rendering Service
GPU cloud rendering farm for Blender, Redshift, Maya, and more. Powered by RTX 4090. From $9/hour.
🌐
Techgage
techgage.com › article › maxon-cinebench-2024-redshift-gpu-rendering-performance
Maxon Cinebench 2024 & Redshift GPU Rendering Performance – Techgage
October 2, 2023 - Four years ago, Maxon acquired Redshift, and fast-forward to today, it’s Redshift that acts as the backend for Cinebench’s new GPU rendering test. In this article, we’re going to post performance results from our Cinebench testing, as well as include performance for two separate Redshift scenes, tested through Cinema 4D 2024. While we love the simplicity of Cinebench, we wish it was designed to test more than one scene, because not all scene renders will scale the same way across the board. Our Redshift results will help pad things out. All of the benchmarking conducted for this article was completed using an up-to-date Windows 11 (22H2), the latest Intel chipset driver, as well as the latest (as of the time of testing) graphics driver.
🌐
Reddit
reddit.com › r/redshiftrenderer › new nvidia generation for rendering
r/RedshiftRenderer on Reddit: New Nvidia generation for rendering
January 12, 2025 -

Hi!

Just like everyone else, I’m curious to see what kind of performance boost the new-gen Nvidia cards will provide for rendering. The only benchmarks so far are gaming ones, showing an almost 2x boost in fps in Cyberpunk at 4K (which is insane). But my concern is whether that boost is mainly driven by the DLSS 4 tech using AI for higher frame rates.

Basically, I’m wondering if anyone has done a deep dive into the specs or has any early hands-on experience with these cards and knows whether the boost for rendering will be as massive as for gaming.

🌐
Cinebench
cinebench.net › home
Cinebench 2024, R23, R20, R15 - All Versions [Download Now]
January 18, 2026 - Cinebench is Maxon’s free, industry-standard benchmark that measures the real-world performance of your CPU and GPU using the powerful Cinema 4D rendering engine. Trusted by reviewers, PC builders, and creative professionals, it provides repeatable, cross-platform scores for single-core, multi-core, and GPU (Redshift) workloads.
🌐
Puget Systems
pugetsystems.com › home › hardware articles › redshift: nvidia geforce rtx 40 series performance
Redshift: NVIDIA GeForce RTX 40 Series Performance | Puget Systems
November 15, 2023 - As we can see, the NVIDIA GeForce RTX 40 Series GPUs are currently the fastest GPUs available for rendering in Redshift. The newest release, the RTX 4070 Ti 12GB, is faster than even the fastest GPU from the previous generation RTX 30 Series.
🌐
iRender
irendering.net › home › blog › gpu recommendations for redshift 2025
GPU Recommendations for Redshift 2025 | Redshift render farm
May 13, 2025 - The powerful capabilities of modern GPUs enable quick rendering times and allow for greater complexity in projects. In this blog, iRender will explore key GPU recommendations for Redshift in 2025, focusing on specifications, performance benchmarks, and the specific needs of different users.
🌐
OpenBenchmarking.org
openbenchmarking.org › test › pts › redshift
RedShift Demo Benchmark - OpenBenchmarking.org
September 12, 2020 - You must manually download the ... registration at RedShift3d.com. To run this test with the Phoronix Test Suite, the basic command is: phoronix-test-suite benchmark redshift....
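For reference, a minimal Python wrapper around the basic command quoted in that snippet; it assumes phoronix-test-suite is installed and on PATH, and that the Redshift demo has already been downloaded manually after registration (the full test identifier may carry a pts/ prefix or version suffix):

```python
import subprocess

# Run the RedShift demo benchmark via the Phoronix Test Suite, using the
# basic command quoted in the OpenBenchmarking snippet above.
result = subprocess.run(
    ["phoronix-test-suite", "benchmark", "redshift"],
    capture_output=True,
    text=True,
)

# Print whatever the suite reported; the suite also saves results itself.
print(result.stdout)
if result.returncode != 0:
    print(result.stderr)
```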
🌐
CGDirector
cgdirector.com › home › benchmarks
Benchmarks - CGDirector
The new version of Cinebench, 2024, benchmarks CPUs and GPUs using the popular Redshift render engine.
🌐
Puget Systems
pugetsystems.com › home › solutions › rendering › redshift › hardware recommendations
Hardware Recommendations for Redshift | Puget Systems
September 11, 2025 - Better multi-GPU support – thanks to the use of blower-style cooling systems and more constrained power consumption · ECC memory on higher-end models – for increased stability · Yes! Redshift scales very well with multiple cards and can significantly improve your render times.
🌐
Puget Systems
pugetsystems.com › home › hardware articles › redshift 3.0 – nvidia geforce rtx 3080 & 3090 performance
Redshift 3.0 - NVIDIA GeForce RTX 3080 & 3090 Performance | Puget Systems
September 25, 2020 - RTX 3090 24GB – This card is about 15% faster for rendering, with 140% more onboard memory and support for NVLink. That means it will be much better suited to working with large scenes and detailed geometry.
🌐
iRender
irendering.net › top-gpu-for-redshift-octane-and-v-ray-in-2023
Top GPU for Redshift, Octane and V-Ray in 2023 | Redshift render farm
May 12, 2023 - It’s safe to say that the RTX 3090 ... VRAM. According to the tests above, we can clearly say that the RTX 4090 is the best GPU for Redshift, Octane and V-Ray rendering....
🌐
Facebook
facebook.com › groups › RedshiftRender › posts › 927999540982590
Anyone know How can I find Redshift benchmark ...
🌐
Reddit
reddit.com › r/redshiftrenderer › redshift on new m4's?
r/RedshiftRenderer on Reddit: Redshift on new M4's?
November 2, 2024 -

Long story short, at my job we render on an M1 Max 64GB MBP. It's slow and unsustainable for rendering final sequences within the turnaround time we need.

I've been pushing them to look into getting a Windows build with RTX 4090s if they want to see a real, tangible difference in render times and get the most out of Redshift, since it's CUDA-based and Apple Silicon isn't.

They were open to pricing one out until the new M4s were announced. Now the higher-ups just want to go with the new M4s because "Mac is what we've always used".

If we get them, we're stuck with them for a while.

Will the M4 be comparable to a typical Windows+NVIDIA RTX build for Redshift when rendering out final image sequences?

The M1 Maxes have been awful in terms of final-frame render time, and they end up taking way too long to render sequences for the turnaround time we need in order to work efficiently.

I'm resistant to continuing in the Mac ecosystem for rendering out of Redshift. Apple Silicon is great for AE, editing, and Photoshop, but GPU rendering is its kryptonite.

Will the M4s be trash compared to a proper Windows build, or will they be better? If they are at least equivalent to a proper Windows build, great. If not, it seems like a waste of money and time.

Top answer
1 of 23
7
Keep pushing for the PC rig. Work on the scene on the MacBook and jump in with Parsec on the render rig. Argue about upgradability: you can easily add a second GPU in the future. You can also use your MacBook longer, since it does not need to be as powerful or big. Multiple artists can push renders to the PC from their mobile workstations. This is the way.
2 of 23
6
I have been a Mac user for 20+ years and love working on Redshift. There’s a bit of false info in some of these comments. Apple Silicon does do GPU rendering, not CPU rendering. Apple’s CUDA equivalent is Metal, and Redshift now natively supports this. The M3 series saw a big improvement in Redshift benchmark scores; the M1 was terrible - so you may be pleasantly surprised. Nobody has posted M4 scores yet, so it’s hard to say how they compare - but the M3 Max was somewhere around a 3070-3080 (Cinebench scores - scroll to GPU results). So it’s not the greatest, but it’s a laptop. Personally speaking, I love being able to get these sorts of results when working on a portable computer, but when I need serious power I’m still going to be sending to an Nvidia rig. So what am I saying? Like others here, it sounds like you need a render rig, regardless of what computers you actually do the work on. I think you should sell it in to management as a separate server - especially given they seem weirdly hell-bent on dictating what hardware you use day to day. Something like this might scare them in terms of budget, but you could always build something yourself for less.