🌐
Maxon Knowledge Base
support.maxon.net › hc › en-us › articles › 1500006456521-CPU-GPU-and-Hardware-Recommendations-for-Redshift
CPU, GPU, and Hardware Recommendations for Redshift – Knowledge Base
March 4, 2026 - If the CPU will be driving four or more GPUs or batch-rendering multiple frames at once, a higher-performance CPU such as the Intel Core i7 is recommended. ... CPUs with operating frequencies of 3.5GHz or more. It’s better to have a CPU with fewer cores but more GHz than more cores and low GHz (e.g. an 8-core 2.5GHz CPU will be considerably worse for Redshift than a 6-core 3.5GHz CPU).
🌐
Maxon Knowledge Base
support.maxon.net › hc › en-us › articles › 23810462477852-Understanding-how-Redshift-GPU-CPU-licenses-work-with-your-personal-workstation-and-render-farm-nodes
Understanding how Redshift GPU & CPU licenses work with your personal workstation and render farm nodes – Knowledge Base
November 24, 2025 - Any extra machine running Team Render Client or the Commandline Renderer requires its own separate Redshift license to use Redshift GPU. Redshift CPU is included with Team Render Client and Commandline Renderer and does not require an additional license.
Discussions

Redshift is rendering slow and seems to use more cpu power than gpu - Deadline - AWS Thinkbox Discussion Forums
The render times seem very slow compared to what it could render from one PC just using Maya and Redshift. Another odd thing is that my CPUs are hitting about 85-90% usage and the GPUs are around 15-20%. Shouldn’t that be the other way around using Redshift?
🌐 forums.thinkboxsoftware.com
February 20, 2019
Is Redshift CPU mode only good enough for rendering in cinema4D?
Redshift's CPU mode is not meant to be used as a renderer on its own. It's a free boost to the GPU capabilities, and a really poor one at that. Its main purpose is to give you a free way to render with Redshift that is bundled with C4D, so they can rope you into a subscription for Redshift. If CPU is all you have, then Arnold would be a better choice I think, but you should get a trial and compare both TBH.
🌐 r/RedshiftRenderer
January 23, 2023
Redshift only using 10% of GPU ?
1st: How much memory you dedicate to your render engine doesn't influence the level of GPU utilization. 2nd: The CPU also has an impact on your GPUs. For example, I have 4×1080 Ti and 2×Xeon 2696v3 with 72 threads, but only 2.8GHz each. Another guy also has 4×1080 Ti, but an ordinary i7 with 16 threads at 4GHz turbo each. His render times are 10-20% shorter. 3rd: Scalability of video cards differs between render engines. For example, Octane uses 90-100% of every GPU in my rig, while Redshift only uses 50-60%. Hope that helps. But in the end, you're right: only 10% is weird.
🌐 r/RedshiftRenderer
June 6, 2018
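The low-utilization reports above are easiest to diagnose by actually sampling GPU load while a render runs. A minimal sketch using NVIDIA's `nvidia-smi` CLI (assumes an NVIDIA driver is installed; `sample_gpu_utilization` and `parse_utilization` are illustrative helpers, not part of any Redshift API):

```python
import subprocess

def parse_utilization(csv_output: str) -> list:
    """Parse the output of
    'nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits'
    into a per-GPU utilization list (percent)."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def sample_gpu_utilization() -> list:
    """Query the current utilization of every NVIDIA GPU in the system."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_utilization(out)

# Example: parsing a captured two-GPU sample
# parse_utilization("13\n87\n") -> [13, 87]
```

Polling this in a loop during a render makes it clear whether the bottleneck is the GPU itself or CPU-side work such as scene extraction.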
[Redshift] CPU checked under Compute Devices?
It’s just a feature they added a few versions back as an option for those who don’t have a GPU. If you have a GPU there is no scenario where it helps you. Just leave it off.
🌐 r/Cinema4D
April 23, 2023
🌐
Puget Systems
pugetsystems.com › home › solutions › rendering › redshift › hardware recommendations
Hardware Recommendations for Redshift | Puget Systems
September 11, 2025 - Redshift is a fully GPU-based rendering engine. This means that the video cards (or GPUs) in your system are what impacts how long renders take to complete, rather than the CPU.
🌐
Maxon
help.maxon.net › r3d › maya › en-us › Content › html › Redshift+CPU+Rendering.html
Redshift CPU
Lists the available GPU and CPU hardware and controls which devices are enabled for rendering in Redshift.
🌐
iRender
irendering.net › home › blog › redshift cpu rendering: should we use it?
Redshift CPU rendering: Should we use it? | Redshift render farm
January 26, 2024 - This Redshift version only supports CPU rendering. If you purchase Redshift and install it as a separate product or get it as part of the MAXON One subscription, Redshift can also use the GPU(s) or even CPU and GPU together.
🌐
iRender
irendering.net › home › blog › how redshift leverages hardware: gpu, cpu & memory requirements explained
How Redshift leverages hardware: GPU, CPU & Memory requirements explained
November 22, 2023 - The downside of TCC is that once enabled, the GPU becomes “invisible” to Windows and 3D apps like Maya, Houdini, etc., and only works with CUDA apps like Redshift. Only Quadro, Tesla, and Titan GPUs support TCC; GeForce GTX cards cannot use it. TCC is only useful on Windows – Linux does not need it, as the Linux display driver does not have the communication latency issues of WDDM on Windows. In other words, CPU-GPU communication is faster by default on Linux than on Windows (with WDDM) across all NVIDIA GPUs, including GeForce and Quadro/Tesla/Titan GPUs.
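The TCC constraints described in this snippet reduce to a simple rule: Windows only, and only Quadro/Tesla/Titan-class cards. A hypothetical helper encoding that rule (the substring checks are illustrative, not an exhaustive product list):

```python
def tcc_applicable(os_name: str, gpu_name: str) -> bool:
    """Rough check: is NVIDIA's TCC driver model an option for this GPU?

    Encodes the constraints from the snippet above: TCC is Windows-only
    (Linux has no WDDM latency issue), and only Quadro/Tesla/Titan-class
    cards support it. GeForce cards fall through to False.
    """
    if os_name.lower() != "windows":
        return False  # Linux neither needs nor offers TCC
    gpu = gpu_name.lower()
    return any(family in gpu for family in ("quadro", "tesla", "titan"))

# tcc_applicable("Windows", "Quadro RTX 5000")  -> True
# tcc_applicable("Linux", "Tesla V100")         -> False (not needed on Linux)
# tcc_applicable("Windows", "GeForce GTX 1080") -> False
```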
🌐
Maxon Knowledge Base
support.maxon.net › hc › en-us › articles › 1500006456521-Does-Redshift-depend-on-CPU-performance
Does Redshift depend on CPU performance? – Knowledge Base
September 7, 2023 - Since Redshift is a GPU renderer, it mostly depends on GPU performance. There are, however, certain processing stages that happen during rendering which are dependent on the performance of the CPU, disk or network. These include extracting mesh data from your 3d app, loading textures from disk and preparing the scene data for use by the GPU.
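The CPU-bound stages this article lists (mesh extraction, texture loading from disk, scene preparation) can be made visible by timing each stage separately. A toy sketch, with stand-in workloads rather than real Redshift calls:

```python
import time
from contextlib import contextmanager

@contextmanager
def stage(name, timings):
    """Time one rendering stage; CPU-bound prep stages show up here
    even in a GPU renderer like Redshift."""
    t0 = time.perf_counter()
    yield
    timings[name] = time.perf_counter() - t0

timings = {}
with stage("extract_meshes", timings):
    sum(i * i for i in range(100_000))  # stand-in for CPU-side scene extraction
with stage("load_textures", timings):
    time.sleep(0.01)                    # stand-in for disk I/O
# timings now maps each stage name to elapsed seconds
```

If the per-stage times dominate the total frame time, a faster CPU or disk will help more than a faster GPU.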
🌐
Techgage
techgage.com › article › maxon-redshift-3-5-gpu-cpu-rendering-performance
Maxon Redshift 3.5 GPU & CPU Rendering Performance – Techgage
July 7, 2022 - Since the last time we took an in-depth look at Redshift performance, the engine gained CPU support, so we couldn’t help but give that a quick test. What we found was kind of expected: GPUs are still the go-to for the fastest, and most efficient performance.
🌐
CGDirector
cgdirector.com › home › benchmarks › redshift benchmark results (updated)
Redshift Benchmark Results (Updated) - CG Director
April 11, 2024 - Extensive Redshift Benchmark Results List with all modern GPUs, Operating Systems, CPUs and Multi-GPU-Setups. Find the best performing GPU for Redshift.
🌐
Graphisoft Community
community.graphisoft.com › english › forum › visualization
Redshift is not using the GPU - a solution - Graphisoft Community
January 3, 2024 - When I changed computer and upgraded to AC27, my Redshift render scenes suddenly became very slow. When monitoring the CPU and GPU usage I could see that Redshift wasn't using the GPU at all. This was strange, since Redshift renders on the graphics card and not the CPU.
🌐
CGDirector
cgdirector.com › home › 3d animation & rendering › redshift system requirements & pc-recommendations
Redshift System Requirements & PC-Recommendations
November 4, 2023 - For a new build, a CPU like the Intel i5-13600K or Ryzen 5 7600X would be perfect. It has great single-core performance for Redshift and plenty of cores for other applications that might actually use them. If you then find yourself craving even more power, you can easily pop in an i7 or i9 down the road as well. But here’s the caveat on multi-GPU rigs: CPUs with high single-core performance are ones meant for mainstream consumers, mostly gamers.
🌐
Reddit
reddit.com › r/cinema4d › [redshift] cpu checked under compute devices?
r/Cinema4D on Reddit: [Redshift] CPU checked under Compute Devices?
April 23, 2023 -

Hi everyone,

I built a new rig to start using Redshift more often, and when I started using it I noticed it was quite sluggish and slow. Mouse lag whenever I was adding a new light, the RS preview got stuck, and I was unable to continue using it without restarting C4D... proper crap performance. I noticed my CPU (13900K) had 100% usage in the Task Manager, but the GPU (a 4080) was rather unused.

I just unchecked the CPU under the Redshift Compute Devices in the C4D preferences and it started working like a charm. I'm really happy it seems to have solved the issue, but my question is: AFAIK Redshift is quite GPU-centric, correct? Why would the CPU be checked by default, given CPU render engines operate differently? Is this a case of Maxon doing weird stuff, or is keeping the CPU on necessary?

Thanks! Sorry if my question is a very newbie one

🌐
Cinebench
cinebench.net › home
Cinebench 2024, R23, R20, R15 - All Version [Download Now]
January 18, 2026 - Measure per-core responsiveness and full-thread throughput to compare CPUs fairly, validate boosts, and identify scheduler or throttling issues. Cinebench 2024 adds a Redshift-based GPU test to profile modern graphics performance for rendering, previews, and content creation pipelines.
🌐
80.lv
80.lv › articles › redshift-3-5-released
Redshift 3.5 Released
April 22, 2022 - Maxon launched Redshift 3.5 – an update to its GPU-accelerated renderer. Previously GPU-only, the new version's Redshift CPU and Redshift XPU now allow rendering on the CPU or the CPU and GPU at the same time.
🌐
Workstation Specialists
workstationspecialist.com › home › solutions › rendering › recommended computer workstation for redshift
Recommended Computer Workstation For Redshift - Workstation Specialists Ltd
January 12, 2026 - Key: high-clock CPU + professional GPU + sufficient RAM + NVMe storage = smooth performance. ... Unleashing the true power of GPU rendering has become a pursuit of many creators, driven by the great leap in GPU capabilities over the last ten years. A highly optimised workstation, tuned for a GPU-accelerated render engine like Redshift...
🌐
Reddit
reddit.com › r/cinema4d › question about redshift cpu & gpu
r/Cinema4D on Reddit: Question about Redshift CPU & GPU
February 21, 2023 -

Hello. I'm working on a scene that's pretty heavy render-time-wise. Right now it sits at 19 minutes per frame. I'm using Redshift GPU with my RTX 2060 SUPER. I know it's a pretty low-end card, but I recently upgraded my CPU to a Ryzen 5950X and I was wondering if Redshift CPU would be a better solution? I don't think there's much more to do to optimize the scene; before my optimization it was nearly 2 hours a frame :). Will the scene work "plug&play" with Redshift CPU, or do I need to alter it? Thanks in advance.

🌐
CGDirector
cgdirector.com › home › 3d animation & rendering › best hardware for gpu rendering in octane – redshift – vray (updated)
Best Hardware for GPU Rendering in Octane - Redshift - Vray (Updated)
October 6, 2023 - If you just need a quick ... and can recommend them without hesitation. ... To use Octane and Redshift you will need a GPU that has CUDA-Cores, meaning you will need an NVIDIA GPU....