News: How Much CPU Does the GeForce RTX 3080 Need?

SpitfireMk3

Honorable
Aug 14, 2014
5
0
10,520
1
Thank you for this great analysis. Obviously you have to consider all aspects before upgrading and I'm glad that I can still use my PC for a while before being forced to upgrade. Seeing the 9 series with other RAM outperform the 10 in some tests, how much difference do you think we'll see when DDR5 hits the scene?
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
500
388
560
0
I'm curious why the i5-10600K wasn't part of this test. In this article you list it as your top pick for gaming PCs, and I would have liked to see how it compares to the i9 at high/ultra settings.
The biggest reason: I don't have one. Paul has a 10600K, as he's the CPU guy. And I bought the i3-10100 because I was more interested in seeing what happened to a 4-core/8-thread modern Intel CPU (and it was only $130 with tax). I might have an i7-8700K around that I could test, but I've been running a ton of benchmarks for the 3080 launch, and still have lots of things to take care of, so probably it's not happening any time soon.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
500
388
560
0
Thank you for this great analysis. Obviously you have to consider all aspects before upgrading and I'm glad that I can still use my PC for a while before being forced to upgrade. Seeing the 9 series with other RAM outperform the 10 in some tests, how much difference do you think we'll see when DDR5 hits the scene?
DDR5 will probably be like the DDR3/DDR4 rollout: not much faster at first, but over time the increased bandwidth becomes more important. A lot will depend on the CPU architecture it's paired with of course, and we don't have Zen 4 or Alder Lake yet. :)
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
500
388
560
0
Christ, so even modern CPUs are incapable of keeping up with the 3080 at 1080 ultra.
That's a 25% improvement in FPS for the 9900K going to 1080 medium (daheck?! did you run them at stock with TDP enforced?)
It's 25% faster at 1080p medium because both the CPU and GPU are doing less work overall. There are a few games where the ultra vs. medium stuff still hits similar speeds, but you'd need significantly more CPU performance to avoid at least a moderate drop in performance. There are games where 1080p ultra is almost completely CPU limited (MS Flight Simulator for sure), others where the graphics complexity is high enough that they're only partially CPU limited, and still others where they're basically GPU limited even at 1080p. Enabling ray tracing in most games pushes them into the last category.
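A toy way to see the medium-vs-ultra behavior described here: frame rate is set by whichever of the CPU or GPU takes longer per frame, so dropping settings shrinks both workloads and can still raise FPS even when the game ends up CPU limited. A minimal sketch (all per-frame timings below are invented for illustration, not measured values from the article):

```python
def fps(cpu_ms, gpu_ms):
    # The slower of the two pipelines sets the frame rate.
    return 1000 / max(cpu_ms, gpu_ms)

# Invented per-frame costs for one hypothetical game on an RTX 3080:
#          (cpu_ms, gpu_ms)
ultra  = (8.0, 9.5)   # mostly GPU limited at 1080p ultra
medium = (7.0, 5.5)   # lower settings shrink both; now CPU limited

print(f"1080p ultra:  {fps(*ultra):.0f} fps")
print(f"1080p medium: {fps(*medium):.0f} fps")
```

Even though the medium case is CPU limited, FPS still improves because the CPU's per-frame work dropped too, which matches the "both the CPU and GPU are doing less work overall" point above.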

As for TDP, the 10900K TDP is very loosely defined, and the MSI board almost certainly runs it with PL2 constraints rather than restricting it to 125W. For games, it's not coming close to 250W, but it might be breaking 125W. The same applies to the other CPUs to varying degrees. I don't have proper CPU power testing equipment, however, so at best I could estimate it as outlet power minus GPU power, which I don't have time to get done right now.
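For what it's worth, the outlet-power-minus-GPU-power estimate can be sketched in a few lines. Everything here is an assumption for illustration: the PSU efficiency, the rest-of-system draw, and the sample readings are made up, not measurements from the article:

```python
# Rough CPU power estimate from wall-outlet measurements, as a sanity check.
# All numbers here are illustrative, not measured values.

def estimate_cpu_power(outlet_watts, gpu_watts, psu_efficiency=0.90,
                       rest_of_system_watts=50):
    """Estimate CPU package power.

    outlet_watts: reading from a wall power meter (AC side)
    gpu_watts: GPU board power from software telemetry (DC side)
    psu_efficiency: assumed PSU conversion efficiency
    rest_of_system_watts: assumed draw of motherboard, RAM, drives, fans
    """
    dc_power = outlet_watts * psu_efficiency  # DC power the PSU delivers
    return dc_power - gpu_watts - rest_of_system_watts

# Example: 450 W at the wall, GPU telemetry reporting 300 W
print(round(estimate_cpu_power(450, 300), 1))  # 55.0 W under these invented numbers
```

The weak spots are exactly why this is only an estimate: PSU efficiency varies with load, and the "rest of system" draw is a guess.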
 
Reactions: Neog2

drewthebrave

Reputable
Aug 16, 2017
2
4
4,525
1
This is really encouraging! I have an old PC build that has been trucking along with an i7-4930K and a GTX 780, which have held up great for 1080p gaming. However, with my brand new 4K TV, I've been waiting patiently to see how the new generation of GPUs performs with older hardware so I can finally do some 4K HDR PC gaming! These results make me feel better about my plan to pick up a 3070 or 3080 soon and then upgrade the rest of my PC sometime next year when my budget allows.
 

Neog2

Distinguished
Sep 7, 2007
152
1
18,715
9
The title for your 4K ultra results says 1440p 4K ultra, so I was super confused. I thought you guys were downsampling/supersampling or something, but after reading the text below and finding no mention of that, I assumed it was a mistake. Besides that, great review. Exactly what I was looking for, and you actually had games that are relevant to me: RDR2 and Division 2.

Only thing I need now is DLSS and ray tracing on at the same time versus DLSS by itself results.
Patiently waiting until you post some results.

Keep up the great work @JarredWaltonGPU
 
Reactions: JarredWaltonGPU
Apr 30, 2020
11
8
15
0
I have a limited selection of CPUs -- Paul is the CPU tester. He could maybe do XT chips. We're working to get him a new GPU for testing ... soon. :)
Thanks.
I guess that between the new GPUs and the next CPU generation I can finally retire my i7-3770K and RX 590 setup in January. It has served me almost flawlessly since 2013 and will retire after a record 8 years, more than my very first PC in 1988.
 

mac_angel

Distinguished
Mar 12, 2008
283
4
18,785
0
Ummmm, those older CPUs you mentioned, saying they weren't going to be fast enough: I'm not sure if you bypassed the reason or overlooked it. The RTX 3080 is PCIe Gen4. Those older CPUs are PCIe Gen2. Not Gen3, but Gen2. That's a HUGE difference.
 
Reactions: spentshells

cma6

Distinguished
Feb 22, 2006
53
0
18,530
0
If one intends to use RTX 3080 for calculation only, e.g., chess analysis, then would the relative results per CPU be about the same for calculation throughput as for gaming? I have a 2013 Intel i7-4930k. For an older CPU like mine, would RTX 3070 make more sense than the RTX 3080?
 
Last edited:

Neog2

Distinguished
Sep 7, 2007
152
1
18,715
9
Ummmm, those older CPUs you mentioned, saying they weren't going to be fast enough: I'm not sure if you bypassed the reason or overlooked it. The RTX 3080 is PCIe Gen4. Those older CPUs are PCIe Gen2. Not Gen3, but Gen2. That's a HUGE difference.
The Core i9-10900K and Core i3-10100 are still using Gen3 PCIe with the Z490 chipset.

But older CPUs are limited not only by speed and efficiency but also by the features they support and can use. Older CPUs will never support Gen4, or even Gen3 depending on how far back you go, so that's part of the bottleneck equation. So yes, even though they're fast, they still struggle due to the inherent handicaps of the CPU's capabilities as a whole.

Yes, it's a huge difference, and one with no solution other than playing at resolutions higher than 1080p (preferably 4K), rendering the game at 4K while displaying it at 1080p, or upgrading the computer.
 
Last edited:

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
500
388
560
0
The title for your 4K ultra results says 1440p 4K ultra, so I was super confused. I thought you guys were downsampling/supersampling or something, but after reading the text below and finding no mention of that, I assumed it was a mistake. Besides that, great review. Exactly what I was looking for, and you actually had games that are relevant to me: RDR2 and Division 2.

Only thing I need now is DLSS and ray tracing on at the same time versus DLSS by itself results.
Patiently waiting until you post some results.

Keep up the great work @JarredWaltonGPU
Fixed the 1440p 4K typo (copy/paste/edit fail!). I did 'bonus coverage' of RT + DLSS in the full 3080 review for games that support it.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
500
388
560
0
Ummmm, those older CPUs you mentioned, saying they weren't going to be fast enough: I'm not sure if you bypassed the reason or overlooked it. The RTX 3080 is PCIe Gen4. Those older CPUs are PCIe Gen2. Not Gen3, but Gen2. That's a HUGE difference.
It's a huge bandwidth difference, but when you consider the GPU has 760 GB/s of VRAM bandwidth while the PCIe bus even at Gen4 speeds is only 31.5 GB/s (15.75 GB/s for Gen3 and 8 GB/s for Gen2), you can see that it actually doesn't matter that much. I'm sure the lack of CPU speed is going to matter more than Gen2 PCIe. Still, you shouldn't buy an RTX 3080 to use with an Ivy Bridge or FX-series (or earlier) CPU. That's far too unbalanced. An RTX 2080 would probably be nearly as fast.
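Those bus figures fall straight out of the PCIe link math (lanes × per-lane transfer rate × encoding efficiency); a quick sketch using the standard 8b/10b and 128b/130b encoding overheads:

```python
# PCIe x16 bandwidth per generation vs. the RTX 3080's on-board VRAM bandwidth.
# Formula: lanes * transfer_rate (GT/s) * encoding_efficiency / 8 bits -> GB/s

GENS = {
    # gen: (GT/s per lane, encoding efficiency)
    2: (5.0, 8 / 10),     # 8b/10b encoding
    3: (8.0, 128 / 130),  # 128b/130b encoding
    4: (16.0, 128 / 130),
}

def x16_bandwidth_gbps(gen, lanes=16):
    rate, eff = GENS[gen]
    return lanes * rate * eff / 8  # GB/s

VRAM_BANDWIDTH = 760  # GB/s, GDDR6X on the RTX 3080

for gen in GENS:
    bw = x16_bandwidth_gbps(gen)
    print(f"Gen{gen} x16: {bw:5.2f} GB/s ({VRAM_BANDWIDTH / bw:.1f}x slower than VRAM)")
```

Even the Gen4 x16 link is roughly 24x slower than the card's local VRAM, which is why traffic that spills over the bus hurts far more than the link generation itself.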
 

hotaru.hino

Proper
Sep 1, 2020
205
66
170
2
Ummmm, those older CPUs you mentioned, saying they weren't going to be fast enough: I'm not sure if you bypassed the reason or overlooked it. The RTX 3080 is PCIe Gen4. Those older CPUs are PCIe Gen2. Not Gen3, but Gen2. That's a HUGE difference.
The only time PCIe bandwidth really matters is if the video card has to load/unload a lot of data. For the most part, everything a game needs is contained within its VRAM, and most of the traffic is the CPU sending commands to the GPU, which isn't much. Even if VRAM appears completely filled, some of that data is not immediately needed and may be swapped to and from system RAM with little or no performance loss. The problem comes when VRAM is filled and data that is actually needed is being swapped. This is what causes the performance issues with the 4GB Radeon RX 5500.

GPU-Z, at least for NVIDIA GPUs, has a bus utilization statistic. If you're not saturating that, PCIe generation largely doesn't matter.
 

cgigoux

Reputable
Oct 14, 2015
5
1
4,515
0
Great review. Interesting to see that while there is a larger performance hit with the old CPUs at 1080p vs. 1440p, even then the games are more than playable, with almost none of them dropping below 60 fps. I'm idly wondering if an old AMD FX-83xx series chip could keep up.

Also, when are you going to review the "Ultra POS" case you used to test the Z97 CPU? Hadn't heard of that brand before (though there are several cases I've seen that may qualify). ;-)
 
Reactions: JarredWaltonGPU
