News RTX 3080 Smashes Through Doom Eternal at Over 100 FPS in 4K


atomicWAR

Glorious
Ambassador
I'm going to assume "stumble" means stuttering or whatnot. In which case, that depends on the CPU more so than the GPU. It's in the best interest of a GPU company to showcase their product in GPU-dependent scenarios.

If you're stuttering in GPU dependent scenarios though, well, you likely realize you're due for a GPU upgrade anyway.

Not really, not on the CPU side. Games where the CPU, rather than the GPU, is the bottleneck are in the minority by a good margin. When you say stuttering in the first sentence (I see you added a more accurate qualifier later), which implies a sub-60 fps frame rate, whether in averages or 1% lows, what you're saying generally isn't the case unless your CPU is truly ancient, seriously thread-starved, or you're playing a poorly optimized, thread-hungry game like COD Warzone on a very old CPU. As stated, there are some fringe situations, but that's always true in gaming IME.

Even my old Ivy Bridge Xeon can drive an RTX 2080 Ti at 4K/60 Hz and keep up with even the newest CPUs, and should I choose to render at 1440p/1080p it is still competitive with 1st/2nd-gen Ryzen. My IVB Xeon is only a bottleneck at high refresh rates, but then you're also not talking about sub-60 fps stuttering at that point either. If you are stuttering at 100+ fps, something else is more likely the root of the issue: bad drivers (or needing to run something like DDU), V-Sync/G-Sync settings, a needed fresh Windows 10 install, or a BIOS update.

So I'll have to strongly disagree with you on the CPU more likely being the root cause of such behavior, unless you see 85%+ CPU utilization (either on a single core or across the CPU as a whole), at which point it actually does indicate a CPU bottleneck. Otherwise it's just not how rendering typically works.
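To put that 85% utilization rule of thumb into code form, here's a rough Python sketch. The 85% threshold comes from the post above; the 95% GPU figure and the sample readings are purely illustrative assumptions (real numbers would come from a monitoring overlay such as HWiNFO or MSI Afterburner):

```python
# Rough bottleneck heuristic: sustained ~85%+ CPU utilization (per core or
# overall) points at a CPU bottleneck; otherwise the GPU is usually the limiter.
# Thresholds and sample readings below are illustrative, not from any tool.
CPU_BOTTLENECK_THRESHOLD = 85.0   # percent, from the post above
GPU_BOUND_THRESHOLD = 95.0        # percent, a common rule of thumb (assumption)

def likely_bottleneck(per_core_cpu_util: list[float], gpu_util: float) -> str:
    busiest_core = max(per_core_cpu_util)
    overall_cpu = sum(per_core_cpu_util) / len(per_core_cpu_util)
    if busiest_core >= CPU_BOTTLENECK_THRESHOLD or overall_cpu >= CPU_BOTTLENECK_THRESHOLD:
        return "CPU-bound"
    if gpu_util >= GPU_BOUND_THRESHOLD:
        return "GPU-bound"
    return "something else (drivers, vsync, background tasks, ...)"

# Hypothetical readings from a monitoring overlay:
print(likely_bottleneck([92, 88, 40, 35, 30, 28, 25, 22], gpu_util=70))  # CPU-bound
print(likely_bottleneck([55, 50, 45, 40, 35, 30, 25, 20], gpu_util=99))  # GPU-bound
```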
 

jakeallen58

Commendable
Apr 30, 2018
The increase is nice, but using an example with a jump from 90 fps to 190 fps isn't really a difference the human eye can detect.

Here's the way I'm looking at it, and I disagree a little bit: the jump from 90 to 190 fps is pretty noticeable, imo (see the quick frame-time math at the end of this post). Nonetheless:

I currently have a 1080 FE. While my 1080 can run pretty much anything very smoothly, it's consistently running at 82C, so it's a bit of a toasty card. The 3080 is much more robust: it can handle a lot more work and do the same thing as my 1080 with far fewer resources, all while staying a whole heck of a lot cooler. The new 3000 series is quite revolutionary. I think Nvidia could have marked up the price significantly given how these new cards are performing, but they're very affordable for what they're capable of. So in my opinion, these new cards are extremely worth the upgrade.
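Going back to the 90 vs. 190 fps point, the gap is easier to see in frame times. A quick Python sketch, using only the two fps values from the quoted post (illustrative arithmetic, not a benchmark):

```python
# Frame-time view of the 90 vs. 190 fps comparison above.
for fps in (90, 190):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")

# 90 fps  -> 11.1 ms per frame
# 190 fps ->  5.3 ms per frame (each frame is delivered in roughly half the time)
```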
 

Shadowclash10

Prominent
May 3, 2020
Not really, not on the CPU side. Games where the CPU, rather than the GPU, is the bottleneck are in the minority by a good margin. [...]
Yeah, this is right. CPU-bottlenecked games are quite rare, especially compared to games where your GPU is the bottleneck. We see plenty of people with GTX 900-series or newer cards paired with CPUs from Intel's 2000 series (Sandy Bridge).
 
I currently have a 1080 FE. While my 1080 can run pretty much anything very smoothly, it's consistently running at 82C, so it's a bit of a toasty card. The 3080 is much more robust: it can handle a lot more work and do the same thing as my 1080 with far fewer resources, all while staying a whole heck of a lot cooler.
I have some doubts that a 320-watt RTX 3080 is going to run cooler than a 180-watt GTX 1080 FE. The better cooler might be able to keep the card's temperatures in check, but it's undoubtedly going to be pumping out a lot more heat under heavy load.

For a few cut-scenes, Nvidia shows a direct comparison of the RTX 2080 Ti to the RTX 3080. The RTX 2080 Ti was averaging around 80-90 fps while the RTX 3080 was in the mid-130 fps range for most of the scene. That's easily a 60% difference in performance.
I watched the video, pausing every second or so during the side-by-side comparison segments, and the 3080 was typically around 45% faster than the 2080 Ti. There was maybe a moment or two where the 3080 got near 60% faster, but there were also points where it was more like a 30% difference. Those are still good performance gains, but I don't see the point in exaggerating them.
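For anyone checking the math, here's how those percentages fall out of the quoted frame rates. The fps pairs below are taken from the article text and the observation above; the exact values are illustrative:

```python
def speedup_pct(old_fps: float, new_fps: float) -> float:
    """Percentage by which new_fps exceeds old_fps."""
    return (new_fps / old_fps - 1) * 100

# Midpoint of the article's "80-90 fps" range vs. its "mid-130 fps" figure:
print(f"{speedup_pct(85, 135):.0f}%")  # ~59%, i.e. the "easily 60%" claim
# A paused frame where the 2080 Ti reads 90 fps and the 3080 reads ~130 fps:
print(f"{speedup_pct(90, 130):.0f}%")  # ~44%, closer to the ~45% observation
```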