News Nvidia GeForce RTX 3080 Founders Edition Review: A Huge Generational Leap in Performance

Impressive indeed. Looks like Jensen was telling the truth to us 1080Ti owners 🤣
They still lied about 2x performance, whatever that was supposed to mean.

I watched the Gamers Nexus video, and for 1440p I'm looking at a probable 70%-ish uplift on average - across the titles they tested, anyway.
Also, what I gathered from that video: the 3080 is dumb for 1080p. People are going to do it anyway and complain about fps, only to find that one or more CPU threads are maxed out... :pfff:

I also watched the other Steve's video, and his testing landed closer to a 60% overall uplift compared to the 1080Ti - still impressive nonetheless.

I'm in no rush to go and grab one - there's still the RX 6000 series, plus I have to wait for Alphacool's cooling solutions for these cards anyway.

They wrote in "Up To 2x" That "Up To" is the * in the disclaimer.
 
This is very impressive in terms of price/performance.

But honestly, given the massive power draw and die size, it's not as impressive as I thought it would be.
Nvidia just made everything bigger and hungrier.
I'll admit that the extra power draw is something of a let-down, true.

Hmm, thinking about it: the 2080FE's 218.6W draw vs the 3080FE's 332.8W means we're looking at a 52.2% increase in power consumption, using Metro Exodus, to get a 58.7% increase in performance. That's using the 2560x1440 Ultra settings, since there's a benchmark for that, AND it's what is used for the power draw calculations.

It's a SLIGHT performance/watt increase. Not much of one, but it's something. I suspect the 3070 vs 2070 comparison will show a greater performance/watt improvement, since Ampere won't be pushed as hard there, and higher clocks for a given architecture tend to greatly diminish the fps/watt ratio. Think, for example, of the Vega 56 - pretty badly power hungry because it was clocked up to beat the GTX 1070, but when it's underclocked/undervolted to approximately GTX 1070 performance, it gets VERY close to the GTX 1070's fps/watt ratio.


I could be sort of guessing in the wrong direction, though.
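For anyone who wants to check the math, here's a quick back-of-the-envelope sketch using the numbers quoted above (I'm treating the 2080 FE as a 1.0 performance baseline and the 3080 FE as 1.587, i.e. just restating the Metro Exodus 1440p Ultra result as a relative index):

```python
# Back-of-the-envelope perf/watt comparison, using the Metro Exodus
# 2560x1440 Ultra figures quoted above. Performance is a relative index
# (2080 FE = 1.0, 3080 FE = 1.587); power draw is in watts.
power_2080, power_3080 = 218.6, 332.8
perf_2080, perf_3080 = 1.0, 1.587

power_increase = power_3080 / power_2080 - 1    # ~52.2%
perf_increase = perf_3080 / perf_2080 - 1       # ~58.7%
perf_per_watt_gain = (perf_3080 / power_3080) / (perf_2080 / power_2080) - 1

print(f"Power draw increase:   {power_increase:.1%}")
print(f"Performance increase:  {perf_increase:.1%}")
print(f"Perf/watt improvement: {perf_per_watt_gain:.1%}")  # roughly +4%
```

That works out to only about a 4% perf/watt gain at these settings, which lines up with the 'slight increase' read above.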
 
Phew! After going through 5 reviews in total, I've reached the following conclusions:
A) Impressive 1440p gains. Worthwhile upgrade from a 1080Ti.

B) 4K is where this card really shines though, due to the changes Nvidia made to FP32.

C) 1080p: Y'all just need to turn right around and:
-wait for the 3070
-wait for RX 6000
-move up to 1440p or 4K
-get faster CPU/RAM if you don't already have something like a 10-6 or 3700X
Because it's just CPU limitation city - like the 2080Ti wasn't already doing that...
Wow, and some people were actually looking at a 3090 for this, ROFL!

D) Some of us already called it: PCIe Gen 4 didn't do JACK, so folks that were up in arms about it, 🤐

E) If you're on - or HAD, he he, panic selling - a 2080Ti, you're not missing much.

F) The worry over a 'lack' of VRAM appears to be unwarranted, because memory allocation - what a game 'wants' - isn't what it actually uses.

G) RT took a bit of a back seat, and it needed to. Most RTX card owners aren't even using it regularly, and that should be a given:
-the performance hit is still too significant
-support is still not in a great spot

H) The FE cooler is darn good.
It's doing what AIB models already do - dump heat inside the chassis - BUT it appears to be a little more efficient, so the concerns about 'warming up the CPU cooler' were unwarranted.
The AIBs have their work cut out for them; they'll be running even higher power limits on mostly the same cooling solutions as before...
 
Quick skim through and it looks like it's going to be excellent.

I was hoping for 1440p at max settings with a sustained 144 FPS in basically every game, and it looks like that box is ticked!

It's more than double the performance of my 1080, so I will be buying one of these, and the rest of my computer is ready for an upgrade too. Hopefully Ryzen 3 overtakes Intel for gaming next month... better get my finances ready!
 
They wrote in "Up To 2x" That "Up To" is the * in the disclaimer.
B-but:
[attached image: 666.png]

What's with that best case scenario crap?
 
What about VR? From what I understand, performance increases based on resolution do not directly translate to VR performance increases. Perhaps this card has more of an impact even at lower resolutions when used with VR?
 
What about VR? From what I understand, performance increases based on resolution do not directly translate to VR performance increases. Perhaps this card has more of an impact even at lower resolutions when used with VR?
What? VR drives just as many pixels as 4K, so the 'lower resolutions when used with VR' makes no sense.
"The Vive and the Rift both feature two 1080 x 1200 resolution screens, but after calculating for things like eye buffer and lens refraction they come to a combined resolution of 3024 x 1680. Add in the fact that each screen features a 90Hz refresh rate, and at optimal frame rates the Vive and Rift require your rig to process and render a combined 457 million pixels per second. In contrast, a standard 4K screen running at 60fps comes out to a ballpark of 498 million pixels per second."
 
A couple of sites tested PCIe 3.0 vs 4.0 and found 4.0 to be about 1% faster. The only problem was that the test had to be done on an AMD platform, and the same sites found the AMD side to be 10% slower than Intel. People need to stop worrying about PCIe 3.0 being a bottleneck. Here's one of them.

https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-pci-express-scaling/27.html
It's weird that we have to go to another site to learn about something Tom's should have had in their review. Thanks for sharing, though.
 
Yeah, I get that ... but that's less a GPU review topic and more a display review topic. The experience for all of these things is going to vary quite a bit, depending on what display you're using.

I understand, but most people game on HDTVs, not monitors. I know very few people who even own high-powered gaming PCs; they own consoles. These GPUs, along with the upcoming consoles, should definitely focus on HDTV compatibility as well, since it's a huge selling point. Who will review the consoles when they drop? And to be honest, the next-gen consoles are the 3080/3090's biggest competitors - hence their price points - not AMD. Even IGN has a 3080 review up. Gaming is gaming, and HDTVs are a HUGE part of that equation.
 
Wrong. Having had a 2080S and gaming at 1440p 144Hz, I was underwhelmed by its performance. The 3080 is looking like a good upgrade.

I totally agree, being a current S3220DGF owner and a future HDMI 2.1 HDTV owner. A 3080 would consistently give 120 fps and above on almost everything at 1440p.
 
Yeah, I get that ... but that's less a GPU review topic and more a display review topic. The experience for all of these things is going to vary quite a bit, depending on what display you're using.

Also, Jarred, thank you for this comprehensive review. I've been reading quite a few articles this AM (I'm supposed to be working, lol), but this one has been the best. I'm certainly looking forward to seeing the best-performing 3rd-party cards, especially the ones from Asus and EVGA.
 
I'll admit that the extra power draw is something of a let-down, true.

Hmm, thinking about it: the 2080FE's 218.6W draw vs the 3080FE's 332.8W means we're looking at a 52.2% increase in power consumption, using Metro Exodus, to get a 58.7% increase in performance. That's using the 2560x1440 Ultra settings, since there's a benchmark for that, AND it's what is used for the power draw calculations.

It's a SLIGHT performance/watt increase. Not much of one, but it's something. I suspect the 3070 vs 2070 comparison will show a greater performance/watt improvement, since Ampere won't be pushed as hard there, and higher clocks for a given architecture tend to greatly diminish the fps/watt ratio. Think, for example, of the Vega 56 - pretty badly power hungry because it was clocked up to beat the GTX 1070, but when it's underclocked/undervolted to approximately GTX 1070 performance, it gets VERY close to the GTX 1070's fps/watt ratio.


I could be sort of guessing in the wrong direction, though.
There is something I've noticed with my 2070 Super: in some games, it'll happily gobble up all the power it can even when there aren't any performance gains. I first noticed this in FFXIV, where I can maintain the same frame rate at 75% power as I get at 100% power, and the card will still happily go up to 100% if I leave it there.

This reminds me as well of the 2700X system I had. If I set the PPT to 85W, I could get better overall performance than if I left it at the default (I think it's 115W), simply because the darned thing wasn't trying to pump so much voltage into it at higher clock speeds.

If I get a 3080, I'll look into tweaking the power profile. We may have a case here of pumping in too much of something to guarantee it works at spec.
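For anyone who wants to try the same power-limit experiment, here's a rough sketch of how you could query and cap the board power limit from Python via the pynvml bindings to NVML (MSI Afterburner's power-limit slider does the same thing from a GUI). The 250 W target is only an illustrative value, not a recommendation, and setting the limit needs admin rights:

```python
# Rough sketch: query the current GPU power draw/limit and cap the limit
# via NVML (pip install nvidia-ml-py). NVML reports values in milliwatts.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

draw_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(gpu) / 1000
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"Drawing {draw_w:.0f} W against a {limit_w:.0f} W limit "
      f"(board allows {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

# Example: cap the limit at 250 W, clamped to what the board allows.
# Requires admin/root privileges.
target_mw = max(min_mw, min(250_000, max_mw))
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```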
 