I'm pretty sure if you multiplied and added some combination of FLOPs and OPS and whatever, you'd get that magic 2x number 🤣
Impressive indeed. Looks like Jensen was telling the truth to us 1080Ti owners 🤣
They still lied about 2x performance, whatever that was supposed to mean.
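For what it's worth, a quick back-of-the-envelope of the paper specs that marketing math presumably leans on - a minimal Python sketch; the core counts and boost clocks are just the published Founders Edition figures, so treat the exact ratio loosely:

```python
# Rough theoretical FP32 throughput: CUDA cores * 2 FLOPs/clock (FMA) * boost clock.
# Core counts and clocks below are the published Founders Edition specs (assumed, not measured).
def tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * 2 * boost_ghz / 1000.0

rtx_2080 = tflops(2944, 1.80)   # ~10.6 TFLOPS
rtx_3080 = tflops(8704, 1.71)   # ~29.8 TFLOPS

print(f"2080: {rtx_2080:.1f} TFLOPS, 3080: {rtx_3080:.1f} TFLOPS")
print(f"On-paper ratio: {rtx_3080 / rtx_2080:.1f}x")   # ~2.8x on paper vs ~70% measured uplift
```

Point being, the raw FP32 ratio is closer to 3x on paper, so how much of that a game can actually use is the whole question behind the "up to 2x" claim.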
I watched the Gamers' Nexus video, and for 1440p I'm looking at a probable 70%-ish uplift on average - across the samples they tested, anyway.
Also, what I gathered from that video: the 3080 is dumb for 1080p. People are going to do it anyway and complain about fps, only to see that one or more CPU threads are maxed out...
I also watched the other Steve's video, and his testing landed closer to 60% overall compared to the 1080Ti - impressive nonetheless.
I'm in no rush to go and grab one - there's still the RX 6000 series, plus I have to wait for Alphacool's cooling solutions for these cards anyway.
Yup. Technically, I saw at least one game (Doom Eternal at 4K Ultra Nightmare) that was over twice as fast, but most of the time it was closer to 70% faster. It's also about twice as fast in GPU compute so far (not shown in the article yet -- more testing to do!)
I'll admit that the extra power draw is something of a let-down, true.

This is very impressive in terms of price/performance.
But honestly, given the massive power draw and die size, it's not as impressive as I thought it would be.
Nvidia just made everything bigger and more hungry.
All of the GPUs were tested with HW Scheduling disabled, except for the line that says "HWSched" for the 3080. It's a bit faster in some games, a bit slower in others, and mostly a net wash (less than 1% difference overall).

Were the benchmarks done with hardware accelerated GPU scheduling on or off?
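If anyone wants to double-check what their own system is set to, the toggle lives in the registry as far as I know (HwSchMode under GraphicsDrivers, 2 = on, 1 = off) - a rough Windows-only sketch under that assumption, so verify against the Graphics Settings page:

```python
# Read the hardware-accelerated GPU scheduling setting from the Windows registry.
# Assumes the commonly documented location and values: HwSchMode = 2 (on), 1 (off).
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        value, _ = winreg.QueryValueEx(key, "HwSchMode")
        print("HW-accelerated GPU scheduling:", "ON" if value == 2 else "OFF")
except FileNotFoundError:
    # No value usually means the OS/driver doesn't expose the toggle.
    print("HwSchMode value not found")
```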
A couple of sites tested PCIe 3.0 vs 4.0 and found 4.0 to be about 1% faster. The only problem was that the test had to be done on an AMD platform, and the same sites found the AMD side to be 10% slower than Intel. People need to stop worrying about PCIe 3.0 being a bottleneck. Here's one of them.
What about PCIe gen4 vs others?
B-but: they wrote "Up To 2x". That "Up To" is the * in the disclaimer.
What? VR drives just as many pixels as 4K, so the 'lower resolutions when used with VR' bit makes no sense.

What about VR? From what I understand, performance increases based on resolution do not directly translate to VR performance increases. Perhaps this card has more of an impact even at lower resolutions when used with VR?
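Rough pixel math for anyone curious - a quick sketch; the headset figures assume a Valve Index (1440x1600 per eye) and the ~1.4x per-axis render scale VR runtimes commonly default to, so adjust for your own HMD:

```python
# Compare rendered pixels per frame: flat 4K vs. a VR headset.
# Assumes Valve Index panels (1440x1600 per eye) and ~1.4x per-axis supersampling.
def pixels(w: int, h: int) -> int:
    return w * h

four_k = pixels(3840, 2160)                                      # ~8.3 MP
index_native = 2 * pixels(1440, 1600)                            # both eyes, ~4.6 MP
index_rendered = 2 * pixels(int(1440 * 1.4), int(1600 * 1.4))    # ~9.0 MP

for name, px in [("4K", four_k),
                 ("Index native", index_native),
                 ("Index @ 1.4x render scale", index_rendered)]:
    print(f"{name:>26}: {px / 1e6:.1f} MP")
```

So at typical render scales the per-frame pixel load really is in 4K territory, and it has to be sustained at the headset's refresh rate.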
Weird we have to go to another site to learn about something Tom's should have had in their review. Thanks for sharing though.
https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-pci-express-scaling/27.html
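For reference, the raw bandwidth gap behind those results - a minimal sketch using the published per-lane rates (8 GT/s for PCIe 3.0, 16 GT/s for 4.0, both 128b/130b encoded); the ~1% delta above is the measured part:

```python
# Theoretical one-direction bandwidth of a x16 slot.
# PCIe 3.0: 8 GT/s per lane, PCIe 4.0: 16 GT/s per lane, both 128b/130b encoded.
def x16_bandwidth_gbs(gt_per_lane: float) -> float:
    lanes = 16
    encoding = 128 / 130                       # usable payload fraction
    return gt_per_lane * encoding * lanes / 8  # GT/s -> GB/s

gen3 = x16_bandwidth_gbs(8.0)    # ~15.8 GB/s
gen4 = x16_bandwidth_gbs(16.0)   # ~31.5 GB/s
print(f"PCIe 3.0 x16: {gen3:.1f} GB/s, PCIe 4.0 x16: {gen4:.1f} GB/s")
```

Doubling the link only helps if a game actually saturates the ~16 GB/s a 3.0 x16 slot already offers, which those ~1% deltas suggest almost nothing does.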
Wrong. Having had a 2080S and gaming at 1440p 144Hz, I was underwhelmed by its performance. The 3080 is looking like a good upgrade.

For 4k gaming... yeah, nice card.
For under 4k gaming... no point at all. Save your cash and buy a second hand/reduced 2080/2080ti.
Yeah, I get that ... but that's less a GPU review topic and more a display review topic. The experience for all of these things is going to vary quite a bit, depending on what display you're using.
There is something I've noticed with my 2070 Super: in some games it'll happily gobble up all the power it can even if there aren't any performance gains. I first noticed this in FFXIV, where I can maintain the same frame rate at 75% power as I get at 100% power, yet the card will still happily go up to 100% if I leave it there.
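If anyone wants to watch for that behavior on their own card, polling the driver is enough - a rough sketch shelling out to nvidia-smi (query fields assumed to be exposed on GeForce cards); cap the power limit with your usual OC tool and compare the readings:

```python
# Poll GPU power draw, utilization, and core clock once per second via nvidia-smi.
# Handy for spotting the "same fps, still pegged at full power" behavior described above.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,utilization.gpu,clocks.gr",
         "--format=csv,noheader,nounits"]

for _ in range(10):  # sample for ~10 seconds
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    watts, util, mhz = (v.strip() for v in out.stdout.strip().split(","))
    print(f"{watts} W, {util}% GPU util, {mhz} MHz")
    time.sleep(1)
```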
Hmm, thinking about it: the 2080FE's 218.6W draw vs the 3080FE's 332.8W means we're looking at a 52.2% increase in power consumption, using Metro Exodus, to get a 58.7% increase in performance. That's using the 2560x1440 Ultra settings, since there's a benchmark for that, AND it's what is used for the power draw calculations.
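Same arithmetic as a quick sanity check, just plugging in the figures quoted above (Metro Exodus, 1440p Ultra):

```python
# Perf/W sanity check using the 2080 FE vs 3080 FE numbers quoted above.
power_2080, power_3080 = 218.6, 332.8        # watts, Metro Exodus 1440p Ultra
perf_gain = 1.587                            # 3080 is 58.7% faster in that test

power_ratio = power_3080 / power_2080        # ~1.52x the power
perf_per_watt = perf_gain / power_ratio      # ~1.04x, i.e. roughly 4% better perf/W

print(f"Power draw: +{(power_ratio - 1) * 100:.1f}%")
print(f"Perf/W:     +{(perf_per_watt - 1) * 100:.1f}%")
```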
It's a SLIGHT performance/watt increase. Not much of one, but it's something. I suspect that the 3070 vs. the 2070 will show a greater performance/watt improvement, since Ampere isn't going to be pushed as hard, and higher clocks for a given architecture tend to greatly diminish the fps/watt ratio. Think, for example, of the Vega 56 - pretty badly power hungry because it was clocked up to beat the GTX 1070, but when it was underclocked/undervolted to approximately GTX 1070 performance, it got VERY close to the GTX 1070's fps/watt ratio.
I could be sort of guessing in the wrong direction, though.