Nvidia GeForce RTX 4080 Super review: Slightly faster than the 4080, but $200 cheaper

$1,000 for 16 GB of VRAM. What a ripoff. Personally, the 7600 XT with 16 GB of VRAM is the only GPU I'd consider. Nvidia has better performance, but their greed is incredible.
I'll be using my 8 GB RX 570 till its wheels fall off. Then I may simply be done with PC gaming. It's becoming ridiculous now.
Sounds like you are not the target market for this product, considering you are on a seven-year-old midrange GPU.
 
Yeah, this is the big issue, and likely a big part of why Nvidia isn't trying to push higher VRAM capacities for the consumer market. Realistically, gaming isn't going to exceed 16GB of memory use on any reasonable title in the next few years, basically not until future consoles come out with more than 16GB of memory. But the cost of putting in 32GB instead of 16GB is probably $70, maybe $100 at most.
You hit the nail on the head here. People forget you can turn settings down, and 'reasonable' titles are going to have settings that easily fit within 16GB of VRAM for some time to come. BUT, as you pointed out, the cost of going to 32GB is ~$100 at worst. And moves like this upset their user base and make the nickname 'Ngreedia' more common as consumers do what they can to vent their frustration at this behavior.
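
As a rough sanity check on that figure, assuming a board-level cost of roughly $4-6 per GB for GDDR6X-class memory (that per-GB price is an assumption, not a quoted figure):

```python
# Back-of-the-envelope cost of doubling a 16GB card to 32GB.
# Assumption: roughly $4-6 per GB at the board level for GDDR6X-class memory.
extra_gb = 32 - 16
low, high = 4, 6
print(f"~${extra_gb * low}-{extra_gb * high} added cost")  # ~$64-96, in line with the $70-100 estimate
```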

Look at how long many folks held onto Nvidia's 1000-series cards. Many people only now upgrading from those cards will use these newest offerings for a long time to come (three-ish generations). And many of those people are going to want max textures whenever possible. As you know, textures are the biggest offender in VRAM usage. And let's be real, they're also the easiest way to improve the appearance of your games without taxing your GPU much more. This is not to say more compute-intensive effects like various lighting techniques (ray tracing being the most obvious), physics, post-processing, AA, etc. don't add a more refined visual experience, but textures, which literally cover everything you look at in game, tend to jump out first, and only THEN do you notice those other, more subtle effects. And settings like ray tracing, heavy AA, and high post-processing are going to be the first things many users turn down to achieve their desired frame rate, as doing so nets far greater gains than dropping texture quality, assuming you have the VRAM for the textures.

Nvidia adding a little more VRAM this gen would have shown they chose to give their customers a better long-term gaming experience, and they would have gained goodwill in the process. Especially with all the price increases this gen, Nvidia needed that kind of gesture, IMHO. Instead they chose to skimp on VRAM throughout most of the stack yet again while raising prices, limiting the generational performance gains in everything but the 4090.

Nvidia came out looking like they were trying to shaft their customers to improve their bottom line with the 4000 series. It also left many wondering just how much of this skimpy VRAM is planned obsolescence. True or not, it started a conversation among users, especially around the 8GB and 12GB cards, that Nvidia could have done without.

Sadly, Nvidia gets away with this behavior because many of their customers either suffer from collective amnesia when they buy, or play along because of Nvidia's mostly better-implemented feature set compared to AMD and Intel, which, combined with typically faster frame rates, makes their offerings hard to ignore, especially once ray tracing becomes a serious part of the conversation. At that point it almost forces many gamers to go team green, me included (not proud of it). I hope we see better from both AMD and Intel, because if we don't, Nvidia has little reason to change its ways. Better competition could help solve a lot of these pricing/VRAM issues, but AMD and Intel need to do better, and consumers need to be more willing to buy cards not sold by a leather-jacket enthusiast who named his company after a vengeance demon.
 
Somewhat tempting (in the context of upgrading to 4K gaming). But then again, rumours have it that RDNA 4 will arrive later this year with a sub-$600 GPU whose rasterization performance is almost on par with the RTX 4080 Super.

And such a GPU would potentially already make use of PCIe 5.0 lanes, which would not be an issue for my motherboard (it did cost a bit more, as Tom's Hardware keeps reminding us when talking about AM5, but it already supports enough PCIe 5.0 lanes for an NVMe SSD as well, while Intel's latest still offers only 16 PCIe 5.0 lanes). So basically, money saved, as I won't need a second motherboard upgrade.

Not counting on the rumours. But I suppose by now I may as well hold out and also check the Ryzen 9xxx offerings, starting in April apparently.

Yeah, I can't complain either. The 6700 XT is plenty good for 1440p gaming. And there's a bit more choice among 1440p monitors if you want a not-very-expensive one with 120+ Hz.

The ray-tracing performance as such isn't great. But in games such as Metro Exodus, there are other things to focus on than sightseeing all the time. :) In any case, if one isn't determined to go "4K gaming and nothing else," then there are plenty of options for a rig (even if the top GPUs come with a high price point; they are way over the top for 1080p gaming these days).
IMO ultrawide 1440p monitors are more useful overall than a standard 4k monitor. Every person I know that has gone to an ultrawide will not go back to a standard 16:9 or 16:10 monitor. The additional width is so nice in both productivity and gaming.
 
Great review, Jarred! No surprises here... well, I was betting on 4.3% more performance, not 3.5%, but I'm splitting hairs. : P There isn't much more to be said about the 4080 Super today than there was on January 8. The $200 price drop, making the 4080 Super "only" $1,000, is still absolutely bonkers.

When the Super cards were announced, I thought the 4070 Ti Super would be the most interesting and the 4080 Super would be the least interesting. I was right about the latter. But for me, the 4070 Super stole the show (and stole my heart; it resides in my PC right now, and I'm absolutely ecstatic about the improvement over my 2060. :))
 
No, that delta is by design, not real; it is designed to perform poorly on AMD hardware. Both CP2077 and AW2 carry a similar penalty on the 40 series, yet one of them falls off hugely on AMD hardware. Makes no sense.

It seems you didn't even bother to look at the third image of benchmarks (if you read this article at all) if you're putting RDNA 3 on par with RTX 20:
[benchmark chart image]

The $500 7800 XT has almost the same "RT performance" as the $600 4070, though both are garbage. The only RT card that really exists is the 4090, but its price makes it completely irrelevant for the majority of gamers. RT has been used as an excuse for the greedy animals at Nvidia to charge as much as they want. RT itself is total BS, as shown game after game.
Look at my signature. Do I look like an Nvidia fan? Out of 8 systems, I have ONE with a 3060, and that's because of the streaming features for my wife when she teaches class.

But let's go with your argument: it would be so much better if the game was optimized for AMD.
Your argument is irrelevant even IF those specific games are intentionally skewed toward a particular architecture. (This includes Portal RTX and Minecraft RTX.) The point is, if you want to play these games with all the bells and whistles, you'll have to buy that NVIDIA architecture. No amount of bellyaching about how a game is optimized is going to change that. And until AMD can pull a trump card out of their hat that shows really high-end RT competing with a 4080, it really doesn't matter. To date, AMD only gets close on low-end RT.

Even I can admit the AMD series is trash when it comes to RT. The 40 series has a much greater pool of RT resources available. You have to remember RT runs on a set of separate circuits. If you are running relatively light RT, then the 7900 XTX comes much closer. But as soon as you start maxing things out, especially for things like path tracing, the RDNA architecture is handicapped. And path tracing is a beautiful thing.

AMD is two generations behind, and that's a fact. And if Moore's Law Is Dead and RGT are to be believed, not even RDNA 5 will catch today's 4090 in RT.

My conclusion is this: if you are going to spend MAJOR $$$$ on a high-end card, choose a card that is more balanced across its feature set. I have no problem paying for a 4080S (which I'm getting) and taking a small raster loss if it means I can turn on the eye candy in ALL games, especially at 1440p.

If you want to race a Dodge Hellcat (AMD) at the strip all day long, be my guest. I'll take something more balanced that can handle the curves as well as the drag strip (NVIDIA).
 
IMO ultrawide 1440p monitors are more useful overall than a standard 4k monitor. Every person I know that has gone to an ultrawide will not go back to a standard 16:9 or 16:10 monitor. The additional width is so nice in both productivity and gaming.
As an ultrawide user, I agree.

I saw the tax for 4K gaming and decided it wasn't worth it. And even the mighty 4090 still struggles in some games at 4K.

Personally, I prefer 1440p ultrawide at 100-144 fps over 4K at 60.

Higher FPS matters more to me on a 144 Hz display.
 
As an ultrawide user, I agree.

I saw the tax for 4K gaming and decided it wasn't worth it. And even the mighty 4090 still struggles in some games at 4K.

Personally, I prefer 1440p ultrawide at 100-144 fps over 4K at 60.

Higher FPS matters more to me on a 144 Hz display.

I'm the same. 3440x1440 in a 34" was the sweet spot. I had a 40" 4K with decent specs that I tried as a monitor, and the resolution didn't bring enough to the table to chase it any further, given what I had to give up in frame performance. I preferred having the greater relative screen real estate and the wider field of view. I can run most everything at a steady 100+ fps with max settings on a 3080, so I haven't been able to justify upgrading the video card.
 
the fastest Nvidia option that's still within reach of gamers
By what reasoning?!? It's literally the price of a PS5 plus an okay work laptop. Can I afford it? Yes. Can I justify it? No, no, and no again.

I never thought I'd see the day that my other hobby (running a car for trackdays) is cheaper than gaming on PC. Full set of slicks + 3 trackday entries or a GPU? Hmmm, let me think about that real quick... PC Gaming does not exist in a vacuum, I'll happily become a full-time console peasant and go and do a whole bunch of other stuff with my money.
 
Take RT out of the picture, and the 7900 XTX is still more powerful for $200 less. Yes, yes, it's not $200 cheaper now, but when I bought mine on Prime Day it was $800, and I also got a premium copy of Starfield.
 
$1,000 for 16 GB of VRAM. What a ripoff. Personally, the 7600 XT with 16 GB of VRAM is the only GPU I'd consider. Nvidia has better performance, but their greed is incredible.
I'll be using my 8 GB RX 570 till its wheels fall off. Then I may simply be done with PC gaming. It's becoming ridiculous now.

The 7600 XT can rarely make use of that VRAM. Comparing a lower-range card to a nearly top-of-the-line card doesn't make much sense either. If you want a GPU that can make use of that much VRAM without breaking the bank, the RX 6800 is your cheapest good option. The 7600 XT loses to an RX 6700 XT.
 
I don't know why you think a budget card should have that much RAM. You're not going to be gaming at resolutions where you can make use of those larger textures.
I already said it, but I'll say it again: higher-resolution TEXTURES deliver many of the most obvious image quality gains, and they require little to no additional GPU power. In modern games, having only 8GB of VRAM often leads to very obvious asset pop-in, e.g., you run around a town, bump into a wall, and it takes a second for the high-res wall texture to load; then you look away and it takes a second for the high-res environment textures and assets to reload after getting evicted to make room for the high-res wall. You don't have those issues on a 12GB GPU. At least not yet.

12GB is becoming the minimum viable amount even for entry-level gaming GPUs if you want to avoid VRAM-related annoyances.
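
To put rough numbers on why textures dominate VRAM, here's a back-of-the-envelope sketch assuming BC7-style block compression at about 1 byte per texel with a full mip chain adding roughly a third (typical assumptions, not measurements from any specific game):

```python
# Order-of-magnitude sketch of how fast 4K-resolution textures fill a VRAM budget.
# Assumptions: BC7 block compression (~1 byte per texel), full mip chain adds ~33%,
# and nothing else (render targets, geometry, BVHs) is counted.
bytes_per_texel = 1
tex_4k = 4096 * 4096 * bytes_per_texel          # ~16 MiB per 4096x4096 texture
tex_4k_mips = tex_4k * 4 // 3                   # ~21.3 MiB with mips
for budget_gb in (8, 12, 16):
    count = budget_gb * 1024**3 // tex_4k_mips
    print(f"{budget_gb}GB budget: ~{count} such textures resident")
# 8GB: ~384, 12GB: ~576, 16GB: ~768 -- before buffers, targets, and RT structures.
```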
 
I already said it, but I'll say it again: higher-resolution TEXTURES deliver many of the most obvious image quality gains, and they require little to no additional GPU power. In modern games, having only 8GB of VRAM often leads to very obvious asset pop-in, e.g., you run around a town, bump into a wall, and it takes a second for the high-res wall texture to load; then you look away and it takes a second for the high-res environment textures and assets to reload after getting evicted to make room for the high-res wall. You don't have those issues on a 12GB GPU. At least not yet.

12GB is becoming the minimum viable amount even for entry-level gaming GPUs if you want to avoid VRAM-related annoyances.
With that said, it is annoying when games don't account for the fact that you have more VRAM than the game requires. There is nothing I hate more, when it comes to image quality in games, than texture pop-in when my system is way overkill for said game, even at the highest in-game settings.
 
You're not going to be gaming at resolutions where you can make use of those larger textures. I have a 1080 Ti with 11GB (from the same timeframe) and the memory buffer isn't getting maxed out at 3440x1440. Unless you're actually gaming at 4K or greater resolution, you're likely not running into a VRAM limitation.
Just because you're not gaming at that resolution doesn't mean others are not, or that 12 GB of VRAM is more than enough!!
 
Nvidia: How do we turn a bad media image into a good one?
Problem: People don't like 4080 prices.
Solution: Ignore users' complaints for two years. Stop making as many 4080s so the price matches supply.
Wait two years and re-release the same card with a new label.
Profit?

At the very least it gets rid of more 4080 stock and shuts down some of the complaints.
 
If I've learned one thing over the last few years, it's that gaming at 4K is a really poor financial proposition (at other people's expense, thankfully). Go down to 1440p and the costs drop drastically for the same level of performance and pretty much the same visuals (viewing distances matter, of course). I saw someone mention that 4K doesn't bring enough to the table visually. Agreed. I'm sure there are use cases for 4K in productivity, but for gaming it just seems to make little sense. This is probably why QHD is gaining so much traction these days.
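
The raw pixel counts back that up; this is plain arithmetic, with nothing vendor-specific assumed:

```python
# Pixels per frame at common resolutions; shading cost scales roughly with pixel count.
resolutions = {"1440p": (2560, 1440), "3440x1440 UW": (3440, 1440), "4K": (3840, 2160)}
base = 2560 * 1440
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px/1e6:.2f} MP ({px/base:.2f}x the pixels of 1440p)")
# 4K pushes 2.25x the pixels of 1440p, so a GPU needs roughly that much more
# raster throughput for the same frame rate (all else equal).
```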
 
If I've learned one thing over the last few years, it's that gaming at 4K is a really poor financial proposition (at other people's expense, thankfully). Go down to 1440p and the costs drop drastically for the same level of performance and pretty much the same visuals (viewing distances matter, of course). I saw someone mention that 4K doesn't bring enough to the table visually. Agreed. I'm sure there are use cases for 4K in productivity, but for gaming it just seems to make little sense. This is probably why QHD is gaining so much traction these days.
I had a 4K screen 8 years ago and swapped to 2K 3 years ago... Most of the time I had 4K I couldn't run games at that resolution anyway, and Windows defaults 4K screens to 150% scale, so it's essentially running as 2K anyway.
I swapped so I could run the screen at native resolution and not have to deal with the few applications I had that rendered their UI at native size on a 4K screen... Logitech's software was one; I couldn't read the interface.

Now my GPU matches my monitor and I am in no rush to change.
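
For reference, the effective workspace math behind that 150% default is just division (applications still render at the native 3840x2160):

```python
# Effective UI workspace on a 4K panel at Windows' default 150% scaling.
# UI elements are laid out as if the desktop were this lower resolution,
# even though the panel still displays 3840x2160 pixels.
native_w, native_h = 3840, 2160
scale = 1.5
print(int(native_w / scale), "x", int(native_h / scale))  # 2560 x 1440
```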
 
Yes, about those resources:
[Alan Wake 2 ray tracing benchmark chart, 3840x2160]

The 4070 Super loses 92% of its performance when you turn RT on. Where did that dedicated "RT hardware" go?!
AMD and Nvidia are BOTH bad at RT, unless you pay $2K for a 4090.
I mean, you clearly just cherry picked the one result from the TPU review RT section where the game happened to need >12GB of VRAM. The cards at the bottom of that chart are falling over due to running out of memory, not due to RT (other than the impact RT has on VRAM demand). The charts don't look like that for any other games/resolutions. But the cards with <16GB aren't powerful enough to run 4K RT anyway (at least for full path traced games like Alan Wake 2, or Cyberpunk 2077 with Overdrive mode), even if they had more VRAM.

https://www.techpowerup.com/review/nvidia-geforce-rtx-4080-super-founders-edition/35.html
 
I mean, you clearly just cherry picked the one result from the TPU review RT section where the game happened to need >12GB of VRAM. The cards at the bottom of that chart are falling over due to running out of memory, not due to RT (other than the impact RT has on VRAM demand). The charts don't look like that for any other games/resolutions. But the cards with <16GB aren't powerful enough to run 4K full-RT anyway, even if they had more VRAM.
It's also wildly different from Jarred's Alan Wake 2 chart at 4K with RT on.
I'd like to know what caused such massive differences between the TPU and TH reviews.
(I'm not sure why I quoted you, specifically. Sorry. : P)

[Alan Wake 2 4K ray tracing chart from the Tom's Hardware review]