Review Nvidia GeForce RTX 4080 Super review: Slightly faster than the 4080, but $200 cheaper

Lamarr the Strelok

Commendable
Sep 29, 2022
61
22
1,535
$1000 for 16 GB VRAM. What a ripoff. Personally, the 7600 XT with 16 GB VRAM is the only GPU I'd consider. Nvidia has better performance, but their greed is incredible.
I'll be using my 8 GB RX 570 till its wheels fall off. Then I may simply be done with PC gaming. It's becoming ridiculous now.
 

usertests

Distinguished
Mar 8, 2013
966
855
19,760
$1000 for 16 GB VRAM. What a ripoff. Personally, the 7600 XT with 16 GB VRAM is the only GPU I'd consider. Nvidia has better performance, but their greed is incredible.
I'll be using my 8 GB RX 570 till its wheels fall off. Then I may simply be done with PC gaming. It's becoming ridiculous now.
I'm not going to tell you to continue PC gaming but there are plenty of options that are good enough for whatever you're doing, like an RX 6600. If you want more VRAM, grab a 6700 XT instead of 7600 XT, or an RTX 3060, while supplies last. Then if we later see the RX 7600 8GB migrate down to $200, and 7700 XT 12GB down to $350, those will be perfectly fine cards.

By the time you're done hodling your RX 570, the 7600 XT should be under $300 and at least RDNA4 and Blackwell GPUs will be out.
 

RandomWan

Prominent
Sep 22, 2022
59
65
610
$1000 for 16 GB VRAM. What a ripoff. Personally, the 7600 XT with 16 GB VRAM is the only GPU I'd consider. Nvidia has better performance, but their greed is incredible.
I'll be using my 8 GB RX 570 till its wheels fall off. Then I may simply be done with PC gaming. It's becoming ridiculous now.

You're complaining about the VRAM (which doesn't matter as much as you think) and the price when you're sporting a bottom budget card. There's any number of cards you could upgrade to with a $300 budget that will blow that 570 out of the water.

These should be over 2x the performance of your card with 16GB for $330:

https://pcpartpicker.com/product/vT...deon-rx-7600-xt-16-gb-video-card-rx-76tswftfp

https://pcpartpicker.com/product/sq...00-xt-16-gb-video-card-gv-r76xtgaming-oc-16gd
 

Gururu

Prominent
Jan 4, 2024
311
215
570
I thought healthy competition between companies meant the customer wins. This proves not the case. They do just enough to edge the competition when they could do soooo much more for the customer.
 

InvalidError

Titan
Moderator
You're complaining about the VRAM (which doesn't matter as much as you think) and the price when you're sporting a bottom budget card.
If nobody complains about ludicrously expensive GPUs having a bunch of corners cut off everywhere to pinch a few dollars on manufacturing off a $1000 luxury product, that is only an invitation to do even worse next time. No GPU over $250 should have less than 12GB of VRAM, which makes 16GB at $1000 look pathetic.

Also, having 12+GB does matter: higher-resolution textures are usually the most obvious image quality improvement, with little to no impact on frame rate as long as you have sufficient VRAM to spare, and 8GB is starting to cause lots of visible LoD asset pops in modern titles.
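To put rough numbers on that point (a back-of-envelope sketch, not from the post: it assumes uncompressed RGBA8 textures with a full mip chain; real games use block compression like BC1/BC7, which shrinks this roughly 4x-8x):

```python
# Back-of-envelope texture memory, assuming uncompressed RGBA8 (4 bytes per
# texel) and a full mip chain, which adds about 1/3 on top of the base level.
def texture_mib(size_px: int, bytes_per_texel: int = 4, mip_chain: bool = True) -> float:
    base = size_px * size_px * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / 2**20  # bytes -> MiB

print(round(texture_mib(2048)))  # a 2K texture: ~21 MiB
print(round(texture_mib(4096)))  # a 4K texture: ~85 MiB
```

At those sizes, even with compression, it only takes a few hundred unique high-resolution textures resident at once to pressure an 8GB pool, which is why texture quality is usually the first setting to choke on small VRAM buffers.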

I thought healthy competition between companies meant the customer wins. This proves not the case. They do just enough to edge the competition when they could do soooo much more for the customer.
Corporations' highest-priority customers are their shareholders, and shareholders want infinite 40% YoY growth with the least possible benefit to retail end-users. Giving end-users too much value for their money would mean hitting the end of the road for what can be cost-effectively delivered that much sooner, and being able to milk customers for that many fewer product cycles.
 
I thought healthy competition between companies meant the customer wins. This proves not the case. They do just enough to edge the competition when they could do soooo much more for the customer.
The last time we had healthy competition in anything computer-related was in the '90s.
AMD buying ATI in 2006 was the end of any "healthy" competition; every other GPU company at that point was already defeated, as was every CPU company other than Intel and AMD, with ARM, as a company, barely hanging on even though ARM CPUs are almost everywhere.
 

RandomWan

Prominent
Sep 22, 2022
59
65
610
If nobody complains about ludicrously expensive GPUs having a bunch of corners cut off everywhere to pinch a few dollars on manufacturing off a $1000 luxury product, that is only an invitation to do even worse next time. No GPU over $250 should have less than 12GB of VRAM, which makes 16GB at $1000 look pathetic.

It carries a bit less weight complaining about it when you're rocking what was a sub-$200 video card. There are things other than a reasonable price keeping you from the card. By all means complain where appropriate, but unless people stop buying it, your complaints will achieve nothing.

I don't know why you think a budget card should have that much RAM. You're not going to be gaming at resolutions where you can make use of those larger textures. I have a 1080 Ti with 11GB (from the same timeframe) and the memory buffer isn't getting maxed out at 3440x1440. Unless you're actually gaming at 4K or greater resolution, you're likely not running into a VRAM limitation, especially if you're making use of upscaling technologies.
 

Lamarr the Strelok

Commendable
Sep 29, 2022
61
22
1,535
Well, Shadow of the Tomb Raider at 1080p gets close to using 8 GB of VRAM. Far Cry 6 at 1440p uses close to 8 as well.
I admit I'm a budget gamer (I have guitars and guitar amps to feed). But yes, UE5 is a bit of a pig. Many UE5 games list an RX 570/580/590 as the minimum, so the party's over for me soon.
 
As long as Nvidia makes a killing on AI, they're going to reserve the fat chips like the 4090 only for the highest priced products. They're allocating most of the large chips to AI, hence why the 4090 at MSRP sold out in minutes yesterday. This 4080 Super really is what the 4070 Ti should've been.
Yeah, this is the big issue, and likely a big part of why Nvidia isn't trying to push out higher VRAM capacities for the consumer market. Realistically, gaming isn't going to exceed 16GB of memory use on any reasonable title in the next few years — basically not until future consoles come out that have more than 16GB of memory. But the cost of putting 32GB instead of 16GB is probably $70, maybe $100 at most.

Nvidia does have such cards. They're called RTX 5000 Ada, and use AD102 instead of AD103. The cost starts at $4,000. Or alternatively, there's RTX 4500 Ada with AD104 and 24GB, and it 'only' costs $2,250. But it's technically quite a bit slower than the 4080 Super, unless you're only after something with more than 16GB.

One thing I find very interesting right now is that Nvidia doesn't offer a professional "RTX xxxx Ada Generation" card that uses AD103. I'm not sure why that is, but certainly there's room for something between the 4500 and 5000 cards.
 

edzieba

Distinguished
Jul 13, 2016
588
589
19,760
Remember: render resolution alone has naff-all effect on vRAM requirements. Going from a set of 6x 32bpp 1920x1080 framebuffers to 6x 32bpp 3840x2160 framebuffers eats... 50MB vs. 200MB of vRAM.

It's textures that take up the vast majority of vRAM. A game running at 640x480 and forced to the OMGWTFBBQhigh texture scaling setting will eat more vRAM than a game rendered at UHD with regular textures. On top of that, DirectStorage and other media streaming mechanisms further alleviate the problem, as the cost to stream textures in when needed is reduced.

It doesn't help that looking at a single 'vRAM usage' meter tells you very little about a game's actual vRAM requirements. Game engines can and will aggressively cache every texture they have available into vRAM, either until they hit capacity or until they run out of textures to load. There is no penalty for loading a texture into vRAM and never using it until that cache is overwritten with data that is needed, because overwriting cache costs no more than overwriting empty vRAM. You can easily end up with a game that 'uses' 15GB of vRAM but shows no performance difference from the same game 'limited' to 8GB, because 10GB of that 'usage' was opportunistic caching of textures that never made it on-screen.
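The framebuffer arithmetic above checks out; here is a quick sketch of it (assuming 32bpp, i.e. 4 bytes per pixel, and the 6-buffer chain from the post):

```python
# VRAM consumed by a chain of framebuffers at 32bpp (4 bytes per pixel).
def framebuffer_mib(width: int, height: int, buffers: int = 6, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel * buffers / 2**20  # bytes -> MiB

print(round(framebuffer_mib(1920, 1080)))  # ~47 MiB at 1080p
print(round(framebuffer_mib(3840, 2160)))  # ~190 MiB at 4K
```

So even quadrupling the render resolution adds well under 200MB, a rounding error next to multi-gigabyte texture pools.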
 
Well, Shadow of the Tomb Raider at 1080p gets close to using 8 GB of VRAM. Far Cry 6 at 1440p uses close to 8 as well.
I admit I'm a budget gamer (I have guitars and guitar amps to feed). But yes, UE5 is a bit of a pig. Many UE5 games list an RX 570/580/590 as the minimum, so the party's over for me soon.
At $330 the ASRock Challenger RX 6700 XT (12GB VRAM) is a great value. Its performance is about equal to the 4060 Ti 16GB, which starts $100 more expensive. I got one last year and I get 85 FPS in Red Dead Redemption 2 (no FSR 2) at 3440x1440 on highest settings. In Borderlands 3 I get 82 FPS at the same settings and resolution without using FSR. Note that the 7600 XT costs the same as the 6700 XT; the only advantage the 7600 XT has is 4GB more VRAM, but it is still about 10% slower on average than the 6700 XT.
 

mhmarefat

Distinguished
Jun 9, 2013
67
77
18,610
Hello and thank you for this review.
Can I ask your opinion about the future GPU prices if 7900 XTX level of performance for half the price can be reached? (https://www.techpowerup.com/318426/...performance-at-half-its-price-and-lower-power)

Alan Wake 2 shows how future path traced games might perform — the gap between Nvidia and AMD remains massive, though now the 4080 Super is 'only' 164% faster
Isn't this "gap" software related though? Like Starfield when Nvidia GPUs "all of a sudden" performed poorer? How come performance gap between AMD and Nvidia in Cyberpunk 2077 path tracing is much smaller? What happened?

[benchmark charts from the review]


Is this a real "gap"? Or maybe Alan Wake 2 with path tracing could run much faster on AMD if game developers wanted it to? If so, is this game even worth including in benchmarks?

Is the future of PC gaming HARDWARE to be at the mercy of every single piece of proprietary SOFTWARE to be able to function?! (DLSS + FG + RR + ... ?)
 
As long as Nvidia makes a killing on AI, they're going to reserve the fat chips like the 4090 only for the highest priced products. They're allocating most of the large chips to AI, hence why the 4090 at MSRP sold out in minutes yesterday. This 4080 Super really is what the 4070 Ti should've been.
I predict that soon you will have GPUs custom-tailored for game engines. UE5 can bring down the 4090 as it is today. UE5 could see significant uplifts if fewer operations were executed on the CPU, even though its traversal is a binary tree walk, log2(n). Triangle rejection in Nanite eats a tremendous amount of CPU time. Fortunately, Lumen already has branches for using GPU hardware ray tracing.
 
Hello and thank you for this review.
Can I ask your opinion about the future GPU prices if 7900 XTX level of performance for half the price can be reached? (https://www.techpowerup.com/318426/...performance-at-half-its-price-and-lower-power)


Isn't this "gap" software related though? Like Starfield when Nvidia GPUs "all of a sudden" performed poorer? How come performance gap between AMD and Nvidia in Cyberpunk 2077 path tracing is much smaller? What happened?



Is this a real "gap"? Or maybe Alan Wake 2 with path tracing could run much faster on AMD if game developers wanted it to? If so, is this game even worth including in benchmarks?

Is the future of PC gaming HARDWARE to be at the mercy of every single piece of proprietary SOFTWARE to be able to function?! (DLSS + FG + RR + ... ?)
Yes, that delta is quite real. The Nvidia series stomps all over AMD when it comes to ray tracing. It is indeed over twice as fast, and the 7000 series (RDNA 3) only made minor improvements over the 6000 series (RDNA 2). Basically, when it comes to ray tracing, RDNA 3 ~= 2080 Ti. Yes, AMD has more memory and slightly faster rasterization at similar price points. But being two generations behind the competition when ray tracing is taking off is not acceptable IMHO. The card is no longer balanced in performance features.
 
Hello and thank you for this review.
Can I ask your opinion about the future GPU prices if 7900 XTX level of performance for half the price can be reached? (https://www.techpowerup.com/318426/...performance-at-half-its-price-and-lower-power)


Isn't this "gap" software related though? Like Starfield when Nvidia GPUs "all of a sudden" performed poorer? How come performance gap between AMD and Nvidia in Cyberpunk 2077 path tracing is much smaller? What happened?



Is this a real "gap"? Or maybe Alan Wake 2 with path tracing could run much faster on AMD if game developers wanted it to? If so, is this game even worth including in benchmarks?

Is the future of PC gaming HARDWARE to be at the mercy of every single piece of proprietary SOFTWARE to be able to function?! (DLSS + FG + RR + ... ?)
In Cyberpunk 2077, the problem is that RT Ultra isn't path tracing; RT Overdrive is path tracing. I don't know that I've looked at it more recently, but when it first came out, I ran these benchmarks:

[Cyberpunk 2077 path tracing benchmark chart]

I *think* AMD has improved a bit since then in CP77, but the 7900 XTX is still about a third of the 4090 performance for full path tracing in games like CP77 and AW2.
 

mhmarefat

Distinguished
Jun 9, 2013
67
77
18,610
Yes that delta is quite real. The NVIDIA series stomps all over AMD when it comes to Ray Tracing. It is indeed over twice as fast, and the 7000 series (RDNA 3) only had minor improvements from the 6000 series (RDNA 2) Basically when it comes to ray tracing RDNA3 ~= 2080ti. Yes AMD has more memory, and slightly faster rasterization at similar price points. But to be 2 generations behind the competition when ray tracing is taking off is not acceptable IMHO. The card is no longer balanced in performance features.
No, that delta is by design, not real. The game is designed to perform poorly on AMD hardware: both CP 2077 and AW2 carry a similar penalty on the 40 series, yet they differ hugely on AMD hardware. Makes no sense.

It seems you didn't even bother to look at the 3rd image of benchmarks (if you even read this article at all) if you put RDNA 3 equal to RTX 20:
[RT performance benchmark chart]

The $500 7800 XT has almost the same "RT performance" as the $600 4070, though both are garbage. The only RT card that really exists is the 4090, but its price makes it completely irrelevant for the majority of gamers. RT has been used as an excuse for the greedy animals at Nvidia to charge as much as they want. RT itself is total BS, as shown game after game.
 

DavidLejdar

Respectable
Sep 11, 2022
286
179
1,860
Somewhat tempting (in the context of upgrading to 4K gaming). But then again, the rumour mill seems to have it that RDNA 4 will come later this year with a sub-$600 GPU whose (rasterization) performance is almost on par with the RTX 4080 Super.

And such a GPU will potentially already make use of PCIe 5.0 lanes, which would not be an issue for my MB. It did cost me a bit more (as Tom's Hardware keeps reminding us when talking about AM5), but it already supports enough PCIe 5.0 lanes for an NVMe SSD as well, while Intel's latest still goes with only 16 PCIe 5.0 lanes - so basically, money saved, as I don't need a second MB upgrade.

Not counting on the rumours. But I suppose by now I may as well hold out, and also check the Ryzen 9xxx offerings, apparently starting in April.
At $330 the RX6700XT ASRock Challenger (12GB VRAM) is a great value. It has performance about equal to the 4060Ti 16GB that starts $100 more expensive. I got one last year and I get 85 FPS in Red Dead Redemption 2 (no FSR 2) at 3440x1440 resolution on highest settings. Borderland 3 I get 82 FPS at the same settings and resolution without using FSR. Note that the 7600XT costs the same as the 6700XT. Only advantage the 7600XT has is 4GB more VRAM but it is still 10% slower on average than the 6700XT.
Yeah, I can't complain either. The 6700 XT is plenty good for 1440p gaming. And there's a bit more choice among 1440p monitors when looking for a not-too-expensive one with 120+ Hz.

The ray-tracing performance as such isn't great. But in games such as Metro Exodus, there are other things to focus on than sightseeing all the time. :) In any case, if one isn't determined to go "4K gaming, and nothing else", there are plenty of options for a rig (even if the top GPUs come with a high price point - though they are way over the top for, say, 1080p gaming these days).