Nvidia GeForce RTX 4080 Super review: Slightly faster than the 4080, but $200 cheaper

Nvidia: How do we turn a bad media image into a good one?
Problem: People don't like 4080 prices
Solution: Ignore users' complaints for two years. Stop making as many 4080s so the price matches supply.
Wait two years and re-release the same card with a new label.
Profit?

At the very least it gets rid of more 4080 stock and shuts down some of the complaints.
This is hilarious. They might as well have sold an Aldi version.
 
I'm sure there are use cases for 4K in productivity, but for gaming it just seems to make little sense. This is probably why QHD is gaining so much traction these days.
I find that ultrawide 1440p is more useful for productivity work than a plain 4K monitor. I use mine for gaming and work. It is basically like having a dual-monitor setup without the bezel. My sister just started working from home and asked me what she should get for a monitor. I immediately suggested an ultrawide, and so far she has been quite happy with it.
 
TH neglected to include the 3080 in their tests, so I'll include two charts from TPU's tests, which do include it. As expected, the 4080S falls right in line with the 4080 in average performance with and without ray tracing, and both are about 50% faster than the 3080. While a 50% performance gain is quite nice (it allows 4K75 on average without "AI" tricks, again like the regular 4080), I'd say you were too generous to give it 3.5 stars considering the previous generation 3080 MSRP'd for $799 (the fact that this card MSRPs for $200 more than the 3080 isn't listed as a con, though other sites did list it), and that some models, like the TPU-tested MSI 4080S Expert, go for $1,150. I'd go 1.5-2 stars instead due to the price, the lack of a VRAM upgrade at a four-figure price, and the lack of a performance increase despite being a "Super" variant.

[TPU chart: average FPS at 3840x2160]

[TPU chart: relative RT performance at 3840x2160]


I mean, I think we all know WHY Nvidia didn't put 20-24GB of VRAM on it: product segmentation. If you need the VRAM for professional use, why buy a much more expensive 4090 when a 4080S would suffice? But still, $1,150 (or even $1,000) for 16GB of VRAM is just an insult (not saying it's needed NOW, but how well will it hold up in two years?).
 

It's also wildly different from Jarred's Alan Wake 2 chart at 4K with RT on.
I'd like to know what caused such massive differences between the TPU and TH reviews.
(I'm not sure why I quoted you, specifically. Sorry. : P)

I used DLSS / FSR Quality mode upscaling, so my "4K" chart is really 1440p upscaled, and my 1080p chart is 720p upscaled. I figured if anyone is even going to attempt to play Alan Wake 2 at 4K with path tracing, they will absolutely have to turn on upscaling.
 
The only RT card that really exists is the 4090, but its price makes it completely irrelevant for the majority of gamers. RT has been used as an excuse for the greedy animals at Nvidia to charge as much as they want. RT itself is total BS, as shown game after game.

Is that what you said when gaming transitioned from 2D to 3D? Or when anti-aliasing was introduced? Or when we got shaders? HDR rendering? Tessellation? Subsurface scattering?

If the rapid evolution of PC gaming graphics disturbs you so much, why not just buy a console and be done with it? You can get a PS5, Xbox Series X, or Switch for far less than an RTX 4090. People who spent that kind of money obviously want to see a return on their investment. Besides, nobody really needs an RTX 4090. Cyberpunk 2077 and Alan Wake II will run on weaker GPUs if you're willing to lower the graphics settings. And if you're patient enough, they will run with path tracing at 4K at acceptable performance on your budget RTX 7050 when it releases in ~5 years.

What are you complaining about? That you don't get to enjoy new technologies that improve graphics quality without having to spend money on a new GPU? Welcome to the real world.
 
I said I'd buy the 4080 non-Super for $750 or so, and I think I'd pay around $750 for this Super variant too. As that is not happening, it appears I am once again holding off on my upgrade for now.
 
Appreciate the review, even though the graphics market has been a downer lately. Would love to see some tier comparisons going back to the 20 series just to see how things stand with today's drivers and platforms, but I get that's a pretty lengthy endeavor.
The 4070 Super loses 92% of its performance when you turn RT on. Where did the dedicated "RT hardware" go?!
Seeing as all four of those bottom cards tanked because they ran out of VRAM at 4K + RT, your point isn't as clever as you seem to think.
I thought healthy competition between companies meant the customer wins. This proves that's not the case. They do just enough to edge out the competition when they could do soooo much more for the customer.
There's only healthy competition if the companies are actually competing. Right now AMD does not seem concerned about gaining market share in the consumer space, so they're content pricing around Nvidia. If Intel can compete at a higher performance level without blowing the power budget, and decides it wants consumer market share, things may change after their next product lands.

It's also wildly different from Jarred's Alan Wake 2 chart at 4K with RT on.
I'd like to know what caused such massive differences between the TPU and TH reviews.
(I'm not sure why I quoted you, specifically. Sorry. : P)
If I had to guess, it's probably some sort of upscaling settings disparity (there are also one or two settings that go to ultra, so maybe they're set to high here vs. ultra there). Comparing CP2077 results, both sites are pretty much within margin of error of one another.
considering the previous generation 3080 MSRP'd for $799
*$699
I mean, I think we all know WHY Nvidia didn't put 20-24GB of VRAM on it: product segmentation. If you need the VRAM for professional use, why buy a much more expensive 4090 when a 4080S would suffice? But still, $1,150 (or even $1,000) for 16GB of VRAM is just an insult (not saying it's needed NOW, but how well will it hold up in two years?).
Technically 16GB can be had for $800 now from Nvidia (I don't count the 4060 Ti band-aid) with the 4070 Ti Super! I do agree with your point though; VRAM is why I didn't buy a 3080 at launch, and by the time prices started getting back to sane levels I got the 12GB model instead.
 
Is that what you said when gaming transitioned from 2D to 3D? Or when anti-aliasing was introduced? Or when we got shaders? HDR rendering? Tessellation? Subsurface scattering?

If the rapid evolution of PC gaming graphics disturbs you so much, why not just buy a console and be done with it? You can get a PS5, Xbox Series X, or Switch for far less than an RTX 4090. People who spent that kind of money obviously want to see a return on their investment. Besides, nobody really needs an RTX 4090. Cyberpunk 2077 and Alan Wake II will run on weaker GPUs if you're willing to lower the graphics settings. And if you're patient enough, they will run with path tracing at 4K at acceptable performance on your budget RTX 7050 when it releases in ~5 years.

What are you complaining about? That you don't get to enjoy new technologies that improve graphics quality without having to spend money on a new GPU? Welcome to the real world.
RT should be mainstream when it is ready, and as you said, that might take another 5 years. For the past 6 years it has not been ready unless you paid $2k or were happy with 30 to 60 FPS gaming (the state of PC gaming in 2024, according to Nvidia).
The fact that you are comparing RT off/on to 2D/3D leaves me simply speechless.
In the "real world," if people try to sell items above their real value, they are considered frauds and cheats and treated as such, something Nvidia has been able to get away with for years.
 
I had a 4K screen 8 years ago and swapped to 2K 3 years ago... Most of the time I had 4K I couldn't run games at that resolution anyway, and Windows defaults 4K screens to 150% scale, so it's essentially running as 2K anyway.
I swapped so I could run the screen at native resolution and not have to deal with the few applications I had that would run their UI at native scale on a 4K screen... Logitech's software was one; I couldn't read the interface.

You don't have to use Windows' default scaling. How long does it take to right-click on your desktop, choose display settings, and change the scaling option to 100% one time? You bought a different monitor so you didn't have to do that?
 
You bought a different monitor so you didn't have to do that?
It wasn't the only reason, just the one that comes to mind. It was a 27-inch 4K display, and I upgraded to a bigger screen size but lower resolution. The 4K screen was 5 years old and in need of a replacement. I had a new PC, and I wanted to make everything else new too.

One of my older games would play at native resolution, and it actually made the game harder: instead of just making everything look better, it expanded the play area, meaning things that normally couldn't see you, could. In an ARPG you don't necessarily want to fight everything, and the extended draw distance also meant enemies could see further. Harder to avoid fights.
 
RT should be mainstream when it is ready, and as you said, that might take another 5 years. For the past 6 years it has not been ready unless you paid $2k or were happy with 30 to 60 FPS gaming (the state of PC gaming in 2024, according to Nvidia).
The fact that you are comparing RT off/on to 2D/3D leaves me simply speechless.
In the "real world," if people try to sell items above their real value, they are considered frauds and cheats and treated as such, something Nvidia has been able to get away with for years.

So, if it can't successfully happen overnight, it shouldn't happen at all?

That's no way to go forward.

Every technology was expensive to render and GPU taxing when it was new. That includes RT.

And what makes you think ray tracing isn't ready? Because two games decided to crank all parameters to beyond the current limits?

The neat thing about ray tracing is that you can scale its settings to infinity if you want. 5 bounces and 3 rays per pixel isn't pretty enough? How about 1,024 rays per pixel and 65,000 bounces per ray? I guarantee you won't find a GPU that can run those settings at over 1 FPS for many, many years.
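
To put some rough numbers on that scaling argument, here's a back-of-the-envelope sketch (my own illustrative math in Python, not tied to any particular engine or preset) of how the per-frame ray workload explodes at 4K as you crank those two knobs:

```python
# Rough, illustrative math only -- real renderers vary widely in per-ray cost.
def rays_per_frame(width, height, rays_per_pixel, max_bounces):
    """Upper bound on ray segments traced for a single frame."""
    return width * height * rays_per_pixel * max_bounces

modest = rays_per_frame(3840, 2160, 3, 5)           # ~124 million ray segments
extreme = rays_per_frame(3840, 2160, 1024, 65_000)  # ~5.5e14 ray segments

print(f"modest preset:  {modest:,} rays/frame")
print(f"extreme preset: {extreme:,} rays/frame ({extreme / modest:,.0f}x the work)")
```

The "extreme" preset works out to roughly 4.4 million times the work of the modest one, which is the point: there will always be a setting no GPU can brute-force.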

Ray tracing is here to stay, whether you like it or not. Why? Because it greatly simplifies and accelerates game development. The improved visuals are just a nice-to-have side effect. Literally the only reason games don't require RT yet is that the industry still wants to earn money from those who can't afford a shiny new GPU (yet).

In "real world" if people try to sell items at above their real prices, they are considered a fraud and a cheat and are treated as such, something nvidia has been able to get away with for years.

What exactly is the fraud here?

It's not like Nvidia's cards don't support RT. The potential is there. It's up to game developers to find ways to meaningfully implement it into modern gaming.
 
As an ultrawide user, I agree.

I saw the tax for 4K gaming and it wasn't worth it. And even the mighty 4090 still struggles in some games at 4K.

For me personally, I prefer 1440p UW at 100-144 fps over 4K at 60.

Higher FPS matters more to me on a 144Hz display.
Indeed.
I actually "downgraded" from 4K@60Hz to an OLED Ultrawide 3440x1440 @ 175Hz... no regrets whatsoever.
Next big update will be when 5K Ultrawides, of OLED level picture quality, are widely available (and maybe a 6080/90 or something).
 
I'm confused as to the mental games Nvidia plays with its customers.

Why not put a 4080 label on a $750 card?
Why not put a 4070 label on a $500 card?

I'm sure all of you would respond "because money"

But what I'm saying is this: what the 4080 is should have been labeled a 4090, and the current 4090 renamed a Titan or 4100.

Profits remain the same. Reviewers could compare the 3080 to the 4080 and say "the price is justifiable for a small boost within the performance class." While the boost would be disappointing, at least there would still be goodwill for not trying to force gamers into a higher budget they clearly can't afford.

The only flaw here is the lower end, like the 4060 and 4060 Ti, which is HORRIBLY priced compared to the previous gen and its performance. Nvidia will have to accept the fact that their customer base can't afford $500 for a subpar card. They need to make a $300-$350 60-class card that doesn't regress in performance.

I think Nvidia saw all the fat profits during the crypto boom and assumed a lot of the people buying cards were gamers, not miners who could offset the cost.

I can also guarantee you this: all Blackwell cards will have much stronger AI performance and more memory. They will push the bleeding edge of what the State Department will allow to be exported for AI tech. Basically, they will aim for another crypto-style boom with AI and give the average man the middle finger.
 
Sounds like you are not the target market for this product, considering you are on a 7-year-old midrange GPU.
No, I'm not the target for this card. That's why I said I'd only consider an RX 7600 XT with 16GB of VRAM.
I also went online and saw 5 different rigs running 19 different games, and most of the games were using more than 8GB of VRAM on the RX 7600 XT. Most were using about 10GB. The 4 other rigs that have 8GB of VRAM were doing fine and not running out of VRAM.
I'm not saying the RX 7600 XT is better than the others, but it is using up to the 16GB of VRAM that the card has. The only games using less than 8GB of VRAM were older games that don't need it.
But the newer games were using more than 8GB of VRAM. So saying the RX 7600 XT doesn't use all that VRAM is not true.
 
No, I'm not the target for this card. That's why I said I'd only consider an RX 7600 XT with 16GB of VRAM.
I also went online and saw 5 different rigs running 19 different games, and most of the games were using more than 8GB of VRAM on the RX 7600 XT. Most were using about 10GB. The 4 other rigs that have 8GB of VRAM were doing fine and not running out of VRAM.
I'm not saying the RX 7600 XT is better than the others, but it is using up to the 16GB of VRAM that the card has. The only games using less than 8GB of VRAM were older games that don't need it.
But the newer games were using more than 8GB of VRAM. So saying the RX 7600 XT doesn't use all that VRAM is not true.

Yes and no. With a few rare exceptions, 8GB at 1080p is fine. You can run 2K and hit the bottleneck a lot more, but the 7600 is not meant for 2K gaming. 2K gaming is 7700 XT and 7800 XT class stuff.

Will that hold in the future? Hard to say. But those few games that need more than 8GB at 1080p also chew through raster performance, not just memory. As the 7600 is an entry-class raster card, you're f'd anyway. And the new UE5 game engines just eat that stuff up with no apologies. It does NOT scale well on lower-end hardware.

Btw: a game as released might consume, say, 6GB of memory. Then when a game-ready driver comes out, you see it increase to 8GB. You see, Intel, AMD, and Nvidia have tools that let them see which resources, shader programs, and more a game is using, and at what percentage of use. They can hand-pick or even replace shader programs and textures to be better optimized for a given architecture, and then force these optimized parts to stay in GPU memory where the GPU can get quick access to them.

For example, AMD might see there are 16 FMA (fused multiply-add) ops in a shader right in a row, but their pipeline might only allow 4 FMA ops at a time. However, they have 4 larger, wider vector units that can each take 4 FMA ops at the same time (4 units x 4 FMA ops each = 16). This means all 16 can be executed in a few clock ticks, more efficiently. They can overwrite the less efficient shader written by the developer and force the replacement to be cached in GPU memory. The game-ready driver is responsible for this.
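
Here's a toy sketch of that packing idea in Python/NumPy, purely illustrative (real drivers do this at the shader-compiler/ISA level, not in Python):

```python
import numpy as np

# Toy illustration of the packing idea above (not real driver or shader code):
# 16 independent fused multiply-adds, d[i] = a[i]*b[i] + c[i].
a, b, c = (np.random.rand(16) for _ in range(3))

# "Scalar" view: 16 separate FMA issues, one result at a time.
scalar = np.array([a[i] * b[i] + c[i] for i in range(16)])

# "Vectorized" view: the same 16 FMAs packed into 4 lanes of width 4,
# so 4 wide units each retire 4 FMAs per issue instead of 1.
packed = (a.reshape(4, 4) * b.reshape(4, 4) + c.reshape(4, 4)).ravel()

assert np.allclose(scalar, packed)  # same math, far fewer issue slots
```

Same results either way; the replacement shader just gets through the work in fewer issue cycles, which is the kind of substitution a game-ready driver can make.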
 
The cost of putting 32GB instead of 16GB is probably $70, maybe $100 at most.

Nvidia does have such cards. They're called RTX 5000 Ada, and use AD102 instead of AD103. The cost starts at $4,000.
Also, the memory clock on those cards is slightly lower, possibly having to do with the fact that they need 2x as many GDDR chips.

One thing I find very interesting right now is that Nvidia doesn't offer a professional "RTX xxxx Ada Generation" card that uses AD103. I'm not sure why that is, but certainly there's room for something between the 4500 and 5000 cards.
I read something interesting about why the original RTX 4080 was such a relatively poor value. Apparently, its tensor cores weren't nerfed in the same way as the RTX 4090's, making it more interesting for training smaller AI models. If true, it could've hurt the value proposition for a pro card. I wonder if that's still true of the RTX 4080 Super.

Another possible explanation is that maybe the AD103 had some issue where its memory controller couldn't drive the extra chips needed for the double-capacity DRAM configuration. That seems unlikely, however.
 
IMO ultrawide 1440p monitors are more useful overall than a standard 4K monitor. Every person I know who has gone to an ultrawide will not go back to a standard 16:9 or 16:10 monitor. The additional width is so nice for both productivity and gaming.
Eh, I have a 32" 4k monitor at work and 2x 27" 1440p monitors at home. The dual-monitor setup has certain advantages, regarding window-positioning. I tend to find that I arrange my windows a bit differently on each setup.

One thing that's nice about the 4k monitor is viewing email and basically anything where you want a lot of vertical resolution. When I open MS Outlook on that monitor, it feels like God Mode.

P.S. I use 100% font scaling on both setups. I don't let any pixel go to waste!
; )
 
With that said though, it is annoying to have games that don't account for the fact that you have more VRAM than the game requires. There is nothing I hate more, when it comes to image quality in games, than texture pop-in when my system is way overkill for said game, even at the highest in-game settings.
Games don't really need to "account for more VRAM"; they just need to be written to only evict resources from VRAM when they're no longer needed or when the OS/API asks to free up VRAM, whatever the VRAM amount may be. Kind of like how the Windows file system cache keeps everything in spare RAM until RAM is full or the cache needs to be freed up for something else.
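
As a minimal sketch of that policy (hypothetical Python class and names, not tied to any real graphics API), the whole idea is just "keep everything resident until a budget or pressure signal forces eviction, then drop the least recently used resources first":

```python
from collections import OrderedDict

class VramResidencySketch:
    """Toy model: keep resources resident until a budget (or an OS/API
    pressure signal) forces eviction, then evict least-recently-used first."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # name -> size, least recently used first

    def touch(self, name, size):
        """Call whenever a resource is needed this frame."""
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        # Only evict when the new resource would not fit in the budget.
        while self.used + size > self.budget and self.resident:
            _, old_size = self.resident.popitem(last=False)  # drop LRU entry
            self.used -= old_size
        self.resident[name] = size
        self.used += size

cache = VramResidencySketch(budget_bytes=16 * 2**30)  # pretend 16GB card
cache.touch("rock_albedo_4k", 85 * 2**20)             # stays resident until pressured out
```

A card with more VRAM just means the eviction branch fires later; no per-capacity special-casing is needed in the game.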
 
Games don't really need to "account for more VRAM"; they just need to be written to only evict resources from VRAM when they're no longer needed or when the OS/API asks to free up VRAM, whatever the VRAM amount may be. Kind of like how the Windows file system cache keeps everything in spare RAM until RAM is full or the cache needs to be freed up for something else.
OK, but I just feel like games should be coded to take advantage of extra resources to prevent pop-in, especially if there is enough headroom to do it without dropping below the monitor's native refresh rate. The way I think of it is like being able to run a game at more than 60 fps despite only having a 60Hz monitor. Maybe it's something that could be done with AI/ML: optimizing graphics settings on a per-game basis (maybe through the driver) to deliver the best image quality at (hopefully) native resolution at the target frame rate. Maybe something like this already exists, I don't know. I know there are guides on how to do it manually, but AFAIK there is nothing that can do it automatically.
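
For what it's worth, the core loop of such a tuner doesn't need AI/ML at all; a crude feedback controller gets most of the way there. A hypothetical sketch (preset names, thresholds, and sample FPS values are all made up):

```python
# Hypothetical sketch of the per-game auto-tuning idea above; nothing like
# this is claimed to exist as a driver feature today.
PRESETS = ["low", "medium", "high", "ultra"]

def adjust_preset(current_idx, measured_fps, target_fps, margin=0.1):
    """Step the quality preset down if we miss the target, up if there's headroom."""
    if measured_fps < target_fps and current_idx > 0:
        return current_idx - 1                   # drop quality to hit the target
    if measured_fps > target_fps * (1 + margin) and current_idx < len(PRESETS) - 1:
        return current_idx + 1                   # spare headroom, raise quality
    return current_idx

idx = PRESETS.index("ultra")
for fps in (48, 55, 70, 66):                     # pretend benchmark samples, 60 FPS target
    idx = adjust_preset(idx, fps, target_fps=60)
print("settled on:", PRESETS[idx])               # -> "high"
```

A real version would tweak individual settings by their quality-per-frame-time cost rather than whole presets, but the feedback loop is the same.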
 
It wasn't the only reason, just the one that comes to mind. It was a 27-inch 4K display, and I upgraded to a bigger screen size but lower resolution. The 4K screen was 5 years old and in need of a replacement. I had a new PC, and I wanted to make everything else new too.

One of my older games would play at native resolution, and it actually made the game harder: instead of just making everything look better, it expanded the play area, meaning things that normally couldn't see you, could. In an ARPG you don't necessarily want to fight everything, and the extended draw distance also meant enemies could see further. Harder to avoid fights.
27" is too small for a 4k screen. Fine for 1440p, but I wouldn't go smaller than 32" for a 4k screen. If you have to scale, you end up wasting most of the gained screen real estate as you said.
 
27" is too small for a 4k screen. Fine for 1440p, but I wouldn't go smaller than 32" for a 4k screen. If you have to scale, you end up wasting most of the gained screen real estate as you said.
I second this, and third and fourth it as well! Because I have five 4K displays floating around my office, all hooked up to different PCs. Four of them are 27-inch, I'm pretty sure, possibly 28-inch. Only one is 32-inch. I love that 32-inch display (except for the curve, which I would happily do without).
 
OK, but I just feel like games should be coded to take advantage of extra resources to prevent pop-in
They should, and keeping tabs on stale resources to decide what can be dumped to free up memory, and pre-loading stuff more likely to be called on soon, isn't horribly difficult either. But it is extra work, and predicting wrong will still cause pops.
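
A minimal sketch of the "predict what to pre-load" half (made-up heuristic and asset names, not any engine's actual streaming system):

```python
# Toy prefetch heuristic: stream in assets anchored near the player, nearest first.
def assets_to_prefetch(player_pos, assets, radius):
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    nearby = [(dist2(player_pos, a["pos"]), a["name"]) for a in assets]
    return [name for d, name in sorted(nearby) if d <= radius ** 2]

world = [
    {"name": "castle_lod0", "pos": (120, 0, 40)},
    {"name": "forest_pack", "pos": (30, 0, 10)},
    {"name": "dungeon_set", "pos": (800, 0, 500)},
]
print(assets_to_prefetch(player_pos=(25, 0, 5), assets=world, radius=200))
# -> ['forest_pack', 'castle_lod0']; 'dungeon_set' is skipped, and if the player
#    suddenly fast-travels there, that wrong guess is exactly the pop-in mentioned above.
```

The hard part isn't the bookkeeping; it's that any predictor will sometimes be wrong, and fast travel, cutscene jumps, and quick camera spins punish those misses.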

27" is too small for a 4k screen. Fine for 1440p, but I wouldn't go smaller than 32" for a 4k screen. If you have to scale, you end up wasting most of the gained screen real estate as you said.
I personally like text fonts with enough pixels to have some curviness to them, instead of the bare minimum of pixels necessary to make them readable.
 
By what reasoning?!? It's literally the price of a PS5 plus an okay work laptop. Can I afford it? Yes. Can I justify it? No, no, and no again.

I never thought I'd see the day that my other hobby (running a car for track days) is cheaper than gaming on a PC. A full set of slicks plus 3 track-day entries, or a GPU? Hmmm, let me think about that real quick... PC gaming does not exist in a vacuum; I'll happily become a full-time console peasant and go do a whole bunch of other stuff with my money.
The GPU will last you much longer than your 3 track days. This is not a good comparison. Try again.

And remember, no one is forcing you to upgrade.