News Nvidia Benchmarks Show 4080 12GB Up to 30% Slower Than 16GB Model

blacknemesist

Distinguished
Oct 18, 2012
Damn, the 4080s really pale in comparison to the 4090.
This really is the generation where diminishing returns are not a thing (apparently; still waiting on AMD and real 4080 benchmarks rather than selective ones).
Quite odd that MSFS is the only title where the 4080 16GB gains more over the 12GB than the 4090 gains over the 4080 16GB: 26% and 16% respectively, or 0/26/16. Maybe the CPU is the cap?
 
MSFS is 100% CPU limited on a 4090, even at 4K. DLSS 3 doubles the performance because of this.
 
Hm... I'll nitpick this, sorry: "DLSS 3 doubles the performance". No... Interpolation in any form doesn't increase "performance"; it just hides bad/low framerates using the GPU's spare headroom, and that's all. I get the idea behind simplifying it, but don't give nVidia's marketing any funny ideas, please. EDIT: No ascii faces; sadge.
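To put rough numbers on that distinction (hypothetical figures, purely to illustrate; DLSS 3's frame generation inserts one generated frame between each pair of rendered frames):

```python
# Hypothetical numbers, just to illustrate the nitpick above.
rendered_fps = 60                   # frames the engine actually renders
displayed_fps = rendered_fps * 2    # frame generation roughly doubles what's shown

# Input latency is still tied to the rendered frames, not the displayed ones.
engine_frametime_ms = 1000 / rendered_fps

print(f"Displayed frame rate: {displayed_fps} fps")   # what the fps counter shows
print(f"Rendered frame rate:  {rendered_fps} fps")    # what the game 'feels' like
print(f"Engine frame time:    {engine_frametime_ms:.1f} ms (unchanged)")
```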

As for the reporting itself... Well, nVidia thinks it's better to justify an 80-class card at $900 than a 70-class one, so nothing much to do there. That being said, it would've been nice to actually see a 4070 with 12GB beat the 3090 Ti for a repeat of the 2080 Ti memes. Such a missed opportunity, nVidia. For reals.

Looking forward to the power numbers, as I suspect these two SKUs, no matter what nVidia decided to call them, will be quite good performers per watt (efficient?). Not sure if they'd be justified price-wise, but we'll see. Plus, these are the ones most people (maybe, kinda?) can actually get xD

Regards.
 
Now the only thing Nvidia needs to drop is the charade that there is such a thing as a 4080 12GB. A 4070 by any other name is still overpriced.
In my opinion it's more like 4080 12GB = 3060 tier, 4080 16GB = 3070 tier, 4090 24GB = 3090 tier. This is on the premise that the elephant in the room, the 4090 Ti, is 15%+ more powerful than the 4090. I support this conclusion by comparing die sizes with the 3000-series equivalents, plus the performance disparity between the 4080 16GB and the 4090:

- 4080 12GB: 295 mm², vs. 276 mm² for the 3060
- 4080 16GB: 379 mm², vs. 392 mm² for the 3060 Ti / 3070
- 4090: 608 mm², vs. 628 mm² for the 3080 Ti / 3090 / 3090 Ti

There is plenty of room for a 4080 Ti card and a 4090 Ti card.
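A quick sanity check of those pairings (a minimal Python sketch; die areas as quoted above, and the tier pairings are the post's argument, not anything official):

```python
# Die areas in mm², as quoted above; each Ada die vs. the Ampere die
# of the tier the post argues it actually belongs to.
pairs = {
    "AD104 (4080 12GB) vs GA106 (3060)":                (295, 276),
    "AD103 (4080 16GB) vs GA104 (3060 Ti / 3070)":      (379, 392),
    "AD102 (4090) vs GA102 (3080 Ti / 3090 / 3090 Ti)": (608, 628),
}
for name, (ada, ampere) in pairs.items():
    print(f"{name}: {ada / ampere:.2f}x the area")  # each within ~7% of its pair
```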
 
Seems we have already been conditioned to pay double for similar-tier parts compared to 7+ years ago. Gone are the days of getting an XX70-tier part like the 970 or 1070 for less than 400 dollars. Now we get similar-tier parts for 700+ dollars. It's like Moore's Law has infected not only the performance but the price too.
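Rough launch MSRPs from memory (approximate figures, so treat them as ballpark), just to put a number on "pay double":

```python
# Approximate launch MSRPs in USD, from memory -- not verified figures.
msrp = {
    "GTX 970 (2014)":       329,
    "GTX 1070 (2016)":      379,
    "RTX 4080 12GB (2022)": 899,
}
for card, price in msrp.items():
    ratio = price / msrp["GTX 1070 (2016)"]
    print(f"{card}: ${price} ({ratio:.2f}x the 1070)")  # the 4080 12GB lands at ~2.4x
```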
 
Well, regarding that specific bit about Moore's Law: Intel was very explicit in telling nVidia/Jensen to STFU with that, no?

And this is not their first rodeo with the die naming/size and card naming relative to pricing tiers. They know they can get away with it.

I think it was Gordon from PCWorld* who said it best: "as long as you keep buying these at those price points, whining won't do diddly squat". And he's absolutely right. It appears to me there are plenty of people with enough money to just accept these price hikes like nothing, as nVidia keeps doing it, so... Not sure what else to say or do here. Sadge.

Regards.
 

Giroro

Splendid
Honestly, these cards could offer 10x the performance of the previous gen and I still wouldn't care, because of the pricing.
Nvidia's profit margins increase every generation. The obvious monopoly price gouging is way past the breaking point for me.

[Attached image: evgagraph.png]
 

atomicWAR

Glorious
Ambassador
So far this generation feels like a really bad joke. The 4080 16GB is underperforming by a good 9-12%. The 4080 12GB isn't even an 80-class card; it's a 70-class card. Which means every SKU below it got bumped up a tier too: 50→60 class, 60→70 class, and, as discussed, 70→80 class.

Insanely, Nvidia made the RTX 4090 the best price-to-performance card this gen. Nvidia is leaving a bad taste in my mouth. I might just suffer AMD drivers just to make a point this gen, depending on RDNA 3 pricing/performance. Otherwise, if this becomes the new norm in PC gaming... I might not be a PC gamer much longer. I'd hate to see tech greed kill PC gaming. But it might happen at this rate...
 
In my opinion it's more like 4080 12GB = 3060 tier, 4080 16GB = 3070 tier, 4090 24GB = 3090 tier. This is on the premise that the elephant in the room, the 4090 Ti, is 15%+ more powerful than the 4090. I support this conclusion by comparing die sizes with the 3000-series equivalents, plus the performance disparity between the 4080 16GB and the 4090:

- 4080 12GB: 295 mm², vs. 276 mm² for the 3060
- 4080 16GB: 379 mm², vs. 392 mm² for the 3060 Ti / 3070
- 4090: 608 mm², vs. 628 mm² for the 3080 Ti / 3090 / 3090 Ti

There is plenty of room for a 4080 Ti card and a 4090 Ti card.
Those die sizes are correct, but the same size now basically costs double what it did before. So from that perspective, plus the massive increase in transistor density, bumping things up a tier for a roughly equivalent size makes sense. Nvidia should get 120~142 potential GA104 chips per wafer (for the 3070), versus 170~193 AD104 chips per wafer (RTX 4080 12GB, and probably the 4070 as well). Assuming TSMC 4N wafers cost twice as much (possibly more), that works out to 50% more per AD104 chip. And Nvidia is NOT going to eat that difference if the new part is faster; it will raise prices.

What's interesting is that, even if we assume a price of $15,000 per 4N wafer, that's still only about $78~$88 per chip. And GA104 chips on a $4,000 wafer are potentially just $28~$33. Of course, there are other costs (packaging, testing, etc.) that those prices probably don't include. Still, the actual silicon cost likely isn't all that high. Hell, even for AD102, at 80~89 chips per wafer (some may be bad), that would be $169~$188 per chip. 😯
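For anyone who wants to redo that arithmetic, a minimal sketch using the chips-per-wafer ranges and wafer prices assumed above (the wafer prices are this post's assumptions, not confirmed foundry figures):

```python
# Per-chip silicon cost = wafer price / usable chips per wafer.
# All inputs are the assumptions quoted above, not confirmed pricing.
scenarios = [
    ("GA104 on a $4,000 wafer",  4_000, 120, 142),
    ("AD104 on a $15,000 wafer", 15_000, 170, 193),
    ("AD102 on a $15,000 wafer", 15_000, 80, 89),
]
for name, wafer_usd, low, high in scenarios:
    print(f"{name}: ${wafer_usd / high:.0f} to ${wafer_usd / low:.0f} per chip")
```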
 

TheOtherOne

Distinguished
Oct 19, 2013
I got my 3070 close to retail price after two months on a waitlist. I think I can safely skip this gen's nVidia garbage. Fingers crossed AMD or maybe even Intel brings some real competition, and in 2-3 years we finally get GPUs with sensible prices. :sleep:
 

Co BIY

Splendid
This article headline implies that the performance difference is attributable to the RAM difference only.

Maybe for clarity we should all agree that the 4080 12GB should be referred to as the 4080 12GB (Crippled) or the 4080 12GB (80% CUDA).

This paragraph (para #10) needs to be the #1 paragraph every time these cards are discussed, because otherwise the key information isn't available when the reader needs it.

While Nvidia's GeForce RTX 4080 12GB and GeForce RTX 4080 16GB carry the same model number, they are based on completely different graphics processors (AD104 vs. AD103) and feature different CUDA core counts (7680 vs. 9728). When combined with distinctive memory bandwidth (504 GB/s vs. 717 GB/s), the two models offer very different performance levels than you would typically expect from graphics boards carrying the same model number.

Yes, this is all Nvidia's fault and not the fault of the writer.
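Putting rough numbers on that quoted paragraph (a quick sketch using only the specs quoted above):

```python
# Spec gaps between the two cards Nvidia calls "RTX 4080",
# using the figures from the quoted paragraph.
specs = {
    "CUDA cores":              (7680, 9728),
    "Memory bandwidth (GB/s)": (504, 717),
}
for name, (gb12, gb16) in specs.items():
    print(f"{name}: the 16GB model has {gb16 / gb12 - 1:.0%} more")  # 27% and 42%
```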
 

Co BIY

Splendid
Does the existence of the ReBAR capability make VRAM amounts more important when pushing against CPU limits (or otherwise)?

We have seen how important ReBAR is for the Intel Arc cards. Nvidia cards (IIRC from the 30 series) use the same capability, but has the lack of ReBAR been tested as clearly?

Is this important for performance on older Intel CPUs (pre-10th gen) and AMD CPUs?
 

LolaGT

Reputable
Oct 31, 2020
I might just suffer AMD drivers just to make a point this gen
You'll find there is nothing very bad to suffer through; all those troubles from 2020 and earlier have been mitigated very well or mostly eradicated.
I went from team green to red right before the crypto craze, and this 5700 XT has been nothing but a peach.
 

BeedooX

Reputable
Apr 27, 2020
So what I'm hearing is AMD saying they might not quite beat the 4090 overall, and public murmurings that there's a big gap between the 4090 and 4080 class cards.

If true, the 4090 was built as a halo product to prove a point! In the lower tiers, AMD may obliterate the 4080 class cards...
 
Oh well, I was already thinking about going with the RX 6700 or 6800 (way less expensive in my country than nVidia) a few months ago, so I don't mind waiting for RDNA 3, checking benchmarks, power consumption, and the price/performance/frame-rate picture, and then seeing what it will be.

No need to rush for me.
 

Awev

Reputable
Jun 4, 2020
Saying that product A is 30% slower than product B is not the same as saying product B is 30% faster than product A. If we start with product B at 100 (just to make the math easy) and make product A 30% slower, product A is only producing 70. Now if we take product A at 70 and make it 30% faster, we end up at 91, which is not the 100 we started with, even though product A hasn't changed from 70.
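The same arithmetic in a few lines:

```python
b = 100.0                  # product B as the baseline
a = b * (1 - 0.30)         # product A is 30% slower -> 70.0

print(f"{a * 1.30:.1f}")   # 91.0 -- 30% faster than A still doesn't reach B
print(f"{b / a - 1:.1%}")  # 42.9% -- the gain A actually needs to match B
```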

I was planning on sitting out this generation, but one of my cats forced the issue earlier this year when it spilled a glass of water on my desk, which ran onto the computer next to it. I rebuilt half of the computer and upgraded it some, yet I still need to upgrade the GPU, RAM, and one of my two monitors. Based on my current CPU, a 6800 or 3080 would be a good choice for me. While I do like the 4090's frame rates on 1440p displays, my rebuilt computer would bottleneck it severely, and my wallet tells me to keep dreaming.