News Nvidia defends RTX 5050 desktop GDDR6 decision — says power-efficient GDDR7 is a better choice for laptops

A nice idea for an article would be "Nvidia defends the indefensible", recapping some of the most questionable choices they've made in the last few years and how they justified them publicly.

If they really wanted to go with GDDR6, I would have liked this better if they had used a 192-bit interface with 12 GB of VRAM, like the old 3060, but I guess that would have risked really showing why the 5060 8GB is so terrible.
 
[Chart: GeForce RTX 5050 desktop GPU performance]

[Chart: GeForce RTX 5050 laptop GPU performance]

There's no benefit to using GDDR6 over GDDR7 on a desktop card.
The laptop 5050 uses 24 Gbps GDDR7.
There is a significant power limit on the laptop version, though: 80 W vs. the desktop's 130 W maximum.
Looking at the numbers Nvidia posted, I'm not sure that 50 W gap is enough to explain the performance difference between the desktop and laptop versions.
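For rough context: peak memory bandwidth is just per-pin data rate times bus width. A quick sketch of the desktop-vs-laptop memory gap, assuming both variants keep a 128-bit bus (the 20 Gbps GDDR6 figure follows from Nvidia's own 320 GB/s spec; 24 Gbps is the laptop figure above):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
# Assumes both RTX 5050 variants use a 128-bit bus.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

desktop = bandwidth_gb_s(20, 128)  # GDDR6 @ 20 Gbps -> 320 GB/s (matches Nvidia's table)
laptop = bandwidth_gb_s(24, 128)   # GDDR7 @ 24 Gbps -> 384 GB/s

print(f"desktop: {desktop:.0f} GB/s, laptop: {laptop:.0f} GB/s, "
      f"laptop advantage: {100 * (laptop / desktop - 1):.0f}%")
```

So the laptop part gets roughly 20% more bandwidth on paper while the desktop part gets roughly 60% more power budget, and the two effects pull in opposite directions, which is why the posted numbers are hard to untangle.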

So I'm going to assume Nvidia used GDDR6 on the desktop 5050 as a cost-cutting measure and pocketed the savings.

Also, I fully expect Nvidia to pull classic Nvidia shenanigans and offer a desktop 6GB GDDR7 version, a laptop GDDR6 version, and maybe even a laptop 4GB version, with no labels to indicate they're different from the 8GB version.
If they decide to offer a 12GB (96-bit) version, it'll be called the 5050 Ti.
 
A nice idea for an article would be "Nvidia defends the indefensible", recapping some of the most questionable choices they've made in the last few years and how they justified them publicly.

If they really wanted to go with GDDR6, I would have liked this better if they had used a 192-bit interface with 12 GB of VRAM, like the old 3060, but I guess that would have risked really showing why the 5060 8GB is so terrible.
They're using an even smaller die in this one: GB207, where the 5060 uses GB206. Fat chance of 192-bit. 12 GB or 9 GB on 96-bit would be pretty funny, though.

In fact, 9 GB isn't so unlikely, since faster GDDR7 could balance out the narrower memory bus, and from the laptop version we can see the chip is designed to take either GDDR6 or GDDR7. Wow.
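To put numbers on those bus-width options: each 32-bit channel carries one memory module, so capacity is channel count times module density, doubled in a clamshell layout. A minimal sketch, assuming the standard 2 GB and (GDDR7-only) 3 GB module densities:

```python
# VRAM capacity = (bus width / 32 bits per channel) * module density (GB),
# doubled when modules are mounted clamshell (two per channel).

def vram_gb(bus_width_bits: int, module_gb: int, clamshell: bool = False) -> int:
    channels = bus_width_bits // 32
    return channels * module_gb * (2 if clamshell else 1)

print(vram_gb(96, 3))                  # 9 GB: 96-bit with 3 GB GDDR7 modules
print(vram_gb(96, 2, clamshell=True))  # 12 GB: 96-bit clamshell with 2 GB modules
print(vram_gb(128, 2))                 # 8 GB: the shipping 128-bit configuration
```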

There is nothing wrong with what they did to cut costs in the 5050. Only the price is wrong: it should be $200 MSRP instead of $250.
 
The card needs to be under $250, more like $180.
The 5060 is 20% more money at $300 for +50% CUDA cores, but probably a smaller performance gain; Hardware Unboxed guessed +30%.

So this card would have better price/perf at $200, and maybe as high as $225. No need to go to $180, or unrealistic demands like $150. The $250 MSRP is awful though.
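Back-of-the-envelope on where the 5050 would match the 5060 on perf per dollar, taking Hardware Unboxed's +30% guess at face value:

```python
# Break-even: the 5050 matches the 5060's perf/$ when
# price_5050 = price_5060 / relative_5060_performance.

price_5060 = 300
perf_gain_5060 = 0.30  # Hardware Unboxed's guessed 5060 advantage over the 5050

break_even = price_5060 / (1 + perf_gain_5060)
print(f"5050 matches 5060 perf/$ at ${break_even:.0f}")  # ~$231
for price in (250, 225, 200):
    ratio = break_even / price
    print(f"at ${price}: {100 * (ratio - 1):+.0f}% perf/$ vs the 5060")
```

That works out to roughly -8% at $250, +3% at $225, and +15% at $200, which is why $250 looks awful and $200-$225 looks defensible.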

The more pressing concern is getting a 12 GB version of the 5060.
 
If the chip is not memory-bandwidth-limited, then GDDR6 is a pure cost saving with no performance penalty in actual applications (as opposed to memory bandwidth benchmarks). Why GDDR7 for mobile? Power savings, and it takes less of the limited thermal budget, leaving more for the CPU die; the laptop part will be slower than a non-thermally-limited desktop 5050 anyway.
 
Nvidia is only thinking about buying up tons of GDDR6 to keep AMD starved for memory.
A simple 5050 will eat a big chunk of AMD's market share, and will push AMD's products to higher prices, making them less attractive...

The worst Nvidia product could literally destroy AMD's entire market.
 
I'm really psyching myself up to criticise the 5050. It'll be a nice change, because dumping on the 3050 has started to get a bit tiresome.

If Nvidia cares at all about avoiding my scathing disapproval (and why wouldn't they care about the valuable consumers who helped them become the monstrous corporation they are today?), then the price needs to be what other people have suggested: $200, but ideally less.
 
I feel like it's pretty obvious: a GPU of this tier doesn't benefit from the extra memory bandwidth GDDR7 provides, and GDDR7 is a lot more expensive too. For the mobile version, GDDR7 makes sense, since everything in laptops is horribly overpriced anyway; the extra cost won't get noticed as much as the power savings from more efficient memory will.
 
Can't believe people are complaining about the literal lowest-tier Nvidia card being engineered for cost savings.
If you want a better card, then pay up!

The GeForce 5060 uses GDDR7, if that's what we're after.

Spec                  RTX 5090     RTX 5080     RTX 5070 Ti  RTX 5070     RTX 5060 Ti      RTX 5060    RTX 5050
NVIDIA Architecture   Blackwell    Blackwell    Blackwell    Blackwell    Blackwell        Blackwell   Blackwell
DLSS                  DLSS 4       DLSS 4       DLSS 4       DLSS 4       DLSS 4           DLSS 4      DLSS 4
AI TOPS               3352         1801         1406         988          759              614         421
Tensor Cores          5th Gen      5th Gen      5th Gen      5th Gen      5th Gen          5th Gen     5th Gen
Ray Tracing Cores     4th Gen      4th Gen      4th Gen      4th Gen      4th Gen          4th Gen     4th Gen
NVENC                 3x 9th Gen   2x 9th Gen   2x 9th Gen   1x 9th Gen   1x 9th Gen       1x 9th Gen  1x 9th Gen
NVDEC                 2x 6th Gen   2x 6th Gen   1x 6th Gen   1x 6th Gen   1x 6th Gen       1x 6th Gen  1x 6th Gen
Memory                32 GB GDDR7  16 GB GDDR7  16 GB GDDR7  12 GB GDDR7  16 / 8 GB GDDR7  8 GB GDDR7  8 GB GDDR6
Memory Bandwidth      1792 GB/s    960 GB/s     896 GB/s     672 GB/s     448 GB/s         448 GB/s    320 GB/s
 
The simple fact of the matter is that giving the desktop GDDR7 would've increased performance by a couple of percentage points at minimum, to over 5% in some cases. The bottom line is they gimped the desktop model to save a couple of bucks without even passing the savings along to the customer.
 
A nice idea for an article would be "Nvidia defends the indefensible", recapping some of the most questionable choices they've made in the last few years and how they justified them publicly.

If they really wanted to go with GDDR6, I would have liked this better if they had used a 192-bit interface with 12 GB of VRAM, like the old 3060, but I guess that would have risked really showing why the 5060 8GB is so terrible.
What!? They don't even use a 192-bit interface on the 5060 series. Saving money by going with GDDR6 only to increase the bus width and use more memory modules doesn't make sense. The desktop 5050 is just a low-end, budget GPU, plain and simple. Hardware enthusiasts might see it as trash ("waste of sand", per Steve at Gamers Nexus, for example), but more gaming GPU choices for consumers is a good thing, regardless of all of the things we can ding Nvidia for.

Limited GDDR7 availability is probably one of the main reasons that virtually all of the RTX 50-series cards have had limited supply at launch. GDDR7 makes for a better product, but it appears that AMD didn't want to take that risk on supply and therefore went with GDDR6 on RDNA 4, the same memory it has used going all the way back to RDNA 2. No one has a crystal ball.

It's not indefensible, but generically saying "GDDR6 is the best choice for the desktop 5050" leaves out the explanation of why it's the best choice... I think if we're all honest, we know it's cost, and probably also supply availability, as this article mentioned. The RTX Pro 6000 is now commercially available and is gobbling up 96 GB of GDDR7 per GPU, so that's only pulling supply down further. On the upside, 2nd-gen GDDR7 is coming later this year, which I assume will bring additional manufacturing capacity online on top of existing 1st-gen GDDR7.
 
Nvidia is only thinking about buying up tons of GDDR6 to keep AMD starved for memory.
A simple 5050 will eat a big chunk of AMD's market share, and will push AMD's products to higher prices, making them less attractive...

The worst Nvidia product could literally destroy AMD's entire market.
I don't know if it's that apocalyptic for AMD, but I choose to believe it anyway since Jensen would approve.
 
Can't believe people are complaining about the literal lowest-tier Nvidia card being engineered for cost savings.
If you want a better card, then pay up!

The GeForce 5060 uses GDDR7, if that's what we're after.

You're missing the point, which was:
The laptop 5050 gets GDDR7.
The desktop 5050 gets GDDR6.

Also, the difference between the 5050 and the 5060 is not only the GDDR type. The 5060 gets 50% more shaders/TMUs/ROPs, making the 5050 an even worse value proposition: it's 2/3 the card for 5/6 the price.
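The "2/3 the card for 5/6 the price" framing checks out if you plug in the CUDA core counts from Nvidia's listings (2,560 vs. 3,840, consistent with the +50% figure earlier in the thread) and the MSRPs:

```python
# "2/3 the card for 5/6 the price", from Nvidia's listed CUDA core
# counts and the $250 / $300 MSRPs quoted earlier in the thread.
from fractions import Fraction

cores_5050, cores_5060 = 2560, 3840
msrp_5050, msrp_5060 = 250, 300

print(Fraction(cores_5050, cores_5060))  # 2/3 of the shaders
print(Fraction(msrp_5050, msrp_5060))    # 5/6 of the price
```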
 
Nvidia is only thinking about buying up tons of GDDR6 to keep AMD starved for memory.
A simple 5050 will eat a big chunk of AMD's market share, and will push AMD's products to higher prices, making them less attractive...

The worst Nvidia product could literally destroy AMD's entire market.
Sad to see you still haven't changed dealers or talked to a therapist.
 
While I generally like GN for testing, like everyone else in the enthusiast community, their opinions are slanted towards power users and gamers. The 5050 is just the 50-series version of those older xx30/xx40 cards, which are almost always sold to OEMs. I imagine a bunch of these will end up in Dell desktops and the like.

I doubt there would be many complaints if the price were a bit lower and it didn't need external power, like past xx50 cards. No PCIe power connector is what made cards like the GTX 1050 Ti so popular back in the day.

Also, it would help if Nvidia stopped misleading consumers with fake frames. That was one of Steve's big complaints, along with the shady way Nvidia handled the 5060 reviews.
 
The VRAM hubbub aside (which, as usual, amounts to empty rants, since nobody here would buy a 5050 anyway), an interesting tidbit is that Nvidia has started listing AI TOPS specs for its GPUs (5060 = 614, 5050 = 421, 4060 = 242, 2060 = 52).

The implication is that any PC with a dGPU will pass MS' "AI PC" requirement of 45 TOPS, and Nova Lake in '26 will have an NPU with 75 TOPS. I assume AMD will have something similar. So come '26, all PCs will be "AI PC" by default. I expect MS will get the AI stuff out of "public beta" and into Win 12 by then.
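Taking those listed figures at face value, every card in that list clears the 45 TOPS bar, most by an order of magnitude; a trivial check:

```python
# Nvidia's listed AI TOPS vs. MS' 45-TOPS "AI PC" requirement
# (figures as quoted above).
AI_PC_TOPS = 45
gpus = {"RTX 5060": 614, "RTX 5050": 421, "RTX 4060": 242, "RTX 2060": 52}

for name, tops in gpus.items():
    print(f"{name}: {tops} TOPS -> {tops / AI_PC_TOPS:.1f}x the requirement")
```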
 
You're missing the point, which was:
The laptop 5050 gets GDDR7.
The desktop 5050 gets GDDR6.

Also, the difference between the 5050 and the 5060 is not only the GDDR type. The 5060 gets 50% more shaders/TMUs/ROPs, making the 5050 an even worse value proposition: it's 2/3 the card for 5/6 the price.
I'm not saying the GeForce 5050 is a good deal, only that it was engineered to hit the cheapest possible price.
Like I said before, if you want a better card, then you've got to pay more.
The 5060 Ti is the lowest I'd use in a gaming PC.
 
