News Alleged GDDR6X shortage could briefly hinder Nvidia gaming GPU supply starting August — affected models include RTX 4070 through RTX 4090

Status
Not open for further replies.
Honestly, I think GDDR6X is pretty overrated, because Radeons just use GDDR6 and nobody has noticed any meaningful performance deficit. The only advantage I've seen from GDDR6X is lower power draw at the same clock speeds; it hasn't really translated into any real performance gains.

There's a Tom's Hardware article from last year that chronicles just how meaningless the difference between GDDR6 and GDDR6X is. The article's analysis is based on performance differences between an RTX 3060 Ti with GDDR6 and an RTX 3060 Ti with GDDR6X:

Nvidia RTX 3060 Ti Gets Few Benefits from GDDR6X, Says Reviewer
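For context, the raw bandwidth gap on paper is actually pretty big. A quick back-of-the-envelope, assuming the commonly listed 14 Gbps (GDDR6) and 19 Gbps (GDDR6X) data rates on the 3060 Ti's 256-bit bus (figures from public spec sheets, not from the article itself):

```python
# Rough peak-bandwidth arithmetic for the two RTX 3060 Ti variants.
# Assumed data rates: 14 Gbps for the GDDR6 model, 19 Gbps for GDDR6X,
# both on a 256-bit bus (public spec-sheet numbers, not measured here).

def peak_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width in bytes."""
    return data_rate_gbps * bus_width_bits / 8

gddr6 = peak_bandwidth_gbps(14, 256)    # 448.0 GB/s
gddr6x = peak_bandwidth_gbps(19, 256)   # 608.0 GB/s
print(f"GDDR6:  {gddr6:.0f} GB/s")
print(f"GDDR6X: {gddr6x:.0f} GB/s (+{(gddr6x / gddr6 - 1) * 100:.0f}%)")
```

That's roughly 36% more raw bandwidth on paper, which makes the small real-world gains in the linked article all the more telling.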

 

I think GDDR6X's performance advantage is much more relevant at the high end, largely, as you mentioned, through better performance-per-watt; the RTX 4070 through 4090 do offer slightly better PPW than their Radeon counterparts. That will probably remain true moving to GDDR7, otherwise RDNA4 wouldn't be holding back the way it's rumored to be, even without a powerhouse high-end model.
 
AMD is throwing more and more cache at the problem to counter the lower bandwidth of its memory.
Nvidia has faster GDDR and better power efficiency.
And 4 or 5% more performance just from a memory change is good enough. If GDDR7 can achieve 20% like the rumors suggest, that will be another level of graphics.
It's basically free performance without increasing the already-melting power draw.
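The cache-versus-bandwidth trade can be sketched with a toy effective-bandwidth model: a large last-level cache (the Infinity Cache approach) serves some fraction of memory traffic at cache speed instead of DRAM speed. All the numbers below (DRAM rates, cache bandwidth, hit rate) are made-up illustrative values, and the linear blend is a deliberate simplification:

```python
# Toy model: a big on-die cache lets a fraction of memory accesses be served
# at cache bandwidth rather than DRAM bandwidth. All numbers are illustrative
# assumptions, not measured figures for any real card.

def effective_bandwidth(dram_gbps: float, cache_gbps: float, hit_rate: float) -> float:
    """Average bandwidth seen by the GPU when `hit_rate` of accesses hit cache."""
    return hit_rate * cache_gbps + (1 - hit_rate) * dram_gbps

# Hypothetical GDDR6 card with a big cache vs GDDR6X card with none:
with_cache = effective_bandwidth(dram_gbps=576, cache_gbps=2000, hit_rate=0.5)
no_cache = effective_bandwidth(dram_gbps=720, cache_gbps=0, hit_rate=0.0)
print(f"big cache + slower DRAM: {with_cache:.0f} GB/s effective")
print(f"no cache  + faster DRAM: {no_cache:.0f} GB/s effective")
```

Which is why slower GDDR6 plus a big cache can come out ahead of faster GDDR6X on average, as long as the hit rate holds up (it falls off at higher resolutions, where the working set outgrows the cache).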
 
How many GPUs does nVidia™ sell in China? All we hear are stories about nVidia's™ GPU supply in China. Is China their biggest market or something? I don't get it.
 
Radeon doesn't make a real powerhouse model, mostly because they know that people with that kind of money to spend on a video card all want GeForce, so the rather steep cost incurred in creating said card would be a complete waste.
 
"Shortages" every time they want to scalp and create artificial scarcity. Just use HBM chiplets on-package. Way higher bandwidth. It's time for 1024-bit GPU memory.
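For scale, here's a rough peak-bandwidth comparison, assuming a 384-bit GDDR6X bus at 21 Gbps (4090-class) versus 1024-bit HBM3 stacks at 6.4 Gbps per pin; both rates come from public spec sheets, not this thread:

```python
# Back-of-the-envelope peak bandwidth: per-pin data rate times bus width in bytes.
# Assumed rates: 21 Gbps GDDR6X on a 384-bit bus, 6.4 Gbps HBM3 at 1024 bits
# per stack (spec-sheet figures, not measurements).

def peak_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

gddr6x_384bit = peak_bandwidth_gbps(21, 384)      # 1008.0 GB/s
hbm3_one_stack = peak_bandwidth_gbps(6.4, 1024)   # 819.2 GB/s per stack
hbm3_four_stacks = 4 * hbm3_one_stack             # 3276.8 GB/s with four stacks
print(gddr6x_384bit, hbm3_one_stack, hbm3_four_stacks)
```

So a single 1024-bit stack doesn't actually beat a top-end GDDR6X bus on its own; the big bandwidth win comes from putting several stacks on the package, which is part of why HBM has stayed in the data-center price bracket.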
 
Totally not true. If the 4090 were AMD's, it would sell like hotcakes.
 
What? The wind shifted direction and is now blowing slightly more to the southwest? Well, that's going to mean an immediate price hike due to the impact that will have on the BoM for all our cards. Sorry, hopefully supply levels will be back to normal by the 2nd quarter of pay-me-now.
 