News AMD rumored to use 'slow' 18Gbps GDDR6 in RDNA 4 — leaker says Team Red's next generation GPUs won't use GDDR7

Notton

Commendable
Dec 29, 2023
725
631
1,260
I think this rumor is based on a story back in February that only Nvidia is buying large quantities of GDDR7.
AMD, on the other hand, has been ordering tons of 20 Gbps GDDR6.

IDK if anything has changed since then, but it seems kind of late to start ordering GDDR7 for a Q3 launch window.
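For reference, the raw numbers are easy to work out: peak bandwidth is the per-pin data rate times the bus width divided by 8. A minimal sketch in Python, assuming a 256-bit bus purely for illustration (not a confirmed RDNA 4 spec):

# Peak memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
# The 256-bit bus is an assumed example, not a confirmed RDNA 4 configuration.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

for rate in (18, 20):  # the two GDDR6 speeds mentioned in the rumor
    print(f"{rate} Gbps x 256-bit: {peak_bandwidth_gbs(rate, 256):.0f} GB/s")
# -> 18 Gbps gives 576 GB/s, 20 Gbps gives 640 GB/s

So the gap between the two rumored GDDR6 speeds is only about 11% on the same bus.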
 
  • Like
Reactions: artk2219

usertests

Distinguished
Mar 8, 2013
827
756
19,760
If there's no performance increase, there's probably little need for a bandwidth increase. And new microarchitectures (such as RDNA4) could require less bandwidth for the same results.

There's also the Infinity Cache to consider, which regressed in capacity in some cases between RDNA 2 and RDNA 3 but became a lot faster: ~1,800 GB/s of "effective memory bandwidth" for the 6950 XT versus ~3,500 GB/s for the 7900 XTX:

https://en.wikipedia.org/wiki/RDNA_2#Desktop
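To see why a large on-die cache can matter more than raw VRAM speed, here's a minimal sketch of the usual hit-rate model; the cache bandwidth and hit rates below are illustrative assumptions, not AMD's published figures:

# Effective bandwidth under a simple cache hit-rate model:
# effective = hit_rate * cache_bandwidth + (1 - hit_rate) * vram_bandwidth
# All numbers are illustrative assumptions, not AMD specs.
def effective_bandwidth(hit_rate: float, cache_gbs: float, vram_gbs: float) -> float:
    return hit_rate * cache_gbs + (1 - hit_rate) * vram_gbs

# Hypothetical card: 576 GB/s of VRAM bandwidth (18 Gbps GDDR6 on a 256-bit bus)
# paired with a 5,000 GB/s on-die cache.
for hit_rate in (0.3, 0.5, 0.7):
    print(f"hit rate {hit_rate:.0%}: ~{effective_bandwidth(hit_rate, 5000, 576):.0f} GB/s effective")

Even a modest hit rate pushes the effective figure well past what a straight VRAM speed bump alone would buy.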
 
  • Like
Reactions: artk2219

ezst036

Honorable
Oct 5, 2018
698
593
12,420
It seems kind of clear that AMD isn't aiming for the ultra high end, so the rumor might be legit. They have tied themselves to the "small die" strategy for a little too long. (See the Radeon 9700 section here for what I mean: Nvidia is seemingly always the one with the bigger die, and that simply means more shader units and so forth to work with.)

GDDR7 would also make for a pretty good refresh 6 months after launch.
 
Last edited:
  • Like
Reactions: artk2219
I think AMD is smartly exploiting the prevailing opinions in the gaming community.

Obviously people are forking out $2,000 for 4090s, so it's not too high of a price, but social media is filled with people ranting about how "high end used to be $450" or "back in my day you could buy a Voodoo for a nickel" (tongue in cheek on the latter example).

So, there will be 2 mindsets about RDNA4. 1: AMD can’t compete with Nvidia so they are pivoting to being a bargain bin supplier, or 2: AMD is listening to the market and customers.

I think if they hit it out of the ballpark, mindset #1 would simply sound pedantic. By "ballpark" I mean AMD releases a 7900 XTX-grade card with 24 GB, improved raster, and improved ray tracing for $599, has drivers optimized before launch, and releases the competitive AI-enhanced FSR they have been teasing. If this happens, there's not much to complain about: hardware "obsession-ists" (much more accurate than "enthusiasts," and yes, I am calling myself obsessed) get their halo card and pay their "flex tax," while AMD puts out cost-effective "every-man" cards and builds market share.

We will see how it goes…
 

DavidLejdar

Respectable
Sep 11, 2022
268
158
1,860
Per the Steam Hardware Survey for March 2024, 58.45% have 1080p as their primary display resolution, while 3.44% have 4K.

Based on that, one could argue that there isn't really that much market demand for high-end GPUs when mid-range GPUs can deliver plenty of FPS below 4K - and that that is where the market is these days.

I mean, at least for myself, I might upgrade my GPU in the near future (partially depending on whether AMD stock rebounds at least a little bit, harhar). But that will go hand in hand with upgrading to 4K and a VR headset. If it weren't for that, my older GPU would still get me enough FPS at 1440p even in new titles (albeit perhaps not always with every setting at ultra).
 

Trake_17

Distinguished
Jul 9, 2011
14
13
18,525
I think AMD is smartly exploiting the prevailing opinions in the gaming community.

Obviously people are forking out $2,000 for 4090s, so it's not too high of a price, but social media is filled with people ranting about how "high end used to be $450" or "back in my day you could buy a Voodoo for a nickel" (tongue in cheek on the latter example).

So, there will be 2 mindsets about RDNA4. 1: AMD can’t compete with Nvidia so they are pivoting to being a bargain bin supplier, or 2: AMD is listening to the market and customers.

I think if they hit it out of the ballpark, mindset #1 would simply sound pedantic. By "ballpark" I mean AMD releases a 7900 XTX-grade card with 24 GB, improved raster, and improved ray tracing for $599, has drivers optimized before launch, and releases the competitive AI-enhanced FSR they have been teasing. If this happens, there's not much to complain about: hardware "obsession-ists" (much more accurate than "enthusiasts," and yes, I am calling myself obsessed) get their halo card and pay their "flex tax," while AMD puts out cost-effective "every-man" cards and builds market share.

We will see how it goes…
"People" are spending $2k for a 4090? Steam surveys say less than 1% of gamers have a 4090, and just 6% of anyone who has bought a current Gen graphics card lately has one. Yeah, "people" are, just not that many. I don't think 4090 sales aren't dictating the market. You could argue the reputation bonus the crown earns is good marketing for all of the other cards, but there's still a substantial market for people unwilling to spend $2k on a card and plenty of room for AMD to compete on cost, which it will be better able to leverage with cheaper RAM, especially if the faster RAM isn't bottlenecking mid-range GPU performance. I mean this strategy, along with diversifying into graphics cards at all, is what helped it survive Intel and look how well that's going lately. Calling it bargain bin sounds pejorative, it's a very viable strategy full excellent products. Especially when people have to start rationing their power as the data centers start hogging the grid to feed their AI. No one will be interested in 6-700 watt GPUs for their computer. Ok maybe that's too far afield, but the rest is sound.
 
Last edited:
  • Like
Reactions: artk2219

Trake_17

Distinguished
Jul 9, 2011
14
13
18,525
DavidLejdar said:
Per the Steam Hardware Survey for March 2024, 58.45% have 1080p as their primary display resolution, while 3.44% have 4K.

Based on that, one could argue that there isn't really that much market demand for high-end GPUs when mid-range GPUs can deliver plenty of FPS below 4K - and that that is where the market is these days.

I mean, at least for myself, I might upgrade my GPU in the near future (partially depending on whether AMD stock rebounds at least a little bit, harhar). But that will go hand in hand with upgrading to 4K and a VR headset. If it weren't for that, my older GPU would still get me enough FPS at 1440p even in new titles (albeit perhaps not always with every setting at ultra).
Stick with a 1440 ultrawide
 
  • Like
Reactions: artk2219

Joseph_138

Distinguished
They could be trying to find ways to keep prices down to hurt Nvidia's sales. Hitting the market before Nvidia, with prices that are lower than previous-gen cards, is going to leave Nvidia in a tough position if it invested too much in its next-generation products to be able to sell them at reasonable prices.
 
  • Like
Reactions: ohio_buckeye
Agreed. They may know their target is primarily the middle of the market. When you can make cards like the 7900 XTX that compete with a 4080 using the older memory, then it's likely fast enough, and they can always buy enough GDDR7 if they decide to launch a higher-end product.
 

hannibal

Distinguished
If the fastest new AMD GPU will be an 8700 XT, with an 8600 below it... then there's no need to push memory speeds at all.
That's based on early rumours...

The next high-end card, at the end of 2025 or in 2026, will use GDDR7. If they go back to the high end at all.
 

35below0

Respectable
Jan 3, 2024
1,727
743
2,090
Joseph_138 said:
They could be trying to find ways to keep prices down to hurt Nvidia's sales. Hitting the market before Nvidia, with prices that are lower than previous-gen cards, is going to leave Nvidia in a tough position if it invested too much in its next-generation products to be able to sell them at reasonable prices.
I don't think Nvidia could be left in a tough position if it was left in a trash compactor for a week.
 
  • Like
Reactions: valthuer
"People" are spending $2k for a 4090? Steam surveys say less than 1% of gamers have a 4090, and just 6% of anyone who has bought a current Gen graphics card lately has one. Yeah, "people" are, just not that many. I don't think 4090 sales aren't dictating the market. You could argue the reputation bonus the crown earns is good marketing for all of the other cards, but there's still a substantial market for people unwilling to spend $2k on a card and plenty of room for AMD to compete on cost, which it will be better able to leverage with cheaper RAM, especially if the faster RAM isn't bottlenecking mid-range GPU performance. I mean this strategy, along with diversifying into graphics cards at all, is what helped it survive Intel and look how well that's going lately. Calling it bargain bin sounds pejorative, it's a very viable strategy full excellent products. Especially when people have to start rationing their power as the data centers start hogging the grid to feed their AI. No one will be interested in 6-700 watt GPUs for their computer. Ok maybe that's too far afield, but the rest is sound.
Calm down… when did "I" call AMD bargain bin? Huh? Or was that me "predicting" what AMD haters would say? How about reading my comment in its entirety and reflecting upon its message, and then judging your competence in reading comprehension, before issuing such an off-the-mark comment again.
 

vanadiel007

Distinguished
Oct 21, 2015
355
348
19,060
In all honesty, how much can you improve raw frame rates before diminishing returns set in?

Let's face it, any modern high-end GPU from either Nvidia or AMD can spit out enough frames that extra ones start to make little sense. Especially with DLSS or FSR, frame rates are already very high, and OLED monitors are more or less maxed out in terms of pixel response time.

Is there even a point in bringing out even more powerful GPUs in 2024?
 
Apr 25, 2024
36
18
35
Joseph_138 said:
They could be trying to find ways to keep prices down to hurt Nvidia's sales. Hitting the market before Nvidia, with prices that are lower than previous-gen cards, is going to leave Nvidia in a tough position if it invested too much in its next-generation products to be able to sell them at reasonable prices.

Not really. Both RDNA 3 and Ada are extremely overpriced with massive margins, especially the top-end SKUs. AMD, with a ~300 mm² GCD, priced their cards close to Nvidia's, and both played a price-anchoring scheme to raise MSRPs permanently to pandemic/mining-craze levels, which is what we got in 2022. RDNA 4 will be what RDNA 3 should have been all along: a "normally" priced generation like those before it, with small dies on an already mainstream node (N4). But I agree with you somewhat: if AMD manages to keep or slightly improve performance in mainstream $200-600 cards, they could put Nvidia in a, not tough, but awkward position, since it "should" force Nvidia to lower prices versus Ada, undermining consumer confidence in Nvidia's pricing, while Nvidia keeps the performance crown and prices the top SKU (or two top SKUs) like a Titan.
 

Eximo

Titan
Ambassador
In all honesty, how much can you improve raw frame rates before diminishing returns settle in?

Let's face it, any modern high end GPU from either Nvidia or AMD can spit out enough frames to a point where extra is starting to make little sense. Especially with DLSS or FSR, frame rates are already very high and OLED monitors are more or less maxed out in terms of pixel response time.

Is there even a point in bringing out even more powerful GPU's in 2024?

You can say that about any gaming era. Increasing visual detail and fidelity is the name of the game. FPS and higher resolutions are a side effect of increased performance.

DLSS, FSR, and XeSS add latency and visual artifacts. If you reach a point where you don't need these techniques, then that is one reason to keep going. Ray tracing is another. All the fancy workarounds that most game engines use take development time to implement. Ray tracing is actually the simpler technology; it just happens to be very resource-intensive for 'real-time' rendering. So improvements there are also welcome for the future.

Not sure how pixel response time factors in. As a part of overall latency, sure, but there is still all the latency to consider up to that point, which includes the monitor's scaler, compression/decompression, etc.

Still room for improvement, and they will work on it. The benefit is that those technologies eventually end up in the more mainstream products.
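To put that in perspective, here's a rough back-of-the-envelope latency budget in Python; every figure is an illustrative assumption, not a measurement:

# Rough end-to-end latency budget in milliseconds.
# All values are illustrative assumptions; real numbers vary by game, GPU, and display.
latency_ms = {
    "input sampling": 2.0,
    "render time (one frame at 144 FPS)": 1000 / 144,
    "monitor scaler / processing": 2.0,
    "pixel response (OLED)": 0.1,
}
for stage, ms in latency_ms.items():
    print(f"{stage}: {ms:.1f} ms")
print(f"total: ~{sum(latency_ms.values()):.1f} ms")
# Pixel response is a tiny slice; the frame time dominates.

Even with an effectively instant panel, the render and processing stages dominate the chain, which is why faster GPUs still move the needle.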
 
Jan 14, 2024
94
22
35
vanadiel007 said:
In all honesty, how much can you improve raw frame rates before diminishing returns set in?

Let's face it, any modern high-end GPU from either Nvidia or AMD can spit out enough frames that extra ones start to make little sense. Especially with DLSS or FSR, frame rates are already very high, and OLED monitors are more or less maxed out in terms of pixel response time.

Is there even a point in bringing out even more powerful GPUs in 2024?
Yes, there is, and it's the holy grail of gaming: 8K.
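For a sense of scale, a quick pixel-count comparison (plain arithmetic, nothing vendor-specific):

# Pixels per frame at common resolutions; shading work scales roughly with pixel count.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP ({px / base:.1f}x 1080p)")
# 8K is 4x the pixels of 4K and 16x those of 1080p.

So native 8K demands roughly four times the shading work of 4K before anything else changes.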
 

Silas Sanchez

Proper
Feb 2, 2024
111
64
160
The golden rule in PC gaming is that you can never possibly have enough graphics horsepower. Look at path tracing: it kills the 4090, and yet makes Portal look like a completely new game. There is so much room for improvement.

To this day I have never seen any game with realistic-looking flashlights or car headlights. The way the light interacts with the environment is really lacking. No game I have ever seen can mimic the high CRI of LEDs, or the high CRI of 6,000 K xenon short-arc lighting, which has a gorgeous look to it. Fort Solis kind of had a realistic headlamp color and pattern, and AW2 had half-decent flashlights, but nothing like real life. The new GT game had really mediocre headlights; no racing game captures the true details of night racing with rain and the elements.

I know nothing about how games are made, but they just don't look much better than what we had years ago. Crysis had some of the most stone-cold detailed night lighting in the jungle under moonlight, and games haven't really improved that much since then. Looking at 4K on PC and console, the textures look pretty bland; DOOM 3 with its HQ textures and modded texture packs gives many modern games a run for their money.
 

vanadiel007

Distinguished
Oct 21, 2015
355
348
19,060
Eximo said:
You can say that about any gaming era. Increasing visual detail and fidelity is the name of the game. FPS and higher resolutions are a side effect of increased performance.

DLSS, FSR, and XeSS add latency and visual artifacts. If you reach a point where you don't need these techniques, then that is one reason to keep going. Ray tracing is another. All the fancy workarounds that most game engines use take development time to implement. Ray tracing is actually the simpler technology; it just happens to be very resource-intensive for 'real-time' rendering. So improvements there are also welcome for the future.

Not sure how pixel response time factors in. As a part of overall latency, sure, but there is still all the latency to consider up to that point, which includes the monitor's scaler, compression/decompression, etc.

Still room for improvement, and they will work on it. The benefit is that those technologies eventually end up in the more mainstream products.

Improvement in what? I have an OLED setup and a 7900 XTX. Sure, I can game at 300+ FPS, but I am perfectly happy with 144, as it's super smooth, highly detailed, and barely makes my system work for it.

Fancy graphics don't make for a good game.