News: When Will AMD Launch the RX 7700 and RX 7800?

I'll offer another theory: they could be stocking up the mid-range chips for FirePro cards with a lot of VRAM in order to "promote" AMD with the AI crowd. Seeing how their latest drivers specifically called out AI workload improvements in Stable Diffusion and the like, I'd imagine it's not a far-fetched theory. This also includes the big strides with ROCm.

They would still sell those for more money and, probably, sell everything to the Prosumer and Professional markets before their CDNA cards are ready for workstations (IIRC).

Do I like this? Generally speaking, not really. I still have my Vega64 plowing through games just fine and the 6900XT for VR, so I'm not really bothered by current and, possibly, next gen, but I do mind "the trends" I guess.

Anyway, we'll see how this one pans out rather soon!

Regards.
 

abufrejoval

Reputable
Jun 19, 2020
409
278
5,060
I'd have said "CUDA pays the rent, gaming is a welcome windfall" until very recently.

Now I'd have to slightly modify that to PyTorch instead of CUDA.

But it doesn't actually change the results, because PyTorch on AMD isn't yet where it needs to be in order to be attractive.
And I really need 2-4x the VRAM on team red versus the equivalent Nvidia product to make the switch.
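
For anyone weighing that switch, a minimal sketch along these lines is a quick sanity check; it assumes a CUDA or ROCm build of PyTorch is installed, since the ROCm build reuses the torch.cuda namespace and sets torch.version.hip instead of a CUDA version.

```python
import torch

# Minimal sanity check: works on both CUDA and ROCm builds of PyTorch,
# because the ROCm build exposes AMD GPUs through the torch.cuda namespace.
if torch.cuda.is_available():
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM via {backend}")
else:
    print("No supported GPU backend detected")
```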

But market segmentation in AI is much more profound on AMD than it is currently with team green, so as ready as I am to change horses, there is nothing attractive around.

MI300 sounds nice but there is no affordable way to fit that into one of my workstations.
 
"The MCD sizes aren't exactly the same either, so we could easily be off on the Navi 32 GCD size by 5% or so"

IMO, since the Navi 32 chiplet design features a GCD in the center measuring around 200mm2 and each of the four MCDs measures around 37.5mm2, the total should add up to around 350mm2.

For comparison, the Navi 31 GCD measures 304.35mm2, and the whole chip with MCDs included has an area of 529.34mm2. So we are looking at a roughly 34% smaller die for the Navi 32 GPU.

Also, Navi 32 should feature up to 60 Compute Units, or 3840 Stream Processors, on a 256-bit memory bus interface.
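
For what it's worth, here's a minimal sketch of that arithmetic; the ~200mm2 GCD and ~37.5mm2 per-MCD figures are the estimates above, not official specs, while the Navi 31 total is the published measurement.

```python
# Estimated Navi 32 die area from the assumed chiplet sizes above.
navi32_gcd = 200.0                           # mm^2, assumed GCD size
navi32_mcd = 37.5                            # mm^2 per MCD, assumed
navi32_total = navi32_gcd + 4 * navi32_mcd   # ~350 mm^2

# Published Navi 31 figure for comparison (GCD plus six MCDs).
navi31_total = 529.34                        # mm^2
reduction = 1 - navi32_total / navi31_total  # ~0.34

print(f"Navi 32 estimate: {navi32_total:.0f} mm^2, "
      f"about {reduction:.0%} smaller than Navi 31")
```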
 

jackt

Distinguished
Feb 19, 2011
192
19
18,685
I knew that! They released the mid-level cards so late that now the next gen is close! And weeks before the RAM price drop!
 

MooseMuffin

Distinguished
Oct 31, 2006
156
17
18,695
My baseless guess is the additional complexity involved in the chiplet design means they can't hit the prices they need to hit to be competitive.
 

salgado18

Distinguished
Feb 12, 2007
947
398
19,370
I get your point, Jared. But RDNA3 has the same efficiency as RDNA2 (look at 7600 vs 6600 XT), with very few new features, same ray tracing. RTX 4000 improved on every feature, brought new ones, and even almost doubled efficiency. To me, Nvidia won this round. If the Radeon cards are the same, we might as well buy what we have today (be it 6- or 7-series), or wait for the next gen. The mid-range Radeons just don't matter much anymore.
 
Have to disagree with that, I want to see the RX 7800 16GB at $500-$599, and judging by the cliff GPU sales fell off of, I'm not the only one tired of these prices.
Disagree with what? Sounds like you just agreed with wanting to see “RDNA 3 upgrades in the $400–$600 range.”

Which we probably will, given the 7900 XT is selling at $780. Knock off 4GB of VRAM and use a smaller GCD, and a 7800 for $600 (maybe less) is certainly viable. A 7700 with 12GB will hopefully land at $400.
 

sherhi

Distinguished
Apr 17, 2015
80
52
18,610
I'll offer another theory: they could be stocking up the mid-range chips for FirePro cards with a lot of VRAM in order to "promote" AMD with the AI crowd. Seeing how their latest drivers specifically called out AI workload improvements in Stable Diffusion and the like, I'd imagine it's not a far-fetched theory. This also includes the big strides with ROCm.

They would still sell those for more money and, probably, sell everything to the Prosumer and Professional markets before their CDNA cards are ready for workstations (IIRC).

Do I like this? Generally speaking, not really. I still have my Vega64 plowing through games just fine and the 6900XT for VR, so I'm not really bothered by current and, possibly, next gen, but I do mind "the trends" I guess.

Anyway, we'll see how this one pans out rather soon!

Regards.
Could be the case, but how many times did we expect them to make some serious leap ahead of the competition, only to fail miserably? Ryzen 7000 was praised to oblivion, with headlines and clickbait YouTube titles screaming "Intel is in trouble"; the same thing happened with the 7000 GPU lineup and "Nvidia is in trouble"... Maybe they are waiting with full stock, and based on final performance and market reaction they will go one way or the other, and most likely, sadly, fail no matter which way they go (if you don't know where you are sailing, no wind is favourable).
I knew that! They released the mid-level cards so late that now the next gen is close! And weeks before the RAM price drop!
In my country those cards offered interesting value; the 6700 was often cheaper than the 6600 XT or even the 6650 XT, which itself was sometimes cheaper than the 6600 XT... but does anyone know why they are doing a mid-gen refresh while still producing the regular models? Or has there been a mountain of regular models in stock up to this day? If not, how is it possible to have a refreshed model (for example, the 6650 XT from May 2022) still mixed up with the regular model (the 6600 XT from August 2021) even a year after its release?
My baseless guess is the additional complexity involved in the chiplet design means they can't hit the prices they need to hit to be competitive.
Maybe it has nothing to do with complexity, but rather with finding out their architecture is not really competitive this generation. Combined with plenty of old cards available almost everywhere and Nvidia doing Nvidia things, they just play a waiting game, keeping old cards priced higher and earning what's left to earn. Why would they push new cards out, cannibalise their own products, and god forbid maybe start a price war in a duopoly? Nobody (of those 2 relevant players) wants that.
Disagree with what? Sounds like you just agreed with wanting to see “RDNA 3 upgrades in the $400–$600 range.”

Which we probably will, given the 7900 XT is selling at $780. Knock off 4GB of VRAM and use a smaller GCD, and a 7800 for $600 (maybe less) is certainly viable. A 7700 with 12GB will hopefully land at $400.
I'll offer a different perspective, maybe 🤔 The Catcher in the Rye was replaced by Harry Potter as the most common book children read (ask any literature teacher), and now that a finally somewhat solid Harry Potter game has been released, whoa oh my, it eats even more than 12 gigs of VRAM, sometimes even at 1080p. Do you want a 450-500€ GPU, replacing the old one, maybe it fits, maybe it doesn't, maybe you have time to research all those cards, I don't know... or do you prefer a 500€ console to plug and play with no issues whatsoever? Nvidia's push towards an average GPU price equal to the console price is a trap. $600 (720€) for a 1440p 16GB GPU? Is that supposed to be a good deal or something? :D
 

sherhi

Distinguished
Apr 17, 2015
80
52
18,610
I get your point, Jared. But RDNA3 has the same efficiency as RDNA2 (look at 7600 vs 6600 XT), with very few new features, same ray tracing. RTX 4000 improved on every feature, brought new ones, and even almost doubled efficiency. To me, Nvidia won this round. If the Radeon cards are the same, we might as well buy what we have today (be it 6- or 7-series), or wait for the next gen. The mid-range Radeons just don't matter much anymore.
Kind of true, but the 7600 vs 6600 does offer a reasonable performance increase; if that translates into even a 7600 XT vs 6600 XT, then the 7600 XT could be interesting (if prices go waaaay down, obviously).
 

Warrior24_7

Distinguished
Nov 21, 2011
31
18
18,535
Hopefully never! Hopefully they never release these cards! The market is saturated and GPUs aren’t selling well right now. Especially AMD GPUs.
 

Deer87

Distinguished
Apr 10, 2015
52
6
18,545
I get your point, Jared. But RDNA3 has the same efficiency as RDNA2 (look at 7600 vs 6600 XT), with very few new features, same ray tracing. RTX 4000 improved on every feature, brought new ones, and even almost doubled efficiency. To me, Nvidia won this round. If the Radeon cards are the same, we might as well buy what we have today (be it 6- or 7-series), or wait for the next gen. The mid-range Radeons just don't matter much anymore.
If I recall correctly, the 7600 is on an older node in order to keep costs down. Can't remember the nanometer specs, but it should be closer to the 6000 series in raw chip-tech terms.
The 7700 and 7800 should be more refined, so I'm hoping for better efficiency; otherwise I'm reluctantly going to give in and buy a 4070. That 200 W mark is really impressive compared to its performance.
 
  • Like
Reactions: salgado18
If I recall correctly, the 7600 is on an older node in order to keep costs down. Can't remember the nanometer specs, but it should be closer to the 6000 series in raw chip-tech terms.
The 7700 and 7800 should be more refined, so I'm hoping for better efficiency; otherwise I'm reluctantly going to give in and buy a 4070. That 200 W mark is really impressive compared to its performance.
Gains from TSMC N5, yes. Losses from chiplet architecture, also yes. That's my guess.
 
  • Like
Reactions: Deer87

Deleted member 2947362

Guest
The way things are, and if it lasts too long, they might be forced into re-branding some of the stock. I'm not sure even releasing a killer consumer-grade card right now would help much, looking at the bigger picture.

Those who have been sitting on the fence would jump on them like lightning, but I think the downturn is a mixed bag of people's finances and the everyday cost of living, so anything to do with upgrading any part of their PC is very, very low on the list.

Along with those who have already upgraded and know it's all they need at this time. It's nice to upgrade to the latest tech, but if what you have works and does what you need, then you don't need or have to upgrade, even more so if you were burnt by the price of the card when you bought it.

Of course, there will always be a percentage of people for whom money is not the issue.

I think it's going to take longer for some to recover from financial struggles, but I don't know, I'm no expert on these matters; I just go by looking at what's going on in the world, and times look hard for many.

But this is just an opinion from someone who comes from the lower end of the financial rat race, so I have no idea really, lol.
 

T1125P

Commendable
Jul 10, 2021
6
2
1,515
The issue is all 3 GPU companies are not releasing what we want, so they can shove what they have up their asses 🤣 Nvidia for making a 4070 with 12GB instead of 16, Intel for not having anything that can at least compete with a 4060/4070, and now AMD for constantly pushing their older RDNA2 GPUs; no, we don't want that crap. If they released a 7700 XT/7800 XT at reasonable prices, people would buy. They all just don't get it. I'm still waiting for a 7800 XT to retire my 3070 and cut ties with those greedy assholes at Nvidia.
 
  • Like
Reactions: Deer87

Deer87

Distinguished
Apr 10, 2015
52
6
18,545
The issue is all 3 GPU companies are not releasing what we want, so they can shove what they have up their asses 🤣 Nvidia for making a 4070 with 12GB instead of 16, Intel for not having anything that can at least compete with a 4060/4070, and now AMD for constantly pushing their older RDNA2 GPUs; no, we don't want that crap. If they released a 7700 XT/7800 XT at reasonable prices, people would buy. They all just don't get it. I'm still waiting for a 7800 XT to retire my 3070 and cut ties with those greedy assholes at Nvidia.
Yeah. That said, if the 4070 had cost around $450-500, I would have been all over it.
 

T1125P

Commendable
Jul 10, 2021
6
2
1,515
Yeah. That said, if the 4070 had cost around $450-500, I would have been all over it.
I might have bought it too at that price, but that 12GB is what's stopping me. Plus I'm trying to leave Nvidia 😁 16GB at $499-$549 would have been sold out. I'm sure Nvidia knows that.
 
  • Like
Reactions: Deer87

Deer87

Distinguished
Apr 10, 2015
52
6
18,545
I might have bought it too at that price, but that 12GB is what's stopping me. Plus I'm trying to leave Nvidia 😁 16GB at $499-$549 would have been sold out. I'm sure Nvidia knows that.
Same here. I have a 3060, but I've promised myself an upgrade now that the shortage is over. I'm waiting for the 7700 to drop, but I've gotta admit, if AMD can't deliver, I'm kinda jaded at this point. Plus, I have a Node 304 case I wanna get back into, so my options are a bit limited, and that Inno3D 4070 is a really sleek-looking 2-slot option.
 

jackt

Distinguished
Feb 19, 2011
192
19
18,685
In my country those cards offered interesting value; the 6700 was often cheaper than the 6600 XT or even the 6650 XT, which itself was sometimes cheaper than the 6600 XT... but does anyone know why they are doing a mid-gen refresh while still producing the regular models? Or has there been a mountain of regular models in stock up to this day? If not, how is it possible to have a refreshed model (for example, the 6650 XT from May 2022) still mixed up with the regular model (the 6600 XT from August 2021) even a year after its release?
Easy: the regular 6600 was a mistake, for the timing, the pricing, the RAM, the node, the unwanted AI garbage. Now they try to fix those mistakes...

Of course, they can now make new mistakes...