News: AMD may unveil next-gen Radeon RX 8000-series GPUs at CES 2025, according to leaker


Deleted member 2731765

Guest
Leaker says the first RDNA 4-based GPUs are set to be announced at CES 2025.

Don't trust any leaker. He just posted for click-bait. There is no news on any firm release date yet, and this is all speculation for now. And you yourself agree that a CES 2025 release "makes little commercial sense".

Amen.

For AMD's RDNA 3 series, a shader engine includes 16 compute units containing 1,024 stream processors (capable of 2,048 FP32 FMAs, or 4,096 FP32 operations, per clock). If AMD's RDNA 4 maintains this shader engine and compute unit structure, and the rumored four shader engines pan out, Navi 48 will have 4,096 stream processors.
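As a quick back-of-the-envelope sketch (in Python, with the rumored figures treated purely as assumptions):

```python
# Rumor-based Navi 48 math; none of these figures are confirmed by AMD.
CUS_PER_SHADER_ENGINE = 16   # RDNA 3 layout, assumed to carry over
SPS_PER_CU = 64              # 1,024 SPs per shader engine / 16 CUs
RUMORED_SHADER_ENGINES = 4   # leaked Navi 48 figure

stream_processors = RUMORED_SHADER_ENGINES * CUS_PER_SHADER_ENGINE * SPS_PER_CU
print(stream_processors)  # 4096

# Theoretical FP32 throughput at a purely illustrative clock speed.
# Dual-issue FMA = 4 FP32 ops per SP per clock (the 2,048/4,096 note above).
clock_ghz = 2.5  # placeholder, not a leaked spec
print(f"{stream_processors * 4 * clock_ghz / 1000:.1f} TFLOPS FP32")  # ~41.0
```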

At this stage, guessing any performance metrics is just a shot in the dark.

AMD's strategy with its RDNA 4 GPU architecture is very different from that of RDNA 2 or RDNA 3. Both previous-gen graphics architectures had SKUs spanning from the high-end to the entry-level segment.

This doesn't seem to be the case with RDNA 4, as its top GPU die was canceled in favor of mid-tier chips. AMD themselves confirmed they aren't focusing on the enthusiast and high-end GPU market segments.

According to some early patches, AMD's "GFX12" family, the internal codename for the RDNA 4 GPU lineup, WAS reportedly going to feature up to 9 Shader Engines on the flagship SKU.

This would have given a 50% uplift in Shader Engine count over the Navi 31 chip, which comes with 6 Shader Engines. But sadly, this is no longer the case.

For reference: Navi 31 features 16 compute units per Shader Engine (two compute units per WGP, so 8 WGPs per SE) for a total of 96 compute units.
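The same napkin math shows what the canceled flagship would have looked like, assuming Navi 31's per-SE layout carried over (again, all rumor rather than confirmed spec):

```python
# Navi 31 vs. the rumored (canceled) 9-SE flagship; speculative figures.
CUS_PER_SE = 16

navi31_ses = 6
flagship_ses = 9  # early "GFX12" patch figure

print(navi31_ses * CUS_PER_SE)                   # 96 CUs, the shipping Navi 31
print(flagship_ses * CUS_PER_SE)                 # 144 CUs it might have had
print((flagship_ses - navi31_ses) / navi31_ses)  # 0.5, i.e. the quoted 50% uplift
```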


 
Last edited by a moderator:

TechyIT223

Prominent
BANNED
Jun 30, 2023
277
66
760
Based on leaks so far, the top RDNA 4 part is a 6800 XT+ with better RT.

AMD has been launching the same GPU for the last 5+ years, at the same performance and likely the same price.

6800 XT - $500+
7800 XT - $500+
8800 XT - $500+
 

DS426

Upstanding
May 15, 2024
254
189
360
Hard to imagine they'll miss the holiday season, even if market availability isn't until December 15th or something crazy like that (RDNA 3's 7900 XTX and XT launched on December 13th, if you recall). Obviously, a launch prior to Black Friday is ideal, but I'd certainly rather see AMD launch RDNA 4 later and get it right than rush it out the door.

While I'd love to see high-end parts, most gamers indeed don't shop in that part of the market. AMD definitely needs to claw back some market share here, as nVidia is now at a wild 88% of discrete GPU market share, granted I'm sure there are some caveats to that exact figure.
 
  • Like
Reactions: Jagar123
Don't trust any leaker. He just posted for click-bait. There is no news on any firm release date yet, and this is all speculation for now. And you yourself agree that a CES 2015 release "makes little commercial sense".

Amen.
I'm curious as to why the RX 8000-series announcement wouldn't make sense at THE Consumer Electronics Show. Is there historical precedent of AMD, Intel, or nVidia not announcing things there?

Also, you mean CES 2025, right?

Regards.
 

Deleted member 2731765

Guest
Also, you mean CES 2025, right?

Of course, a typo. Fixed.


I'm curious as to why the RX 8000-series announcement wouldn't make sense at THE Consumer Electronics Show. Is there historical precedent of AMD, Intel, or nVidia not announcing things there?

No, you misunderstood me there. Wait, I will explain more clearly in my next post. I was just implying that any release-date info is pure speculation for now.
 
Last edited by a moderator:
  • Like
Reactions: -Fran-
People still believe it will be $500 US, lol.
If the launch is 2025+, Nvidia will eat AMD.
How would that be any different than what has already been going on for years? The path for Radeon to be a success is very simple and obvious but AMD, for some reason, won't take it. This is especially ridiculous because it would be the exact same tactic that AMD used to make Ryzen surpass Intel Core. Even if they couldn't completely supplant GeForce, they'd at least triple the marketshare that they currently have.

"Stupid is as stupid does."
- Forrest Gump, 1994
 

jlake3

Distinguished
Jul 9, 2014
135
199
18,760
The path for Radeon to be a success is very simple and obvious but AMD, for some reason, won't take it. This is especially ridiculous because it would be the exact same tactic that AMD used to make Ryzen surpass Intel Core.
Care to explain what that "very simple and obvious" path to success is?

Intel got complacent on core counts while simultaneously stuck on an aging process node, AMD bet the company on a new microarchitecture and chiplets and was thus poised to shake up the status quo, and both of those events happened to align.

And depending on where you check and who you ask, Intel Core still outsells Ryzen by at least 2-to-1, if not 3-to-1.

Nvidia is using the same cutting-edge TSMC nodes as AMD, has huge brand loyalty and software lock-in, and is by all appearances on top of their game, making them seemingly much less vulnerable than Intel was.
 

Notton

Commendable
Dec 29, 2023
859
754
1,260
Care to explain what that "very simple and obvious" path to success is?

Intel got complacent on core counts while simultaneously stuck on an aging process node, AMD bet the company on a new microarchitecture and chiplets and was thus poised to shake up the status quo, and both of those events happened to align.

And depending on where you check and who you ask, Intel Core still outsells Ryzen by at least 2-to-1, if not 3-to-1.

Nvidia is using the same cutting-edge TSMC nodes as AMD, has huge brand loyalty and software lock-in, and is by all appearances on top of their game, making them seemingly much less vulnerable than Intel was.
Almost everyone understands it is price.
The 30% price/performance (raster) pricing model that AMD uses is not convincing enough.
In 2024, it is pretty clear people want Ray-Tracing and image quality.
When you look at FSR3.x and RT on AMD, even US$450 for a 7800XT feels like too much.
I bet a 7800XT would sell like hotcakes if it was priced at $350.
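As a rough illustration of that value argument (every number below is invented for the sake of the example, not a benchmark):

```python
# Toy price-per-frame comparison; prices and fps are invented placeholders.
cards = {
    "hypothetical GeForce": {"price": 600, "avg_fps": 100},
    "7800 XT at $450":      {"price": 450, "avg_fps": 95},
    "7800 XT at $350":      {"price": 350, "avg_fps": 95},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['avg_fps']:.2f} per frame")
# $6.00 vs. $4.74 vs. $3.68 per frame: the raster value edge only gets
# compelling at the lower price point, which is the post's argument.
```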
 

TechyIT223

Prominent
BANNED
Jun 30, 2023
277
66
760
Going by the leaks so far, leakers were always wrong, besides AdoredTV with Zen 2.

I was talking about the MSRP values of AMD's previous-gen Ryzen chips, BTW. Those were actually predicted correctly by some users, though.

But yeah, leaker predictions are always hit or miss. Most of them are just publicity stunts; they wanna grab some attention.
 

KnightShadey

Reputable
Sep 16, 2020
147
88
4,670
Care to explain what that "very simple and obvious" path to success is?

Almost everyone understands it is price.

I doubt that it's price.

(Edit: well, looking at his comment in the Brand recognition thread, I guess it is solely based on price. Oh, well. 🤷)
Agree to disagree...
Edit 2: After his follow-up, it looks like it wasn't just based on price... so back to agree. 🤣


Price only gets you somewhere if cost is equally favourable, and it is unlikely to let you upset other markets the way Threadripper, Epyc, and the MI series have vs Intel.

Chiplets vs. monolithic was the path to success in the past, and it is likely the best strategy vs. nVidia as well, with AMD leveraging their experience in the area against their deficit at the top end of single-chip solutions.

Chiplets can compete well with monolithic solutions on performance; they just need proper design and optimization, which is their biggest drawback. Without that work they can't compete beyond 1-to-1.

Where chiplets have a great advantage over monolithic chips is that, by default, you get more dies per wafer at a lower defect rate, so the cost of production per pixel/flop/token/etc. drops, and success in combining them can make for a cheaper competitive (or even superior) part. Additionally, a more modular solution allows you to direct that production toward multiple SKUs, further compounding the benefits of cost reductions and economies of scale. THAT might then allow you to price a 1-to-1 variant below the competition, because your costs on the same fab/node are spread out over more SKUs (and hopefully more chips, if they're equally attractive to customers).
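To make the yield point concrete, here's a toy comparison using a simple Poisson defect model; the defect density and die areas are invented for illustration, not real AMD/TSMC figures:

```python
import math

def poisson_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Fraction of dies expected to have zero defects (simple Poisson model)."""
    return math.exp(-area_mm2 * defects_per_mm2)

D0 = 0.001          # defects per mm^2; hypothetical
mono_area = 600     # one big monolithic die, mm^2; hypothetical
chiplet_area = 150  # one of four chiplets covering the same silicon; hypothetical

print(f"monolithic {mono_area} mm^2 yield:  {poisson_yield(mono_area, D0):.1%}")    # ~54.9%
print(f"per-chiplet {chiplet_area} mm^2 yield: {poisson_yield(chiplet_area, D0):.1%}")  # ~86.1%
# Smaller dies lose less wafer per defect, which is the cost edge described above.
```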

The iGPU situation is redefining the middle ground too; it's another area that AMD needs to focus on (and is), and it could be one that nV can't follow into unless ARM makes more of an impact. Those options undermine the profitable low end.

Pricing alone is more a reaction than a strategy. Without directly affecting costs it's hard to beat the competition; the only way to beat them at the top is by changing the options, and the way to beat them overall is to change the equations behind those options.

Of course for both AMD and nV, the GPU market is more of a testing ground, since the vast majority of money and focus is elsewhere, on products that benefit from the developments in this segment. So still important, just not as important as before.
 
Last edited:

KnightShadey

Reputable
Sep 16, 2020
147
88
4,670
I'm curious as to why the RX8000 series announcement wouldn't make sense at THE Consumer Electronics Show event? Is there historical precedent of neither AMD, Intel or nVidia not announcing things in there?

Well, it's not that easy. With nVidia and Samsung usually opening CES on press day or day 1 as the early 'main attractions', Intel and AMD usually have something later. All of them are pretty fun to attend, and often have entertaining guests (the oddest one for me was still Sarah Silverman for Cisco's smart IoT launch [got a huge coffee-table IoT book out of that, though]).

Should AMD speak before nVidia, that would be somewhat unusual, but it would be the desired spot when announcing something 'less than', like nV at Computex before the AI PC launches, etc.

The problem with CES is that any product launch by the traditional sources, no matter how 'game changing', is very likely already well known to the fans via leaks, and even more likely to lose wider media attention to the WiFi Bluetooth blockchain IoT ... AI-enabled bra, spork, or toilet that may never see the light of day. I remember how Audi and BMW each stole media attention with self-parking cars, two years apart, over a decade ago, and this tech still isn't at the level demoed (which promised you could leave your car at the mall entrance, let it find a spot in the parkade, and have it return to you at the exit when you're ready to leave).

I don't doubt that AMD will do a wide public launch (even a paper one if necessary) at CES, but likely as part of a wider overall group of products like Strix Halo (hopefully not that late for either), Fire Range, etc., or an X1 refresh. It would be the perfect part to accompany a PS5 Pro refresh (and Sony is another usual-suspect early keynote presenter), but the holiday-buying situation would make it look more of a miss than a win. Alone, it's unlikely to be just an RDNA 4 launch; as part of a bunch of products, or as the announcement of the family of products, it makes sense.
Especially since the high end is less dependent on holiday purchases and you would want less volume pressure on rarer new parts, whereas a mid-range follow-up by summer makes sense once volume is achieved, with entry-to-mid-level volume for Xmas.

That's my guess, based on experience, but still a dart-board guess at this point.
 

jlake3

Distinguished
Jul 9, 2014
135
199
18,760
Almost everyone understands it is price.
The 30% price/performance (raster) pricing model that AMD uses is not convincing enough.
In 2024, it is pretty clear people want Ray-Tracing and image quality.
When you look at FSR3.x and RT on AMD, even US$450 for a 7800XT feels like too much.
I bet a 7800XT would sell like hotcakes if it was priced at $350.
"We lose money on every unit, but we make up for it in volume!"

That's what I kinda assumed the "very simple and obvious" answer was going to be: discount by any amount necessary to make the market share number go up. While I don't know what the BoM cost of a 7800 XT is, based on AMD's latest investor statements it looks like their gaming division doesn't have the kind of margins to lop 20%+ off the price of their cards and not be operating in the red.
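To put the margin point in numbers (with a purely hypothetical all-in cost, since the real BoM isn't public):

```python
# Hypothetical margin math; the $370 all-in cost is invented for illustration.
msrp = 450.0
cost = 370.0  # guessed BoM + assembly + channel cost, NOT a real figure

def gross_margin(price: float, unit_cost: float) -> float:
    return (price - unit_cost) / price

print(f"at ${msrp:.0f}: {gross_margin(msrp, cost):+.1%}")              # +17.8%
print(f"at ${msrp * 0.8:.0f}: {gross_margin(msrp * 0.8, cost):+.1%}")  # -2.8%, in the red
```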
 
  • Like
Reactions: Thunder64

Notton

Commendable
Dec 29, 2023
859
754
1,260
"We lose money on every unit, but we make up for it in volume!"

That's what I kinda assumed the "very simple and obvious" answer was going to be: discount by any amount necessary to make the market share number go up. While I don't know what the BoM cost of a 7800 XT is, based on AMD's latest investor statements it looks like their gaming division doesn't have the kind of margins to lop 20%+ off the price of their cards and not be operating in the red.
The thing is though, AMD doesn't have anything else to offer.

FSR3.1 vs DLSS?
DLSS is clearly better.
In fact, XeSS looks better than FSR 3.1.

Ray-Tracing?
AMD is at least one generation behind in performance.

AMD's power consumption is worse on chiplet GPUs.
 

salgado18

Distinguished
Feb 12, 2007
977
434
19,370
The thing is though, AMD doesn't have anything else to offer.

FSR3.1 vs DLSS?
DLSS is clearly better.
In fact, XeSS looks better than FSR 3.1.

Ray-Tracing?
AMD is at least one generation behind in performance.

AMD's power consumption is worse on chiplet GPUs.
FSR vs DLSS, apart from specific features like ray reconstruction, is not that relevant.

AMD hasn't caught up to Nvidia's 3000 series in ray tracing.

Good power consumption is a nice thing to have, but bad power consumption is manageable.

What they need is to at least get close in ray tracing (maybe leveraging chiplets to make it happen), keep up the good memory offerings (a weak point for Nvidia), and sell slightly cheaper. That will keep them in the game against not only Nvidia but also Intel; otherwise they'll end up sinking hard in the GPU sector.
 

TechyIT223

Prominent
BANNED
Jun 30, 2023
277
66
760
Ray tracing is the least of my concerns, though.

I just want a pure rasterized gaming-performance GPU. Can next-gen RDNA 4 provide a better price-vs-performance ratio than both Intel Battlemage and RTX 50-series cards?
 

Giroro

Splendid
If AMD were smart, they would be pumping every single R&D dollar into making ROCm even slightly functional/useful on PC, and much better on Linux, because clearly the open-source AI community isn't very interested in spending years doing that work for free when they can just buy Nvidia and be up and running, with at least 3x the performance, in a day.
 

Notton

Commendable
Dec 29, 2023
859
754
1,260
Ray tracing is the least of my concerns, though.

I just want a pure rasterized gaming-performance GPU. Can next-gen RDNA 4 provide a better price-vs-performance ratio than both Intel Battlemage and RTX 50-series cards?
Congratulations, you are in the minority. If you want pure raster, buy Radeon all you want.
FSR vs DLSS, apart from specific features like ray reconstruction, is not that relevant.

AMD hasn't caught up to Nvidia's 3000 series in ray tracing.

Good power consumption is a nice thing to have, but bad power consumption is manageable.

What they need is to at least get close in ray tracing (maybe leveraging chiplets to make it happen), keep up the good memory offerings (a weak point for Nvidia), and sell slightly cheaper. That will keep them in the game against not only Nvidia but also Intel; otherwise they'll end up sinking hard in the GPU sector.
That's the thing though: FSR vs. DLSS is hella relevant in 2024, when a majority want to play with RT on. DLSS and frame-gen are not the joke they were at launch. They are immensely helpful in improving both image quality and frame rate in the games that support them.

Whereas FSR is: ghosting, ghosting, ghosting, and more ghosting, because it doesn't handle motion properly for some reason.
 

Deleted member 2731765

Guest
Well, it's gonna be a war between the upscalers then, more like!

Speaking of which, AMD just announced the FidelityFX SDK v1.1, which adds FSR 3.1 support along with decoupled frame-gen that works with NVIDIA DLSS & Intel XeSS. 🙄

AMD's FSR 3 Frame Generation is now decoupled from FSR upscaling, which means you are no longer required to use FSR upscaling to have frame generation work.

The new "Decoupled" approach will allow FSR frame generation to work with third-party solutions such as NVIDIA's DLSS and Intel's XeSS.

AMD also released a new Global Illumination and Ambient Occlusion suite known as Brixelizer, which aims to be an alternative to hardware-accelerated ray tracing for lower-end GPUs.

 
Last edited by a moderator: