News AMD explains the missing RDNA 4 GPUs in its CES 2025 livestream

"Azor also noted that the company didn't want to feed a narrative that it doesn't care about RDNA 4 because it only spent five minutes on the topic".

I don't know if these guys even realise it, but they're feeding this very narrative by not mentioning RDNA 4 at all.
 
How dumb does their PR team think people are?

McAfee explained that the keynote was only 45 minutes long, and with a slew of other products to announce, the company simply couldn't dedicate enough time in the presentation to the new GPUs to correctly frame the product.

Really? Someone elsewhere mentioned that the word AI was used 153 times in the keynote. No one tunes into a CES keynote because they want to hear about AI. No one. People watch major trade show keynotes to see big product announcements. There hasn't been a new GPU architecture released in over two years; that's what people wanted to see. The enthusiast community does not care about the millionth AI-enabled CPU announced over those two years.
 
How dumb does their PR team think people are?



Really? Someone elsewhere mentioned that the word AI was used 153 times in the keynote. No one tunes into a CES keynote because they want to hear about AI. No one. People watch major trade show keynotes to see big product announcements. There hasn't been a new GPU architecture released in over two years; that's what people wanted to see. The enthusiast community does not care about the millionth AI-enabled CPU announced over those two years.
Well, it's likely the not-so-techy boss of the marketing department: AI is the hot thing in the market right now, we will focus solely on AI; that mid-range GPU? Forget it, it won't sell and dominate...

This sort of behavior is kinda everywhere, like crypto and NFTs a few years ago... major firms join the party very late with everything marketed towards it and... lose money...
 
I watched a video by Hardware Unboxed, and the guy on there says they really want to get the launch right on the GPUs. I also noticed the Nvidia keynote was later than AMD's. I kind of wonder if the short piece about their GPUs was to say, hey, we have them, but we want to see what Nvidia unveils before we show our hand. If AMD had come out with too high a price, for example, maybe Nvidia would have changed something at the last minute? So I suppose it gives AMD room to wait, and then they can do a bigger presentation on just the graphics if they want.

One interesting thing: the guy on the Hardware Unboxed video did say they had talked to an AMD exec in a Q&A session, and the exec apparently said all of the performance leaks are wrong and that board partners don’t even have drivers yet. So who knows where things will actually land.
 
Nvidia announced pricing for their halo cards, the 5080/90, which AMD has already stated they aren't going to compete with. There won't be $1200 AMD cards this generation.
uhhh...?

[image attachment]
 
Nvidia announced pricing for their halo cards, the 5080/90, which AMD has already stated they aren't going to compete with. There won't be $1200 AMD cards this generation.
There won’t be $500 AMD cards this gen.
If the 5070 is on par with a 4090 at $549, AMD is in some serious trouble. They have nothing, including RDNA 4, that can match a 4090.

I see some serious price drops on all previous gen cards coming as well.
 
That's a lie of omission, like those frame generation slides from the 40-series lower-SKU launch. It sounds like the new DLSS frame generation is 50-series exclusive, so that would be how the 5070 "matches" the 4090.

https://www.nvidia.com/en-us/geforce/news/dlss4-multi-frame-generation-ai-innovations/
You're right, except for the part about lying by omission. Right after Jensen said the 5070 was as fast as the 4090, he said that was only possible because of AI. So it was pretty clear they were using some sort of DLSS to make that claim possible.

That said, the prices still look pretty reasonable. The 5090 is at the lowest price I thought possible. Someone in another thread was hypothesizing a $1500 5080. Looks like we'll get legit 4090 performance for $1000 plus AIB markup.
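To see how a slower card can "match" a faster one on a headline FPS number once frame generation is in play, here's a minimal sketch of the arithmetic. The render rates are made-up, and the multipliers just assume one generated frame per rendered frame for the older frame generation versus three for the new multi frame generation:

```python
# Minimal sketch: how frame generation inflates a headline FPS figure.
# All numbers below are illustrative assumptions, not benchmarks.

def effective_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Displayed frame rate when each rendered frame is followed by
    `generated_per_rendered` AI-generated frames."""
    return rendered_fps * (1 + generated_per_rendered)

# Hypothetical render rates purely to show the arithmetic:
fast_card_rendered = 80   # e.g. a last-gen flagship rendering natively
slow_card_rendered = 40   # e.g. a new card with half the raw render speed

print(effective_fps(fast_card_rendered, 1))  # 160 "FPS" with 2x frame gen
print(effective_fps(slow_card_rendered, 3))  # 160 "FPS" with 4x multi frame gen
```

Same displayed number, very different amounts of real rendering behind it, which is exactly why the "as fast as a 4090" line needs the AI asterisk.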
 
Tbh, this makes perfect sense to me. 🤷 AMD didn't want a repeat of RDNA 1/RX 5000 where they had to fire first and then pull a "jebait" price cut in response to Nvidia's RTX 2000 Super pricing. AMD isn't the market leader. They don't set general pricing conventions. Nvidia does. Thus, it's INFINITELY better from a business point of view to wait until AFTER Nvidia confirms Blackwell performance & pricing before they set their own RDNA 4 pricing in accordance.

Otherwise AMD simply can't guarantee they'll win in price/performance (or at least win by ENOUGH), which is RDNA 4's entire reason to exist.

Azor's explanation might sound slimy and ridiculous at first glance, but if you actually think it through, it's just basic common-sense strategy for a company that isn't the current market leader.
 
Massively disappointing. Radeon will never miss the chance to miss a chance. Five minutes is wayyy more than enough for that presentation.
 
Nvidia announced by omission that ONCE AGAIN, CARDS ARE NO FASTER THAN BEFORE. But the gimmicks that allow Nvidia buyers to have bragging rights over their fake pixels and fake frames are up over 87%! So whoop-de-do! There's a reason why the prices went down: less is LESS. If you believe that Nvidia 5000 cards provide value for money, go ahead and buy this tomfoolery; a fool and their money are soon parted...
 
Nvidia announced by omission that ONCE AGAIN, CARDS ARE NO FASTER THAN BEFORE. But the gimmicks that allow Nvidia buyers to have bragging rights over their fake pixels and fake frames are up over 87%! So whoop-de-do! There's a reason why the prices went down: less is LESS. If you believe that Nvidia 5000 cards provide value for money, go ahead and buy this tomfoolery; a fool and their money are soon parted...
You are completely right, of course! When you only get more fake pixels instead of real ones, that can't be worth the extra money, right?

Except that when those fake pixels look good enough in the heat of a game yet cost half the hardware to produce, paying a little extra money vs. "more real pixels" seems like good bang for the buck.

And then perhaps you realize that all of these real pixels are really also fake, just created in a different manner. It's an artificial world that is being projected onto your screen, currently built from triangles, very unlike most people, monsters, or landscapes, and only painted and bumped with ever more sophisticated shaders.

Nvidia helped build that illusion and thus perhaps understands better that it is only an approximation; they experimented with plenty of other rendering approaches before most of the industry settled on the current variant, which has carried very far but is now hitting scaling and realism limits.

Remember that Nvidia's hinted long-term direction is to move entirely away from those shady triangles and towards an "Illusion Processing Unit" which renders a game at practically infinite resolution or color depth using generative AI from a scene description. And a lot more processing power would go into characters or a far more dynamic, "realistically faked" environment than today, where the need to have those triangles designed by a human digital artist really limits what you can offer.

They aren't really just trying to find ever more clever ways to cheat gamers by pinching on hardware; they are trying to transform how the fantasy (or digital twin) world on your screen is actually produced.

And that is a race that nobody who doesn't understand the direction they're going will be able to catch up in...
...unless Nvidia's approach turns out to be fundamentally flawed.

So far they're taking baby steps, and what they show at each step works amazingly well.

And the biggest advantage of their approach is that it lets them break the diminishing returns of resolution scaling, where raster cost grows with the square of linear resolution: 4K raster can be done now, but 8K means 4x the effort... unless you have tensor-based AIs paint instead of GPUs render.
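To put numbers on that 4K-to-8K point, here's a minimal sketch of the pixel-count arithmetic; the resolutions are the standard ones, and the assumption that raster cost scales roughly with pixels shaded is mine:

```python
# Quick arithmetic behind "8K means 4x the effort":
# pixel count grows with the square of linear resolution, and raster cost
# is (roughly) proportional to the pixels shaded.

resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base_pixels = resolutions["4K"][0] * resolutions["4K"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / base_pixels:.2f}x the 4K workload")
```

Running it shows 8K is exactly 4.00x the 4K pixel count, which is the scaling wall the AI-painted approach is meant to sidestep.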