AMD RDNA 4 Radeon RX 9000-series GPUs revealed: Targeting mainstream price and performance with improved AI and ray tracing

Keeping up with AMD's branding is exhausting: HD 7xxx series, then R9 2xx series, then RX 5xxx series, and now RX 90xx series. Now it's also going to be easy to confuse AMD's series with Nvidia's.
 
Who will end up with the best graphics cards when the dust settles?

Nvidia will end up with the best graphics cards simply because AMD and Intel are not competing at the high end of the market. The question should really be about who offers the best value for the dollar at 1080p, 1440p, and 4K. There, there is competition.
 
“improved” ray tracing
Improved? It should be ground-breaking! Supercharged! Revolutionary! They needed to double or triple it to catch up! If it's just "improved," their cards will still look worse than Nvidia's. Are they hoping that ray tracing just dies off from lack of support, like PhysX? I'm very concerned now.
 
Well, that was a lackluster announcement. We already knew AMD cards were coming soon. Any excitement I had to see what their R&D department has been up to was quickly squashed. "Announcing our new cards!! They're faster and better than last time! We hope you buy, thanks for watching!"

Hopefully they can give us some real-world details before CES is over. Maybe they're just waiting for Nvidia's announcement before they throw out some numbers and details about what's changed and why we might be interested in their cards. With path tracing showing off what it can do in several titles, I really want to see more major games move this way. A simple "improved ray tracing" doesn't inspire much confidence here. With RDNA 3 they also announced improved ray tracing, and it was better, but it still lagged behind Nvidia's previous-generation RTX 30-series equivalents.
 
Improved? It should be ground-breaking! Supercharged! Revolutionary! They needed to double or triple it to catch up! If it's just "improved," their cards will still look worse than Nvidia's. Are they hoping that ray tracing just dies off from lack of support, like PhysX? I'm very concerned now.

The 7900 XTX was able to match the 3090 Ti in RT. Considering it is also only their second generation of RT-capable cards, that is pretty impressive. Not to mention the R&D budget difference between the two.

https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/34.html
 
Who will end up with the best graphics cards when the dust settles?

Nvidia will end up with the best graphics cards simply because AMD and Intel are not competing at the high end of the market. The question should really be about who offers the best value for the dollar at 1080p, 1440p, and 4K. There, there is competition.
When we say “best” we don’t mean “fastest” — we mean all those other elements like value, efficiency, and features. Without pricing and performance data, we don’t have much to go on.
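For what it's worth, "value" is easy to quantify once the numbers exist. A throwaway Python sketch, with all figures being hypothetical placeholders until reviews and pricing actually land:

Code:
# Simple value metrics: average fps per dollar and per watt at a given
# resolution. All numbers below are hypothetical placeholders.
cards = [
    ("Card A", 90.0, 550, 260),   # (name, avg 1440p fps, price USD, watts)
    ("Card B", 100.0, 750, 285),
]

for name, fps, price, watts in cards:
    print(f"{name}: {fps / price:.3f} fps/$ | {fps / watts:.3f} fps/W")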
 
Considering it is also only their second generation of RT-capable cards, that is pretty impressive.
Not really. Intel's first-generation RT matched the equivalent RTX 30 series, and that architecture is a mess. AMD simply didn't dedicate enough die space to RT, and now they likely will. I'd be surprised if AMD didn't match Nvidia in RT performance with these RDNA 4 based cards.
 
Improved? It should be ground-breaking! Supercharged! Revolutionary! They needed to double or triple it to catch up! If it's just "improved," their cards will still look worse than Nvidia's. Are they hoping that ray tracing just dies off from lack of support, like PhysX? I'm very concerned now.
Nah, probably 25% better or so, meaning it likely matches the cards it's paired up against, namely the RTX 4070 Ti to RTX 4080 non-Super for now, so "improved" is apt. If Nvidia doesn't have a massive increase in ray tracing performance on the mid-tier cards this generation, then it should put them roughly on par. That would mean a Radeon 9070 should perform similarly to an RTX 5070, and if they double down on pricing, they may have a chance at gaining some market share this generation. That means it can't be more than $500 to $600, preferably closer to $500. We will see how it turns out soon enough, though. (Napkin math below.)
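Here's that napkin math spelled out, purely illustrative: the baseline index is made up, and the 1.25 factor is just the guess above, not anything AMD has claimed.

Code:
# Napkin math for the "~25% better RT" guess. The index values are
# made-up relative scores (RTX 4070 Ti = 100), not benchmark results.
rdna3_midrange_rt = 80.0   # hypothetical RDNA 3 midrange RT index
guessed_uplift = 1.25      # the ~25% improvement speculated above

rdna4_rt = rdna3_midrange_rt * guessed_uplift
print(f"Projected RDNA 4 RT index: {rdna4_rt:.0f} (RTX 4070 Ti = 100)")
# 80 * 1.25 = 100, i.e. roughly 4070 Ti / 4080 non-Super territory,
# which is why "improved" rather than "revolutionary" could still be enough.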
 
The lack of details in this "announcement" really makes me question the strategy. It certainly doesn't instill confidence in AMD's release. I thought the target-midrange strategy was a play for volume, which seemed like a good idea given how badly the last two generations have played out. Now I can't help but wonder if we're in the same situation as the last two generations, where AMD waits for Nvidia to set the price/performance bar and then slightly undercuts it. The alternative would be that the cards aren't really ready/finalized, and that would be a really bad thing.
 
The 7900 XTX was able to match the 3090 Ti in RT. Considering it is also only their second generation of RT-capable cards, that is pretty impressive. Not to mention the R&D budget difference between the two.

https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/34.html
(I never understood why people refer to other review sites on the Tom's Hardware forum)

https://www.tomshardware.com/reviews/amd-radeon-rx-7900-xtx-and-xt-review-shooting-for-the-top/4

The 7900 XTX is a lot faster than the 3090 Ti in raster, but a lot slower in RT.
Turning RT on cost the 3090 Ti 55% of its performance and the 4080 52%, while the 7900 XTX lost 66%.

And that's the top tier card from each. Go lower, and RT becomes unplayable on Radeons, and usable on Geforces.

Don't get me wrong, I'm an AMD fan; I only buy AMD for myself. But their GPUs have been lacking in RT for a long time, which costs them precious sales, which gives them less money for R&D, and puts them further behind.
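For anyone wanting to check those drops themselves, here's a quick Python sketch. The FPS figures are illustrative placeholders, not the actual review numbers; plug in the averages from whichever review you trust:

Code:
# Relative cost of enabling RT: drop = 1 - (RT fps / raster fps).
# FPS values below are placeholders chosen to reproduce the percentages
# above; substitute real review averages.
cards = {
    "RTX 3090 Ti": {"raster": 100.0, "rt": 45.0},   # ~55% drop
    "RTX 4080":    {"raster": 120.0, "rt": 57.6},   # ~52% drop
    "RX 7900 XTX": {"raster": 125.0, "rt": 42.5},   # ~66% drop
}

for name, fps in cards.items():
    drop = 1.0 - fps["rt"] / fps["raster"]
    print(f"{name}: {drop:.0%} of raster performance lost with RT on")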
 
(I never understood why people refer to other review sites on the Tom's Hardware forum)

https://www.tomshardware.com/reviews/amd-radeon-rx-7900-xtx-and-xt-review-shooting-for-the-top/4

The 7900 XTX is a lot faster than the 3090 Ti in raster, but a lot slower in RT.
Turning RT on cost the 3090 Ti 55% of its performance and the 4080 52%, while the 7900 XTX lost 66%.

And that's the top tier card from each. Go lower, and RT becomes unplayable on Radeons, and usable on Geforces.

Don't get me wrong, I'm an AMD fan; I only buy AMD for myself. But their GPUs have been lacking in RT for a long time, which costs them precious sales, which gives them less money for R&D, and puts them further behind.

Because we can refer to other sites. Looking at only one source does not always give you the big picture. TechSpot/HUB had the 7900 XTX trading blows with the RTX 3090 Ti with RT on. The point is, AMD has a smaller budget and has been doing RT for less time, yet managed to make their second-gen RT competitive with Nvidia's second gen. That is no small feat.
 
Laurel resting is what you're seeing...

No real RX 9950 XTX RDNA 4 flagship... instead a lower-tier competitor that copies Nvidia's naming...
No real R9 9950X3D Zen 5 flagship... as they didn't utilize the cache for all 16 cores...

These are caretaker generations.

So now we put off major purchases until RDNA 5 and Zen 6? Just teasing... but lackluster R9 and RX offerings for Zen 5 and RDNA 4 have me considering the integrated Strix Halo Ryzen AI Max+ Pro 395 for its Zen 5/RDNA 3.5 implementation.
 
Because we can refer to other sites. Looking at only one source does not always give you the big picture. TechSpot/HUB had the 7900 XTX trading blows with the RTX 3090 Ti with RT on. The point is, AMD has a smaller budget and has been doing RT for less time, yet managed to make their second-gen RT competitive with Nvidia's second gen. That is no small feat.
I don't really care if people want to link other sites, but you have to know what you're linking. There's a very wide spread of "RT game" performance data.

At one end are games that hardly benefit from RT, where you could turn it off and not really notice the difference in shadows. AMD does pretty well there. In the middle are games like Cyberpunk 2077 (RT Ultra) and Control that do more RT effects, and AMD starts to lag behind. And then at the extreme are the full RT games where AMD does very poorly.

Alan Wake 2, Black Myth: Wukong, Cyberpunk 2077 (2.0 and later), and Minecraft all use "full RT," and the 7900 XTX does quite poorly in all of them. Basically, it tends to trail the RTX 3080 10GB in all those games, sometimes by quite a lot. So, with 41% more RT units (Ray Accelerators vs. RT Cores) and ~40% higher clock speeds, it's delivering slightly lower overall throughput than the 3080.
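To make that comparison concrete, here's a rough back-of-the-envelope normalization. The unit counts are spec-sheet numbers; the clocks are approximate boost clocks, and the fps values are hypothetical stand-ins for a full-RT result:

Code:
# Rough normalization of RT throughput per RT unit per clock.
# Unit counts are spec-sheet figures; clocks are approximate boost
# clocks; fps values are illustrative stand-ins for a full-RT game.
xtx_units, xtx_clock_ghz = 96, 2.5      # RX 7900 XTX Ray Accelerators
n3080_units, n3080_clock_ghz = 68, 1.8  # RTX 3080 10GB RT Cores

xtx_fps, n3080_fps = 30.0, 32.0  # hypothetical full-RT results

xtx_per_unit_clock = xtx_fps / (xtx_units * xtx_clock_ghz)
n3080_per_unit_clock = n3080_fps / (n3080_units * n3080_clock_ghz)

print(f"7900 XTX: {xtx_per_unit_clock:.3f} fps per RA-GHz")
print(f"RTX 3080: {n3080_per_unit_clock:.3f} fps per RT-core-GHz")
# With ~41% more units and ~40% higher clocks but slightly lower fps,
# per-unit-per-clock throughput works out to roughly half of Nvidia's.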

Based on that, it looks like AMD's 2nd generation RT hardware is more in line with Nvidia's 1st generation RT hardware — at least in terms of RT calculations per core per clock. For less demanding RT games, AMD looks competitive, but those are the games where I'm not even sure people would generally enable RT. Actually, I'm not sure how many people (who don't have a 4090 or at least a 4080) play with RT enabled. That would be an interesting statistic to get.
 
So, they announced a whole lot of nothing...
More worrying is that they don't even dare to release performance numbers relative to Ada, and the absence of expected pricing smells like they're hard stuck on going "$100 below Nvidia's same-number-class GPU (the 5070 Ti)"...

Sounds like they managed to stick to tradition and screw up their launch. Hopefully it's all paranoia and I'll be proven wrong in a few weeks.