News AMD: RDNA 4 coming in early 2025, set to deliver ray tracing improvements, AI capabilities

YSCCC

Commendable
Dec 10, 2022
565
459
1,260
AMD's chips would be more competitive if they would just simply make them bigger.
Cheaper maybe...

TBF, if the 7900 XT and XTX weren't priced so close to the Nvidia competition they would be more competitive. Their problem is that they draw more power and fall behind in RT, yet aren't significantly cheaper, so when someone wants to buy a GPU the price difference doesn't justify the weaker game optimisation and RT.
 

Notton

Commendable
Dec 29, 2023
859
754
1,260
AMD's chips would be more competitive if they would just simply make them bigger.
I doubt that.
RDNA3 isn't selling because it's too expensive for its perceived value. It has good raster performance, but most buyers in 2024 want ray tracing.
FSR+AFMF was a disaster until this year, and by then Nvidia had already answered with smoothly working DLSS + frame generation, on top of ray tracing.

Hopefully AMD finally learns its lesson and prices the 8800 XT aggressively for its feature set. When you're behind, merely catching up and offering the same price isn't going to persuade buyers.
 
Cheaper maybe...

TBF, if the 7900 XT and XTX weren't priced so close to the Nvidia competition they would be more competitive. Their problem is that they draw more power and fall behind in RT, yet aren't significantly cheaper, so when someone wants to buy a GPU the price difference doesn't justify the weaker game optimisation and RT.
Depends where you are. As someone who has just bought a 7900 XT for 1440p gaming in the UK, price was a major factor. According to the Tom's 1440p Ultra chart, the 7900 XT falls above a 4070 Ti Super and below a 4080. The '70 has less RAM, lower raster performance, and costs more. The '80 is better all round and costs a *lot* more. It was a no-brainer for me.
 

ottonis

Reputable
Jun 10, 2020
220
190
4,760
Cheaper maybe...

TBF, if the 7900 XT and XTX weren't priced so close to the Nvidia competition they would be more competitive. Their problem is that they draw more power and fall behind in RT, yet aren't significantly cheaper, so when someone wants to buy a GPU the price difference doesn't justify the weaker game optimisation and RT.
I wonder how much of a difference RT really makes nowadays. Is there a substantial body of (AAA) games that rely on RT, and do they indeed look considerably worse with fewer or lesser RT effects?
I think all this RT stuff is more marketing than a real-world issue, because the games I have seen look only marginally better with RT than with rasterization.
 
  • Like
Reactions: TesseractOrion

salgado18

Distinguished
Feb 12, 2007
977
434
19,370
I wonder how much of a difference RT really makes nowadays. Is there a substantial body of (AAA) games that rely on RT, and do they indeed look considerably worse with fewer or lesser RT effects?
I think all this RT stuff is more marketing than a real-world issue, because the games I have seen look only marginally better with RT than with rasterization.
It is mostly marketing, but marketing shapes public perception and public perception drives demand. AMD lost the GPU battle because of RT and DLSS, even with better raster perf/price. Consumers view Radeons as inferior because GeForces have the latest tech and Radeons don't.

Doubling RT is good, but Nvidia already doubled a couple of times, so to be really competitive they should at least triple RT performance. Also add some AI power to it, and develop FSR 4 to take advantage of all of it.

It doesn't look good, but I trust AMD's ability to at least stay in the game.
 

YSCCC

Commendable
Dec 10, 2022
565
459
1,260
I wonder how much of a difference RT really makes nowadays. Is there a substantial body of (AAA) games that rely on RT, and do they indeed look considerably worse with fewer or lesser RT effects?
I think all this RT stuff is more marketing than a real-world issue, because the games I have seen look only marginally better with RT than with rasterization.
I was tempted by the XTX too when it released, but TBF it is faster than the 4080 in raster yet much slower in RT, and you lose CUDA support for things like PS or video en/transcoding. The price difference was $200, but $200 against $1,000+ isn't as significant as in lower-tier products, which makes it much less appealing when games traditionally optimise more for Nvidia.

Somehow I have a feeling this is more of a consumer-perception issue: with more and more games supporting RT for the "best effects", paying upwards of $700 for unusable performance with all the latest features enabled is a hard trigger to pull for a lot of people, so the cards aren't selling well.
 

ottonis

Reputable
Jun 10, 2020
220
190
4,760
It is mostly marketing, but marketing shapes public perception and public perception drives demand. AMD lost the GPU battle because of RT and DLSS, even with better raster perf/price. Consumers view Radeons as inferior because GeForces have the latest tech and Radeons don't.

Doubling RT is good, but Nvidia already doubled a couple of times, so to be really competitive they should at least triple RT performance. Also add some AI power to it, and develop FSR 4 to take advantage of all of it.

It doesn't look good, but I trust AMD's ability to at least stay in the game.

I agree. Moreover, I think AMD made a huge mistake by not capitalizing on their relative lack of RT performance: they could have sold this as a "feature" that lets them undercut Nvidia GPU prices by a lot. Just imagine how many more units AMD could have sold if they offered their cards for, say, 30% less. They would get lower margins, but they would also have sold many times more GPUs, thus getting much more consideration from developers, who would then optimize games for AMD GPUs.
I think this is exactly the direction and goal the new AMD graphics chief has announced: go cheaper and win market share and developers.
AMD had better make some hefty price cuts to their existing cards before the holiday season.
 
  • Like
Reactions: P.Amini and YSCCC

YSCCC

Commendable
Dec 10, 2022
565
459
1,260
I agree. Moreover, I think AMD made a huge mistake by not capitalizing on their relative lack of RT performance: they could have sold this as a "feature" that lets them undercut Nvidia GPU prices by a lot. Just imagine how many more units AMD could have sold if they offered their cards for, say, 30% less. They would get lower margins, but they would also have sold many times more GPUs, thus getting much more consideration from developers, who would then optimize games for AMD GPUs.
I think this is exactly the direction and goal the new AMD graphics chief has announced: go cheaper and win market share and developers.
AMD had better make some hefty price cuts to their existing cards before the holiday season.
Completely agree with this.

And somehow it's a price-range issue. Nvidia is obviously going the premium, luxury-brand route and charging a lot of brand tax on their products, as they are the only one offering true halo products, much like Ferrari and McLaren do. A Nissan GT-R can sell because it's faster than the Ferrari on track, but it has to undercut the price considerably, or else people will just pay the slightly higher brand tax.
 
  • Like
Reactions: ottonis

ottonis

Reputable
Jun 10, 2020
220
190
4,760
I was tempted by the XTX too when it released, but TBF it is faster than the 4080 in raster yet much slower in RT, and you lose CUDA support for things like PS or video en/transcoding. The price difference was $200, but $200 against $1,000+ isn't as significant as in lower-tier products, which makes it much less appealing when games traditionally optimise more for Nvidia.

Somehow I have a feeling this is more of a consumer-perception issue: with more and more games supporting RT for the "best effects", paying upwards of $700 for unusable performance with all the latest features enabled is a hard trigger to pull for a lot of people, so the cards aren't selling well.
Good points. Nvidia has a tremendous advantage with CUDA support from developers in the video and graphics community. AMD could immediately make a name for themselves in that market if they added a few more potent hardware encoder/decoder accelerators for modern codecs such as H.264/H.265, AV1 and whatnot, and if they managed to improve the size and quality of GPU-encoded video. While quality is on par, GPU encoders deliver much larger file sizes than CPU encoding, so there is plenty of room for improvement.
The entire vlogging market (which is huge) would fall to its knees before AMD and beg for their GPUs if AMD were to make such an offer with massively beefed-up hardware encoding/decoding acceleration.
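If anyone wants to sanity-check that file-size gap themselves, a minimal sketch along these lines would do it, assuming an ffmpeg build with AMD's AMF encoders available (the input.mp4 clip and the exact settings here are just placeholders, not a proper benchmark):

```python
#!/usr/bin/env python3
"""Rough sketch: compare output sizes of a CPU (libx264) vs. AMD GPU (h264_amf)
H.264 encode of the same clip. Assumes an ffmpeg build with AMF support and a
local test file named input.mp4 -- both are placeholders."""
import os
import subprocess

SOURCE = "input.mp4"  # hypothetical test clip

jobs = {
    "cpu_x264.mp4": ["-c:v", "libx264", "-crf", "23", "-preset", "medium"],
    "gpu_amf.mp4": ["-c:v", "h264_amf"],  # encoder defaults
}

for out_name, codec_args in jobs.items():
    # -an drops audio so only the video stream affects the size comparison
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *codec_args, "-an", out_name],
                   check=True)

for out_name in jobs:
    print(f"{out_name}: {os.path.getsize(out_name) / 1e6:.1f} MB")

# File size alone isn't the whole story -- you'd also want a quality metric
# such as VMAF to back up the "quality is on par" half of the comparison.
```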
 

bit_user

Titan
Ambassador
I wonder how much of a difference RT really makes nowadays. Is there a substantial body of (AAA) games that rely on RT, and do they indeed look considerably worse with fewer or lesser RT effects?
Perhaps it's more that GPU buyers are looking for hardware that will also run tomorrow's games well. You'd probably want some degree of future-proofing, especially if buying a more expensive model. Given that Radeon's main selling point is value, value-conscious buyers won't be the sort who upgrade their GPU every year or two, which further supports the idea they're looking for a product with some longevity.

I think all this RT stuff is more marketing than a real-world issue, because the games I have seen look only marginally better with RT than with rasterization.
What if game developers start putting a lot less effort into the visual quality of the non-RT mode? Then, the benefit of RT will start to go way up.
 
  • Like
Reactions: P.Amini

bit_user

Titan
Ambassador
AMD could immediately make a name for themselves in that market if they added a few more potent hardware encoder/decoder accelerators for modern codecs such as H.264/H.265, AV1 and whatnot, and if they managed to improve the size and quality of GPU-encoded video.
LOL, H.264 is 20 years old! How does it count as "modern"?

When Jarred last tested it, it appeared that AMD had never gone back and improved their H.264 implementation, but their H.265 and AV1 definitely seemed competitive on both quality and performance.
 
  • Like
Reactions: TesseractOrion

ezst036

Honorable
Oct 5, 2018
750
627
12,420
You're right, the Navi 31 GCD *is* kind of small

Yeah.

If AMD had positioned the RX 7700 as something like a 7400, the 7800s as 7500s, the 7900s as 7600s, and the 7900 XTX as the RX 7700, that lineup would definitely be competitive with Nvidia.

AMD would need something entirely bigger to be the 7800, and of course bigger still to be the 7900. A 7900 with 40% more transistors would definitely be competitive with an RTX 4090.

The Radeon 9700 followed this model. https://www.tomshardware.com/features/best-amd-gpus-of-all-time

AMD seems to have introduced the "small-die strategy" sometime around the HD 5000s, and it is not working. They need more cores, more shaders.
 

bit_user

Titan
Ambassador
AMD seems to have introduced the "small-die strategy" sometime around the HD 5000s, and it is not working. They need more cores, more shaders.
** sigh **

This isn't the first time I've seen people spouting such rubbish, so I just wasted an hour of my life to make some nice plots to help you see the light.

First, let's look at the specific claim of die sizes:

[Chart: AMD vs. Nvidia flagship GPU die sizes by generation]


You can clearly see several points where AMD's die is either similar to or bigger than Nvidia's:
  • HD 7970
  • R9 Fury X
  • RX Vega

Of course, die sizes are only comparable at the points when AMD and Nvidia are on roughly the same node. Starting with Radeon VII, there was some significant divergence. So, now let's look at actual transistor counts:

[Chart: AMD vs. Nvidia flagship GPU transistor counts by generation]


Here, we can see that:
  • HD 7970 had significantly more transistors than the GTX 680
  • R9 Fury X had 11% more transistors than the GTX 980 Ti
  • RX Vega had a few more transistors than the GTX 1080 Ti
  • Radeon VII still had 71% as many transistors as RTX 2080 Ti
  • RX 6950 XT had 94.7% as many transistors as the RTX 3090
  • RX 7900 XTX has 75.6% as many transistors as the RTX 4090

So, it really wasn't until the RTX 2080 Ti, in 2018, that Nvidia started pulling ahead on transistor count. I'm sure it's no coincidence that it was also the first to feature DLSS and ray tracing. Even so, AMD nearly matched the transistor count of the RTX 3090 with the RX 6950 XT.
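If anyone wants to double-check those percentages, here's a quick sketch using the commonly cited transistor counts (in billions, rounded; the Navi 31 figure includes the MCDs):

```python
# Quick check of the ratios above, using commonly cited transistor counts
# (in billions) for each flagship pairing.
pairs = [
    ("HD 7970 (Tahiti)",      4.31, "GTX 680 (GK104)",     3.54),
    ("R9 Fury X (Fiji)",      8.9,  "GTX 980 Ti (GM200)",  8.0),
    ("RX Vega 64 (Vega 10)",  12.5, "GTX 1080 Ti (GP102)", 11.8),
    ("Radeon VII (Vega 20)",  13.2, "RTX 2080 Ti (TU102)", 18.6),
    ("RX 6950 XT (Navi 21)",  26.8, "RTX 3090 (GA102)",    28.3),
    ("RX 7900 XTX (Navi 31)", 57.7, "RTX 4090 (AD102)",    76.3),
]

for amd, amd_bn, nv, nv_bn in pairs:
    # A ratio above 100% means the AMD part has more transistors than its rival
    print(f"{amd}: {amd_bn / nv_bn:.1%} of the {nv}")
```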

Sources:
 

subspruce

Proper
Jan 14, 2024
133
33
110
You can't compare just the GCD, though. Nvidia has all the same stuff that's on the MCDs, except it's just integrated into their monolithic die. The only fair way to compare is by adding all the transistors of the GCD + MCDs vs. the transistors of the AD102 die.
well like 30-40% of AD102 is cache
 

bit_user

Titan
Ambassador
well like 30-40% of AD102 is cache
It has 72 MB of L2 cache, which is comparable to Navi 31's 96 MB of Infinity Cache.

It's notable that the RTX 3090 had only 6 MB of L2 cache! Going big on cache is the main thing that enabled the RTX 4000 series to get more performance from narrower memory interfaces, which a lot of people fail to appreciate. They just see the RTX 3070 having a 256-bit memory datapath and the RTX 4070 having just 192-bit, and it feels to them like they're being cheated. They miss how much impact the 9-12x increase in cache is having. Cache costs $$$, which is why AMD had the idea to put most of it on N6 dies.
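To put rough numbers on the bus-width point, raw DRAM bandwidth is just bus width times per-pin data rate; a minimal sketch with the reference memory specs (and the L2 sizes as I understand them) looks like this:

```python
# Raw DRAM bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

cards = {
    "RTX 3070 (256-bit GDDR6 @ 14 Gbps, 4 MB L2)":   (256, 14.0),
    "RTX 4070 (192-bit GDDR6X @ 21 Gbps, 36 MB L2)": (192, 21.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s raw")

# The narrower bus still ends up with slightly more raw bandwidth thanks to the
# faster GDDR6X, and the much larger L2 means the GPU goes out to DRAM far less
# often in the first place.
```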
 
  • Like
Reactions: TesseractOrion

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
I wonder how much of a difference RT really makes nowadays. Is there a substantial body of (AAA) games that rely on RT, and do they indeed look considerably worse with fewer or lesser RT effects?
I think all this RT stuff is more marketing than a real-world issue, because the games I have seen look only marginally better with RT than with rasterization.
Yes, RT is hugely impactful and makes a lot of difference. For example, see the screenshots in the spoiler.

[Two in-game comparison screenshots attached as spoilers]
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
** sigh **

This isn't the first time I've seen people spouting such rubbish, so I just wasted an hour of my life to make some nice plots to help you see the light.

First, let's look at the specific claim of die sizes:
[Chart: AMD vs. Nvidia flagship GPU die sizes by generation]

You can clearly see several points where AMD's die is either similar to or bigger than Nvidia's:
  • HD 7970
  • R9 Fury X
  • RX Vega

Of course, die sizes are only comparable at the points when AMD and Nvidia are on roughly the same node. Starting with Radeon VII, there was some significant divergence. So, now let's look at actual transistor counts:
[Chart: AMD vs. Nvidia flagship GPU transistor counts by generation]

Here, we can see that:
  • HD 7970 had significantly more transistors than the GTX 680
  • R9 Fury X had 11% more transistors than the GTX 980 Ti
  • RX Vega had a few more transistors than the GTX 1080 Ti
  • Radeon VII still had 71% as many transistors as RTX 2080 Ti
  • RX 6950 XT had 94.7% as many transistors as the RTX 3090
  • RX 7900 XTX has 75.6% as many transistors as the RTX 4090

So, it really wasn't until the RTX 2080 Ti, in 2018, that Nvidia started pulling ahead on transistor count. I'm sure it's no coincidence that it was also the first to feature DLSS and ray tracing. Even so, AMD nearly matched the transistor count of the RTX 3090 with the RX 6950 XT.

Sources:
And here I thought for a moment you didn't care about transistor counts. Oh lord.