News AMD rakes in cash with best quarterly revenue ever amid datacenter business rise, but gaming business craters

Not a shock, tbh.
AMD's basically got datacenter on lock with EPYC (and datacenter is the most profitable of the segments).

Gaming is also no shock, as consoles are a big chunk and sales drop off heavily after a while. Just keep hoping AMD doesn't pull an Nvidia and only care about AI.
 
Reactions: Jagar123

Eg0

Distinguished
It seems you will never see Lisa do anything that really challenges her cousin. Every time they have something with the potential to impact Nvidia's lead, they pull it.
 

bit_user

Titan
Ambassador
It seems you will never see Lisa do anything that really challenges her cousin.
First of all, too much is made of the family relation; they're actually something like second or third cousins. I think they didn't even know they were related until someone pointed it out to them. Even then, I still don't see why she would hold back from competing with him, especially when he clearly feels no such obligation. If anyone needs "protecting", it's surely not Jensen/Nvidia!

Secondly, RDNA2 actually managed to take a slight lead in raster performance over RTX 3000. Go back and look at the benchmarks of the RX 6950 XT vs. the RTX 3090!

The simple fact is that Nvidia is very good at graphics and a very aggressive competitor. With RDNA, AMD made huge strides in competitiveness at gaming performance, but it's just not easy to beat Nvidia. If you need further evidence, just look at how the mighty Intel has struggled to match even AMD on dGPU performance! It's not that AMD is bad, just that Nvidia is really good and ruthlessly competitive.

Every time they have something with the potential to impact Nvidia's lead, they pull it.
You're talking about their die-stacked RDNA4 GPU? I think they pulled it because it wasn't competitive on perf/$ or perf/W, or maybe there were manufacturing problems.
 

EzzyB

Great
Not a shock, tbh.
AMD's basically got datacenter on lock with EPYC (and datacenter is the most profitable of the segments).

Gaming is also no shock, as consoles are a big chunk and sales drop off heavily after a while. Just keep hoping AMD doesn't pull an Nvidia and only care about AI.
I think it already has. The reason they announced they simply won't compete with Nvidia's high-end gaming cards is that those resources are better spent on datacenter AI.

There are only so many design and manufacturing resources available at that top end, and the AI stuff makes more money.
 
Reactions: prtskg

suryasans

Distinguished
The profit margins on AMD's data center products are higher than on its consumer products. So it's no surprise if AMD's scheduled wafer production at TSMC's leading-edge fabs has been allocated to its data center products.
 
Reactions: prtskg

logainofhades

Titan
Moderator
AMD's terrible marketing is why their gaming side has gone down. Price/performance, unless RT is involved, is often superior to Nvidia's, and AMD is often more generous on VRAM at a given price tier.
 

pug_s

Distinguished
AMD's terrible marketing is why their gaming side has gone down. Price/performance, unless RT is involved, is often superior to Nvidia's, and AMD is often more generous on VRAM at a given price tier.
I would disagree. It's purely because of profit: AMD just decided to put its R&D and other resources into making CPUs. The latest Radeon 7000 GPUs are still stuck on 6nm while its 9000-series CPUs are already on 3nm. Its GPU dies are larger and don't make much profit compared to smaller CPU dies.
 

logainofhades

Titan
Moderator
I would disagree. It's purely because of profit: AMD just decided to put its R&D and other resources into making CPUs. The latest Radeon 7000 GPUs are still stuck on 6nm while its 9000-series CPUs are already on 3nm. Its GPU dies are larger and don't make much profit compared to smaller CPU dies.

Their market share keeps dwindling, despite plenty of stock. That's why I say it's poor marketing.
 
As far as being a profitable tech company is concerned, the datacenter focus shows that letting gaming go hasn't been to their detriment. It will be interesting to see whether AMD really wants to gain in gaming. I suppose we'll find out when RDNA 4 launches, as that should tell us everything.
AMD's terrible marketing is why their gaming side has gone down. Price/performance, unless RT is involved, is often superior to Nvidia's, and AMD is often more generous on VRAM at a given price tier.
While their marketing certainly plays a part, as it's been pretty terrible, when you're not the dominant player, pricing right alongside your competition isn't going to gain you market share. If AMD wants to gain market share in gaming, it will have to be by offering a truly superior choice, and the quickest way to do that is simply price. If the 7000 series had launched at lower prices (e.g., 7600 @ $200, 7900 XT @ $600, and 7900 XTX @ $800), we'd likely be looking at a whole different market. They wouldn't overtake Nvidia, because as @bit_user points out they've been executing almost perfectly, but they'd be gaining.

I don't think AMD has had anything particularly bad the last few generations, if you toss out the crypto-fueled mobile rebrands, but their prices just don't make them worth it (outside of sales). When I got my 3080 12GB, it and the 6800 XT were the same price with the same raster performance, but the 3080 was much further ahead in RT, so AMD wasn't really a logical choice.

Even if AMD had performance and feature parity, Nvidia controls so much of the market that they'd still need a good price advantage to make an impact.
 
Reactions: prtskg and P.Amini

logainofhades

Titan
Moderator
Today's pricing is different, and has been for a while. The 7700 XT 12GB is faster than a 4060 Ti 8GB or 16GB, for a similar price to the 8GB model. The 6800 XT and 7800 XT 16GB are a bit faster than the vanilla 4070 12GB while being cheaper. The 4070 12GB is priced about the same as the 7900 GRE 16GB, which is slightly faster than the more expensive 4070 Super 12GB. The 7900 XT 20GB is priced similarly to a 4070 Ti 12GB and is slightly faster than the more expensive 4070 Ti Super 16GB. The 7900 XTX 24GB is slightly faster than a 4080 Super, and over $100 cheaper. Unless you need an Nvidia-specific feature, AMD GPUs are a better deal right now.
 
Based on TPU 1440p testing:
  • 7700 XT +18% raster, -14% RT compared to 4060 Ti 16GB which costs ~16% more
  • 7800 XT +5.6% raster, -20% RT compared to 4070 which costs ~10% more
  • 7900 GRE +1% raster, -25% RT compared to 4070S which costs ~11% more
  • 7900 XT +9% raster, -17% RT compared to 4070 Ti which costs ~12.5% more
  • 7900 XTX +2.4% raster, -23% RT compared to 4080 Super which costs ~15% more
The only really compelling one is the 7700 XT (though for the 7900 XT the VRAM argument is compelling too), because while they all offer more for less, it's not always significant or universal. You also lose DLSS, which is generally superior to FSR, when choosing AMD. This is also extremely late in the product life cycle, so the generation has long since been decided. It's not that AMD isn't a better deal; they're just not enough of one.
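To put rough numbers on "not enough of one," here's a quick back-of-the-envelope sketch using the raster leads and price gaps listed above (the price premiums are approximate street prices, so treat every input as an assumption):

```python
# Rough raster-per-dollar comparison built from the TPU deltas quoted above.
# Each entry: (AMD raster lead, Nvidia price premium), both as fractions.
# Street prices move constantly, so these inputs are assumptions.
matchups = {
    "7700 XT vs 4060 Ti 16GB": (0.180, 0.160),
    "7800 XT vs 4070":         (0.056, 0.100),
    "7900 GRE vs 4070S":       (0.010, 0.110),
    "7900 XT vs 4070 Ti":      (0.090, 0.125),
    "7900 XTX vs 4080S":       (0.024, 0.150),
}

for name, (raster_lead, price_premium) in matchups.items():
    # AMD is faster AND cheaper, so its raster-per-dollar edge compounds both.
    edge = (1 + raster_lead) * (1 + price_premium) - 1
    print(f"{name}: ~{edge:.0%} more raster per dollar")
```

By that crude measure the 7700 XT's ~37% edge stands out, while the rest land in the ~12-23% range, which the RT deficit and DLSS can plausibly cancel out.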
 

DavidLejdar

Respectable
And the stock price somewhat crashed. Go figure. Were many in it just hoping for revenue numbers like Nvidia's? Ah, well...

I quite share the overall positive outlook. Internet connectivity is still improving all over the world, giving already-connected companies more options, such as working in the cloud, and also increasing the need for data storage. (Something which can grow a lot with efforts toward economic growth in developing countries.)

There is the question of market share, though, when talking about a particular company such as AMD. Specifically, if the market grows tenfold within x years while the company's revenue increases only fivefold over the same period, did the company do well enough?

I mean, as a shareholder, it's not that I'd have an expectation of "Twentyfold!". AMD has likely positioned itself well for some segments, having recently acquired a company with many systems engineers. So it's not as if one could complain, as a shareholder, that AMD has no plan to seize the opportunity of overall market growth (leaving aside the question of how well these segments work for AMD internationally). But when measuring a company's performance mainly by market share, it can show some interesting context. For example, when the market's value halves year-on-year while a company's revenue drops only 25% year-on-year, there's an argument that the company actually performed quite well, considering the circumstances of such a market.
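To make that concrete, the simple arithmetic I have in mind looks like this (purely illustrative numbers, not AMD's actual figures):

```python
# Implied market-share multiplier: how a company's share of a market changes
# when its revenue and the overall market grow at different rates.
# Scenario numbers below are illustrative, not AMD's actual figures.
def share_multiplier(revenue_growth: float, market_growth: float) -> float:
    """Ratio of new market share to old market share."""
    return revenue_growth / market_growth

# Market grows 10x while revenue grows only 5x: share halves.
print(share_multiplier(5.0, 10.0))    # 0.5
# Market halves while revenue drops only 25%: share grows 1.5x.
print(share_multiplier(0.75, 0.5))    # 1.5
```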

This is a simplified version of how one can measure a company's performance, of course. A big shareholder, say one holding 5%, may want to look much deeper into the numbers, such as how much revenue from the acquired company contributed to the figures presented. But I just meant to explain why I, as a shareholder, don't mind AMD's gaming division posting low quarterly revenue just before the likely release of new GPUs.
 

bit_user

Titan
Ambassador
I would disagree. It is purely because of profit and AMD just decided to put its R&D and other resources into making CPU's. The latest 7000 radeon GPU's are still stuck on 6nm while its 9000 cpu's are already on 3nm. Its GPU dies are larger and don't make much profit compared to smaller CPU dies.
First, Radeon RX 7700/7800 and RX 7900 use TSMC N5 for the compute portion and N6 for the cache and memory controller portions. That's because cache and I/O don't scale down well, so it was done to save money, and because there wouldn't really have been much benefit if they'd done those blocks in N5.

Second, RDNA3 began launching back in 2022, the same year as Ryzen 7000. Both used TSMC N5 for the compute dies and N6 for the I/O. So there was no disparity between CPUs and GPUs, contrary to your claims. Granted, the RX 7800 didn't launch until the following year, but that seems mainly because AMD had a big inventory problem to sort out, and that amount of time still wouldn't have been enough to port the RX 7800 to a newer node.

The only one of the new RX 7000 GPUs (i.e., not a simple rebadge) to use TSMC N6 was the 7600, which was also the only monolithic one.

Finally, I don't know where you got the idea that Ryzen 9000 is on TSMC N3, but it's not. No part of it is. All models use N4P for the compute portion, which is an N5-family node. They also reused the same N6-based I/O die as the Ryzen 7000 series.
 

TheHerald

Respectable
BANNED
Based on TPU 1440p testing:
  • 7700 XT +18% raster, -14% RT compared to 4060 Ti 16GB which costs ~16% more
  • 7800 XT +5.6% raster, -20% RT compared to 4070 which costs ~10% more
  • 7900 GRE +1% raster, -25% RT compared to 4070S which costs ~11% more
  • 7900 XT +9% raster, -17% RT compared to 4070 Ti which costs ~12.5% more
  • 7900 XTX +2.4% raster, -23% RT compared to 4080 Super which costs ~15% more
The only really compelling one is the 7700 XT (though for the 7900 XT the VRAM argument is compelling too), because while they all offer more for less, it's not always significant or universal. You also lose DLSS, which is generally superior to FSR, when choosing AMD. This is also extremely late in the product life cycle, so the generation has long since been decided. It's not that AMD isn't a better deal; they're just not enough of one.
Two things I disagree with here. First, the RT performance difference is misleading: Nvidia cards are actually a lot faster at RT than those numbers suggest. You can't gauge RT capability just from game averages unless those games make heavy use of RT. Say that, to draw a frame, the 7700 XT spends 10 ms on raster and 7 ms on RT, 17 ms total, while the 4060 Ti takes 12 ms on raster and 4 ms on RT, 16 ms total. The overall gap looks like only ~6%, but the 4060 Ti does the RT work itself in ~43% less time.
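Spelled out as a quick sketch (the millisecond figures are the hypothetical ones above, not measurements):

```python
# Hypothetical per-frame timings from the example above (not measurements).
amd_raster, amd_rt = 10.0, 7.0   # ms: "7700 XT" raster work and RT work
nv_raster,  nv_rt  = 12.0, 4.0   # ms: "4060 Ti" raster work and RT work

amd_total = amd_raster + amd_rt   # 17 ms (~59 fps)
nv_total  = nv_raster + nv_rt     # 16 ms (~62.5 fps)

# A benchmark only sees the total frame time, so the gap looks small...
print(f"overall frame-time gap: {1 - nv_total / amd_total:.0%}")  # ~6%
# ...but on the RT portion alone, the gap is much bigger.
print(f"RT-only time saved:     {1 - nv_rt / amd_rt:.0%}")        # ~43%
```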

Especially considering a lot of games are very light on RT (basically all the AMD-sponsored ones).

But on the other hand, is RT very useful in those lower tiers? Realistically, a 4060, a 4060 Ti, and a 4070 will all struggle to play anything decent with RT on, so their RT superiority is kind of meaningless. I'd go for a 7800 XT over the 4070, honestly.
 
Two things I disagree with here. First, the RT performance difference is misleading: Nvidia cards are actually a lot faster at RT than those numbers suggest. You can't gauge RT capability just from game averages unless those games make heavy use of RT. Say that, to draw a frame, the 7700 XT spends 10 ms on raster and 7 ms on RT, 17 ms total, while the 4060 Ti takes 12 ms on raster and 4 ms on RT, 16 ms total. The overall gap looks like only ~6%, but the 4060 Ti does the RT work itself in ~43% less time.
FPS is FPS, dude. It doesn't matter that AMD's RT is a lot worse when you're talking absolute performance.
But on the other hand, is RT very useful in those lower tiers? Realistically, a 4060, a 4060 Ti, and a 4070 will all struggle to play anything decent with RT on, so their RT superiority is kind of meaningless.
Depends on the game, as there are a handful that are fine at the 4060 Ti level. Fortunately, several games have separated out their RT features, so you can turn on just what matters for visual quality, which makes it more relevant. To be broader, I'd say it really doesn't start until the 4070 level. The problem with AMD's anemic RT is that while you can get RT benefits in some titles with a 4060 Ti or 4070, you absolutely won't with a 7700 XT or 7800 XT. I wouldn't hold this against the 7700 XT, since usefulness is so limited at this level, but it becomes more of an issue as you go up.
I'd go for a 7800 XT over the 4070, honestly.
They're both a bad deal, really. I bought a 6800 XT over a year ago for $430, and the cheapest 7800 XT is around $450 right now.

The 7700 XT and 7900 XT are probably the best video card buys at current pricing.
 
Reactions: bit_user

TheHerald

Respectable
BANNED
FPS is FPS, dude. It doesn't matter that AMD's RT is a lot worse when you're talking absolute performance.
Of course it does. Averages are worthless when there are huge swings both ways. People who care about RT don't care about RT like the kind in Resident Evil Village and Far Cry 6, which looks worse than raster. There are only a handful of good games where RT transforms the experience; those are heavy, with lots of RT effects, and the Nvidia cards punch way above their weight in them.

Just looking at the averages, the 7800 XT is better at RT than the 4060 Ti, but when it comes to the actual games you want to use it in, the 4060 Ti is faster. Now, I'd argue both are useless, but when you go higher up the stack (4070 Ti vs 7900 XT), the comparison makes sense.
 
Of course it does. Averages are worthless when there are huge swings both ways. People who care about RT don't care about RT like the kind in Resident Evil Village and Far Cry 6, which looks worse than raster. There are only a handful of good games where RT transforms the experience; those are heavy, with lots of RT effects, and the Nvidia cards punch way above their weight in them.
If you're arguing transformational experience, then you're limited to the two path-tracing games, which the 4090 can barely manage, and the RT version of Metro Exodus. There simply aren't many games where RT does anything beyond enhancement.
Just looking at the averages, the 7800 XT is better at RT than the 4060 Ti, but when it comes to the actual games you want to use it in, the 4060 Ti is faster.
This is a very generalized statement which simply isn't universally true. It is for some titles, but isn't for others. The Spider-Man games are good examples of RT-light titles that make good use of the effects they do use, and AMD does fine there. CP2077 is a good example of what you're talking about, where there are a ton of effects that make a difference and kill performance.
 
Reactions: bit_user

TheHerald

Respectable
BANNED
If you're arguing transformational experience, then you're limited to the two path-tracing games, which the 4090 can barely manage, and the RT version of Metro Exodus. There simply aren't many games where RT does anything beyond enhancement.

This is a very generalized statement which simply isn't universally true. It is for some titles, but isn't for others. The Spider-Man games are good examples of RT-light titles that make good use of the effects they do use, and AMD does fine there. CP2077 is a good example of what you're talking about, where there are a ton of effects that make a difference and kill performance.
There are more games, though older ones: Dying Light 2, Control, R&C. But most games (AMD-sponsored, cough cough) usually have only RT shadows, running at a quarter of the native resolution and looking like ass.
 

systemBuilder_49

Distinguished
The simple fact is that Nvidia is very good at graphics and a very aggressive competitor. With RDNA, AMD made huge strides in competitiveness at gaming performance, but it's just not easy to beat Nvidia. If you need further evidence, just look at how the mighty Intel has struggled to match even AMD on dGPU performance! It's not that AMD is bad, just that Nvidia is really good and ruthlessly competitive.
I think the truth is, AMD has NOT attempted to compete with Nvidia in this decade. They always select the previous-generation node for all their graphics chips. While AMD is using 5nm, Nvidia is using 4nm and deriding them for using too much power (well, duh; that's what happens when you aren't using a state-of-the-art VLSI node). They see themselves as a value player that is incapable of truly challenging the 800-pound gorilla, Nvidia. They have never really tried.
 

bit_user

Titan
Ambassador
I think the truth is, AMD has NOT attempted to compete with Nvidia in this decade. They always select the previous-generation node for all their graphics chips. While AMD is using 5nm, Nvidia is using 4nm and deriding them for using too much power
That's not really true. AMD had the first GPUs on TSMC N7. In the meantime, Nvidia was on 12 nm, then Samsung 8 nm. It's only with RTX 4000 that Nvidia got back in the lead on process tech.

Also, TSMC "4N" is a special N5-derivative for Nvidia. I don't know how much better it is than their regular N5 node, but it's definitely not a full node better.

They see themselves as a value player that is incapable of truly challenging the 800-pound gorilla, Nvidia. They have never really tried.
That's not true. The Fury X was neck and neck with the GTX 980 Ti. They thought they could repeat that success with Vega, but it ended up being a pretty big disappointment. In fact, I think Nvidia was genuinely worried, which caused them to price the GTX 1080 Ti lower than they otherwise would've.

Yeah, RDNA first aimed at the mid-range, but RDNA2 was a true contender for flagship status. See my previous point about the RX 6950 XT vs. the RTX 3090. Of course, enabling that feat was the combination of Infinity Cache and a slight node advantage. Once Nvidia got onto a comparable (slightly better) node and added a similar amount of cache, they easily retook the lead.

Whatever you think about the current generations, an honest read of the RX 6000 vs. RTX 3000 matchup should conclude that AMD was both competing for flagship status and (aside from ray tracing or DLSS) arguably achieved it.