News CEO Lisa Su says AMD is a data center-first company — DC revenue topped $2.8 billion last quarter, over 4X higher than its gaming business sales

usertests

Distinguished
Mar 8, 2013
928
839
19,760
That explains why Ryzen 9000 has poor gaming performance. The chiplets are optimized for DC workloads, not for desktops.
Tom's Hardware and Phoronix seemingly got good results, so I think it's just unoptimized on Windows. But even in the best-case scenarios, the IPC boost is modest, AVX-512 isn't relevant to most gaming (maybe emulators), and clock speeds have hit a brick wall.

There were some other oddities like the cross-CCD latency being abnormally high compared to Zen 4.
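
For anyone curious, cross-CCD latency usually gets exposed with a simple core-to-core ping-pong microbenchmark. Here's a minimal Linux sketch of the idea; the core IDs are placeholders, so you'd pick two cores you know sit on different CCDs for your particular chip:

```c
// Minimal core-to-core latency ping-pong (Linux, build with: gcc -O2 -pthread).
// Two threads bounce a cache line back and forth; the average round-trip time
// is noticeably higher when the cores sit on different CCDs.
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdatomic.h>
#include <stdio.h>
#include <time.h>

#define ITERS 1000000

static atomic_int flag = 0;  // 0: ping's turn, 1: pong's turn

static void pin_to_core(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

static void *pong(void *arg) {
    pin_to_core(*(int *)arg);
    for (int i = 0; i < ITERS; i++) {
        while (atomic_load_explicit(&flag, memory_order_acquire) != 1) ;
        atomic_store_explicit(&flag, 0, memory_order_release);
    }
    return NULL;
}

int main(void) {
    int ping_core = 0, pong_core = 8;  // placeholders: pick cores on different CCDs
    pthread_t t;
    pthread_create(&t, NULL, pong, &pong_core);
    pin_to_core(ping_core);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < ITERS; i++) {
        atomic_store_explicit(&flag, 1, memory_order_release);
        while (atomic_load_explicit(&flag, memory_order_acquire) != 0) ;
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    pthread_join(t, NULL);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    printf("avg round-trip: %.1f ns\n", ns / ITERS);  // one round trip per iteration
    return 0;
}
```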
 
AMD did a pretty serious rework with Zen 5, so it's not surprising that the gains aren't universal. In the past, these kinds of changes have been coupled with clock-speed increases, which isn't the case here. The biggest noticeable change, I'd say, is actually in full-load efficiency.

With so much of their revenue coming from the datacenter, the biggest thing consumers will notice is likely availability. I'd imagine this has also, to some degree, driven their pivot on consumer graphics, since you can get a lot more low-to-mid-range dies per wafer.
 
  • Like
Reactions: bit_user

waltc3

Honorable
Aug 4, 2019
453
251
11,060
Slow progress in graphics...? I guess you mean that AMD isn't selling $2k GPUs?...;) My 6900XT is a keeper. It runs great. It's a beautiful card @ 4k. nVidia is also making more in its Data-Center/AI sales than its gaming sales. Does that mean nVidia also has slow progress in graphics?...;)
 
  • Like
Reactions: bit_user

DS426

Upstanding
May 15, 2024
254
189
360
Remember that AMD was a purebred CPU company up until the ATI acquisition in 2006. It didn't take long after that for them to be writing the masterclass on integrated graphics. That's why everyone is going to be mind-blown if MI300A really doesn't shine, since in theory it should be their most glowing product.

"Slow progress in graphics" but GPU sales have significantly accelerated with AMD expecting to make over $1B/yr just in DC GPU revenue. That's probably ten-fold up from what they were doing just five or six years ago.
 
  • Like
Reactions: bit_user
Don't lose sight of who got you where you are today, Lisa. When the AI bubble plateaus, you'll want those DIY builders and gamers back.
I was thinking along those lines, but then I realized that's false.

Otherwise they wouldn't be the laughing stock, market-share-wise, on the consumer side on both fronts, where they've been struggling, even now, to take share from Intel.

So, no, AMD (or Lisa) owes the consumer market diddly squat.

Regards.
 

Kondamin

Proper
Jun 12, 2024
114
73
160
Does that mean nVidia also has slow progress in graphics?...;)
Well yes, if they cared about gaming graphics they wouldn't be shoving stuff like DLSS and other inference garbage down our throats, and would instead give us more hardware that renders the actual graphics.
 
  • Like
Reactions: Peksha
Well yes, if they cared about gaming graphics they wouldn't be shoving stuff like DLSS and other inference garbage down our throats, and would instead give us more hardware that renders the actual graphics.
Not entirely true.

DLSS itself isn't a bad idea, really.
The issue is how it's done and the harm that came from how it was released.

DLSS does make games playable on systems that couldn't normally run them at decent quality/framerate. That's the ideal case and how it should have been.

Now, how is it used? By devs as an excuse to skip optimizing their games, since they can just say "use this," and we get games that look meh while having crazy high requirements (e.g. Starfield).

Pure raster does have limits, and even if you don't hit them, it's smart to try and find ways to improve performance alongside it.

And this is my view as someone who doesn't use DLSS/FSR.
 
I was thinking along those lines, but then I realized that's false.

Otherwise they wouldn't be the laughing stock, market-share-wise, on the consumer side on both fronts, where they've been struggling, even now, to take share from Intel.

So, no, AMD (or Lisa) owes the consumer market diddly squat.

Regards.
AMD only struggles, like other CPU makers (Apple, anyone?), to take market share from Intel because Intel is a much larger company that has had a foothold in most markets for a long time. And Intel marketed the crap out of every product, even when AMD's offerings were better. Most government-contract institutions buy Intel-based machines out of habit, IMO, and, well, stability at large scale is a thing too, at least until very recently. Intel has mindshare, as does MSFT. That said, when the AI bubble does plateau and everyone is still running Intel (and Nvidia) in their data centers, where will AMD turn for income? I don't know either. But then again, I don't use their products anymore either. And yes, over the years, I have given AMD (and ATI) many chances, but after all those years, here I am, like most people, on an Intel/Nvidia platform.
 

bit_user

Titan
Ambassador
That explains why Ryzen 9000 has poor gaming performance. The chiplets are optimized for DC workloads, not for desktops.
I agree with this. If you look at the relative size and complexity of Zen vs. Intel cores, the Zen cores are smaller and architected not to clock as high. That suggests AMD was looking to achieve a better compromise for mobile and especially server.

The downside is that AMD has generally lagged Intel in single-thread performance, except for brief periods: between the launches of Ryzen 5000 and Alder Lake, then between Ryzen 7000 and Raptor Lake, and finally between Ryzen 9000 and Arrow Lake. So, basically, the single-thread performance of Zen cores lags Intel's by about one generation.

On the other hand, the 16-core Zen models are typically able to counter Intel on multi-threaded performance, in spite of Intel's superior hybrid strategy.

For AMD, it's tricky and risky to relinquish too much ground on the client market, because client is what drives mindshare among the public, investors, and many of the same geeks who are involved in decisions about what server hardware to purchase or which types of VM instances to rent.

AMD must continue to be thoughtful and clever about how it competes with Intel and others. Chiplets and 3D Cache were great examples, but AMD needs to keep them coming. The hybrid move was a daring and strong counter by Intel, as are things like their sideband (L4) cache. AMD might not be competitive at all price points, but I think they really can't afford to turn their backs on any of the major markets where they play.
 
  • Like
Reactions: thestryker

bit_user

Titan
Ambassador
AMD only struggles, like other CPU makers (Apple, anyone?), to take market share from Intel because Intel is a much larger company that has had a foothold in most markets for a long time. And Intel marketed the crap out of every product, even when AMD's offerings were better.
Yeah, but don't sell Intel's engineering short. Golden Cove was truly a good core, and the hybrid strategy made it viable to use such big P-cores. Otherwise, Alder Lake and Raptor Lake would've lagged badly on heavily-multithreaded performance. They'd still have been fine gaming and productivity CPUs, but much less attractive to power users.

What I think really would've happened is that Intel just wouldn't have made Golden Cove so big and complex, had they not been planning on building hybrid CPUs around it. So, we would've gotten CPUs from Intel that were less good at everything.

The main downside of Intel's big P-core strategy is that it killed them in the server market. It held back their core counts and that compounded their power efficiency deficit vs. AMD. Perhaps Intel thought they could use accelerators in a similar way as E-cores, but that hasn't really played out as such.

Most government-contract institutions buy Intel-based machines out of habit, IMO, and, well, stability at large scale is a thing too, at least until very recently. Intel has mindshare, as does MSFT. That said, when the AI bubble does plateau and everyone is still running Intel (and Nvidia) in their data centers, where will AMD turn for income?
AMD's datacenter marketshare has been growing at a brisk pace. They're rapidly approaching 30%. If you ask around, you're probably about as likely to find people whose default choice is now AMD's EPYC as you are to find those who prefer Intel's Xeons.

We saw this put to the test just about 2 years ago, when datacenters and hyperscalers massively hit the brakes on new server purchases. It hurt Intel a lot worse than it did AMD. And AMD's lead in server CPUs has only grown since then (e.g. neither Bergamo nor Genoa-X existed back then).


 
The AMD fanboys will cry now... "they don't care about us."
Nvidia will eat all of that market from AMD. "We are proud to be a data-center company."
First Ryzen (EPYC in disguise) and now the gaming RDNA line: not good for gaming, not good for enterprise.
 

valthuer

Prominent
Oct 26, 2023
166
166
760
DLSS itself isn't a bad idea, really.
The issue is how it's done and the harm that came from how it was released.

DLSS does make games playable on systems that couldn't normally run them at decent quality/framerate. That's the ideal case and how it should have been.

Now, how is it used? By devs as an excuse to skip optimizing their games, since they can just say "use this," and we get games that look meh while having crazy high requirements (e.g. Starfield).

I totally agree.

DLSS was supposedly for gamers to get that extra bit of performance... Now you have poorly optimized games using it as a crutch... As if to say, "oh, it's OK that this sucks, because DLSS will pick up the slack."
 
  • Like
Reactions: bit_user

jp7189

Distinguished
Feb 21, 2012
510
298
19,260
Slow progress in graphics...? I guess you mean that AMD isn't selling $2k GPUs?...;) My 6900XT is a keeper. It runs great. It's a beautiful card @ 4k. nVidia is also making more in its Data-Center/AI sales than its gaming sales. Does that mean nVidia also has slow progress in graphics?...;)
I believe they are referencing AMD's announcement to give up on high-end cards. Enjoy that 6900; they won't be producing that tier anymore.
 

bit_user

Titan
Ambassador
I believe they are referencing AMD's announcement to give up on high-end cards. Enjoy that 6900; they won't be producing that tier anymore.
He didn't exactly say that. What he said was that they want to focus on competition in the mid-range and below. He still left open the potential for a chiplet-based product to address the high-end.

"Don’t worry. We will have a great strategy for the enthusiasts on the PC side, but we just haven’t disclosed it. We'll be using chiplets, which doesn't impact what I want to do on scale, but it still takes care of enthusiasts"

Source: https://www.tomshardware.com/pc-com...ck-hyunh-talks-new-strategy-for-gaming-market

I'd say be skeptical about AMD's future high-end offerings, but I wouldn't count them out, entirely.
 
  • Like
Reactions: -Fran- and jp7189

jp7189

Distinguished
Feb 21, 2012
510
298
19,260
He didn't exactly say that. What he said was that they want to focus on competition in the mid-range and below. He still left open the potential for a chiplet-based product to address the high-end.
"Don’t worry. We will have a great strategy for the enthusiasts on the PC side, but we just haven’t disclosed it. We'll be using chiplets, which doesn't impact what I want to do on scale, but it still takes care of enthusiasts"​

I'd say be skeptical about AMD's future high-end offerings, but I wouldn't count them out, entirely.
I held out hope that AMD would crack the GPU chiplet problem... not MCDs, but actual GPU compute chiplets. That breakthrough would lead to a mini renaissance for client GPUs, in my opinion.

They've shown it's possible for compute, but not so much for realtime frame rendering.

Anyway, let's say I've stopped holding my breath at this point.
 

bit_user

Titan
Ambassador
I held out hope that AMD would crack the GPU chiplet problem... not MCDs, but actual GPU compute chiplets. That breakthrough would lead to a mini renaissance for client GPUs, in my opinion.
That's what they reportedly tried for RX 8000, by using die-stacking, but it didn't pan out.

They've shown it's possible for compute, but not so much for realtime frame rendering.
Their slide deck for RX 7000 explained why it's a much harder problem for GPUs than CPUs. The data movement is a couple orders of magnitude greater, within a GPU.

Apple's M1 Ultra was the first multi-die GPU that truly presented itself to software as a single GPU. It only had 2 dies and an interconnect bandwidth between them of 2.5 TB/s per direction. That puts the aggregate about equal to the RX 7900 XTX's MCD <-> GCD bandwidth. However, Apple's M1 Ultra had rendering performance a couple tiers below the RTX 3090-level they were aiming for. So, I don't know if you could call it a terribly effective solution. It definitely scaled performance beyond that of an M1 Max. I wish I knew more about just how well it did scale.
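
For the arithmetic behind that comparison (taking the 2.5 TB/s per direction figure at face value, and assuming roughly 5.3 TB/s as AMD's quoted peak for the 7900 XTX's GCD-to-MCD fanout links, which I'm adding from memory):

$$ 2 \times 2.5\,\text{TB/s} = 5\,\text{TB/s aggregate} \approx 5.3\,\text{TB/s} \quad (\text{RX 7900 XTX GCD}\leftrightarrow\text{MCD, peak}) $$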

Anyway, let's say I've stopped holding my breath at this point.
Just because it wasn't feasible with yesterday's technology doesn't mean it can't or won't happen. Chiplet and substrate technology is continually improving. Nvidia went multi-die for the first time with Blackwell. AMD first did this with the MI200 series, and doubled down on chiplets in the MI300 series.

The RX 7000 series was only AMD's first effort at using chiplets for client GPUs. Every time you try something new and difficult, there are things you learn that enable improvements in the next go-around. I think the tech just isn't quite there, yet. That doesn't mean it won't get there.

BTW, I think GPUs are more forgiving of defects than CPUs, which makes chiplets less of a win for them. In fact, the main reason AMD gave for using chiplets in the RX 7000 series was to take advantage of cheaper nodes for IO and SRAM, which don't scale down well to N5 and below. So, it was mainly about trying to offer more performance per $, as opposed to way more performance in the absolute sense. Maybe, if their GCD had been nearly as large as the RTX 4090's die, we'd be singing a different tune about the RX 7900 XTX.
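
On the defect point, the standard Poisson yield model is a quick way to see the economics. The numbers below are made up purely for illustration (real defect densities aren't public), but the shape of the argument holds:

```c
// Illustration only: Poisson yield model, Y = exp(-D * A), with assumed numbers.
// Shows why one big die yields much worse than several small chiplets, and why
// the gap matters less if the big die can tolerate (fuse off) some defects.
#include <math.h>
#include <stdio.h>

int main(void) {
    double defects_per_cm2 = 0.1;   // assumed defect density (illustrative)
    double big_die_cm2     = 6.0;   // ~600 mm^2 monolithic GPU die
    double chiplet_cm2     = 1.5;   // ~150 mm^2 chiplet; four would match the area

    double y_big     = exp(-defects_per_cm2 * big_die_cm2);
    double y_chiplet = exp(-defects_per_cm2 * chiplet_cm2);

    printf("monolithic die yield: %4.1f%%\n", 100.0 * y_big);
    printf("single chiplet yield: %4.1f%%\n", 100.0 * y_chiplet);
    // With known-good-die testing you only package chiplets that work, so cost
    // tracks the per-chiplet yield. A monolithic GPU instead recovers many
    // defective dies by disabling shader units, which is why chiplets are less
    // of a win for GPUs than for CPUs.
    return 0;
}
```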
 
  • Like
Reactions: jp7189