News: AMD outsells Intel in the datacenter for the first time in Q4 2024

AMD > Intel

They won't ever beat Nvidia the way things are stacked already, but they could do much better if they stopped playing catch-up. Just give gamers a great rasterization card at a great price and we're good.
"They won't ever beat Intel" is a phrase that was used a lot 10 years ago, and look where we are. Not saying they can pull a Ryzen trick on GPUs, since these are more evolutionary than revolutionary, and Nvidia is in a very good shape unlike quad-core-max-Intel, but one mistake from green team and one hit from red team could sure turn the game around.
 
"They won't ever beat Intel" is a phrase that was used a lot 10 years ago, and look where we are. Not saying they can pull a Ryzen trick on GPUs, since these are more evolutionary than revolutionary, and Nvidia is in a very good shape unlike quad-core-max-Intel, but one mistake from green team and one hit from red team could sure turn the game around.
Actually, even now some are still claiming that Arrow Lake is the best available and that AMD is only second best, and their evidence is that datacenters and corporate buyers still lean heavily toward Intel chips. Now I'm waiting for the "fake news" crowd to chime in.
 
Just give gamers a great rasterization card at a great price and we're good.
In the past, sure, but now? That's likely not going to work.
The Indiana Jones game is showing that devs are moving toward RT as a base requirement going forward, and raster alone is no longer going to be good enough.
but one mistake from the green team and one hit from the red team could certainly turn the game around.
And what exactly would a mistake be?
AMD gave up the high-end market.
Even a "mistake" likely wouldn't do much at this point.

AMD's only real advantage is that they don't skimp on VRAM like Nvidia does.


On the CPU side AMD is effectively untouched, as Intel is the one playing catch-up (though Nvidia might enter that field, as they are supposedly working on ARM CPUs).
 
AMD > Intel

They won't ever beat Nvidia the way things are stacked already, but they could do much better if they stopped playing catch-up. Just give gamers a great rasterization card at a great price and we're good.
There are many gamers out there (including me!) who do NOT care about ray tracing and other new fancy crap like fake frames and other AI stuff. Make cards for those gamers that won't cost an arm and a leg. There's a reasonable, maybe even HUGE market out there for them!
 
To me the real shocker is AMD's gross margins beating out Intel's.
Intel is still cutting labor and restructuring; they've always been a much larger company than AMD (although the gap is the smallest it's ever been), so when revenues are similar between the two, you can bet AMD will have better margins on that revenue. Next fiscal year should show better margins for Intel, assuming they can stop the DCAI hemorrhaging.
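
(Quick aside on the metric, in case it helps: gross margin is just (revenue − cost of revenue) / revenue, so a company with a leaner cost base can post a higher margin even on similar revenue. A minimal sketch in Python with made-up numbers, not AMD's or Intel's actual figures:)

# A minimal sketch of the gross-margin arithmetic -- all numbers here are made up
# for illustration, not actual AMD or Intel figures.
def gross_margin(revenue, cost_of_revenue):
    """Gross margin as a fraction of revenue."""
    return (revenue - cost_of_revenue) / revenue

print(f"Leaner company:  {gross_margin(7.0, 3.5):.0%}")  # similar revenue, lower cost base -> 50%
print(f"Heavier company: {gross_margin(7.5, 4.8):.0%}")  # similar revenue, bigger cost base -> 36%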

I'm still surprised that they were able to turn a [small] profit on the much lower gaming GPU sales. It's also nice to see the gains in client, even if they're not entirely impressive.

As for MI300... Nvidia has long won the hearts and minds of every kind of GPU buyer, whether AI, general datacenter, enterprise & professional, or gamer, so... yup, this will continue to be an uphill battle for the foreseeable future. Now, I'd like to think MI325 and MI355 will help gain a bit of traction, but we'll have to see.
 
Nvidia had a horrible 5080/5090 launch, and guess what? All 7900 XTX cards are sold out except scalped/overpriced models. Is AMD even making them anymore? It's unlikely the 9070 XT can compete, given that AMD has already admitted it'll be slower than their flagship.
 
There are many gamers out there (including me!) who do NOT care about ray tracing and other new fancy crap like fake frames and other AI stuff. Make cards for those gamers that won't cost an arm and a leg. There's a reasonable, maybe even HUGE market out there for them.
I get where you're coming from, but that's basically what AMD tried with the RDNA2 cards, and they didn't really gain any ground at all against Nvidia. And game devs are pushing more and more mandatory ray tracing into their games (and also expect the next generation of consoles to want more ray tracing as well). AMD's future cards are being unified with the high-end data center stuff again (UDNA is what they're calling the next architecture), so they will have the compute resources to do the AI stuff as well.

I do think AMD's best chance of becoming competitive in high-end GPUs again is getting the chiplet approach working better; RDNA3 was their first attempt at that, but it ran into issues. AMD is pushing hard on chiplets and interconnects with their high-end datacenter stuff, and is ahead of Nvidia in that regard; if they can effectively glue together compute dies, and not just cache dies like RDNA3 did, then they could make something with a lot more die area for less cost than the large monolithic chips Nvidia uses in their high-end cards.
 
There are many gamers out there (including me!) who do NOT care about ray tracing and other new fancy crap like fake frames and other AI stuff. Make cards for those gamers that won't cost an arm and a leg. There's a reasonable, maybe even HUGE market out there for them!
"Graphics are good enough, stop developing new functions!" has not once worked in the half-century history of real-time graphics, and it's not going to suddenly change now.
 
Isn't that a completely different segment?
AMD does not sell network devices.
AMD's data center segment according to AMD:
The Data Center segment primarily includes server microprocessors (CPUs), graphics processing units (GPUs), accelerated processing units (APUs), data processing units (DPUs), Field Programmable Gate Arrays (FPGAs), Smart Network Interface Cards (SmartNICs), Artificial Intelligence (AI) accelerators and Adaptive System-on-Chip (SoC) products for data centers.
 
"Graphics are good enough, stop developing new functions!" has not once worked in the half-century history of real-time graphics, and it's not going to suddenly change now.
Maybe, but he isn't wrong. Quite a few people I work with and know don't care about RT either, at least not yet. What they really don't care about is DLSS and frame gen; those are just crutches Nvidia is using to make up for the performance they're not giving us by gimping their cards. When RT can be used usefully on a card that isn't an xx90 and costs $400 or less... then it might be worth it.

Why do you think some people have been saying that, since the 40 series, the naming of their cards has gone up a tier, while the price matches that tier but the specs, and even the performance, seem to be a tier lower?
 
Maybe, but he isn't wrong. Quite a few people I work with and know don't care about RT either, at least not yet. What they really don't care about is DLSS and frame gen; those are just crutches Nvidia is using to make up for the performance they're not giving us by gimping their cards. When RT can be used usefully on a card that isn't an xx90 and costs $400 or less... then it might be worth it.

Why do you think some people have been saying that, since the 40 series, the naming of their cards has gone up a tier, while the price matches that tier but the specs, and even the performance, seem to be a tier lower?
As the game industry continues to transition to ray tracing, it won't matter whether you care or not. If you want to play current releases, you're going to have to have a GPU with solid ray tracing performance.
 
As the game industry continues to transition to ray tracing, it won't matter whether you care or not.
And if you don't play games with RT? Hence why those I work with and know don't care about it. But you also seem to gloss over the "right now" part that I also said...

If you want to play current releases, you're going to have to have a GPU with solid ray tracing performance.
And that still seems to mean an xx90-series card, and maybe an xx80 series... and the xx90 cards are way overpriced, and anything else isn't worth the money.

Good thing that, other than... what was it, Indiana Jones?... games have the option to turn that off.
 
And if you don't play games with RT? Hence why those I work with and know don't care about it. But you also seem to gloss over the "right now" part that I also said...


And that still seems to mean an xx90-series card, and maybe an xx80 series... and the xx90 cards are way overpriced, and anything else isn't worth the money.

Good thing that, other than... what was it, Indiana Jones?... games have the option to turn that off.
It's going to matter for any GPU that someone buys today unless you're among the select few that replaces their GPU every year or two. 3, 4, 5 years down the line, ray tracing is going to become the standard and games like Indiana Jones will be the norm, not the exception.
 
And what exactly would a mistake be?
AMD gave up the high-end market.
Even a "mistake" likely wouldn't do much at this point.

AMD's only real advantage is that they don't skimp on VRAM like Nvidia does.


On the CPU side AMD is effectively untouched, as Intel is the one playing catch-up (though Nvidia might enter that field, as they are supposedly working on ARM CPUs).
They gave up big GPUs, but not top tier technology.

A mistake would be, basically, to rest on their laurels: spending a lot of time to launch a product that isn't better than the last one (like Intel's many quad-cores, the FX processors, the RX 7000 series, and the 5080).

And a hit would be a product that catches up to, or goes beyond, the competition: suddenly launching a product that makes the other one feel inferior (like Ryzen did).

So, since Nvidia didn't improve much this generation, if AMD does (vastly) improve RT performance and gets into AI acceleration, all it will need is the right price to sell like hotcakes.
 
"They won't ever beat Intel" is a phrase that was used a lot 10 years ago, and look where we are. Not saying they can pull a Ryzen trick on GPUs, since these are more evolutionary than revolutionary, and Nvidia is in a very good shape unlike quad-core-max-Intel, but one mistake from green team and one hit from red team could sure turn the game around.
10 years ago, the idea of AMD beating Intel in the CPU space seemed far-fetched, yes. While the GPU market is different, being more iterative and dependent on architectural efficiency, driver optimization, and software ecosystems, it's still vulnerable.

NVIDIA's dominance comes from its tight integration of CUDA, RT cores, and AI accelerators. However, AMD has already proven it can innovate with RDNA and its chiplet approach in CPUs. There was talk of AMD doing the same with GPUs in 2022, but I haven't heard anything else about it. Maybe someone else has and can chime in.

If AMD manages to scale its Infinity Cache design, improve ray tracing performance, and offer a DLSS competitor that’s more widely adopted, one strategic misstep from NVIDIA could give AMD the chance to significantly close the gap or even leap ahead in certain segments. It won’t be easy, but the Ryzen precedent shows it’s not impossible.

The problem is getting gamers to switch. Some just won't break free from Nvidia at any cost. History has taught us that, as AMD/ATi did have some better options in previous years.
 
It won’t be easy, but the Ryzen precedent shows it’s not impossible.
The Ryzen precedent required Intel to fall on its face and get stuck on the same node for 6 years. That obviously can't happen to Nvidia. In the unlikely scenario that it did, AMD would be stuck in the same boat, since they both use TSMC, and no advantage would be gained. If anything, Nvidia is in a better position, since they have used Samsung as well recently and don't depend on just one producer like AMD does.
 
The Ryzen precedent required Intel to fall on its face and get stuck on the same node for 6 years. That obviously can't happen to Nvidia. In the unlikely scenario that it did, AMD would be stuck in the same boat, since they both use TSMC, and no advantage would be gained. If anything, Nvidia is in a better position, since they have used Samsung as well recently and don't depend on just one producer like AMD does.
For now my gut feeling is that all AMD needs to do is offer real value: make something sensible yet still fast enough. Where Nvidia requires you to get a $1000+ MSRP card (which is never attainable at MSRP) to play at 4K, AMD needs to do it at $600-700, which is a big psychological gate to cross. There's no point in being just $200 cheaper, since the small % savings and Nvidia's market dominance will always push people to NVIDIA when the price gap is narrow.
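
(To put rough numbers on that psychological gate, here's a tiny sketch of the percentage argument; the prices are hypothetical, not real or announced MSRPs:)

# Hypothetical prices to illustrate the point above -- not real or announced MSRPs.
nvidia_price = 1000
for amd_price in (800, 650):  # "$200 cheaper" vs. the $600-700 range suggested above
    savings = (nvidia_price - amd_price) / nvidia_price
    print(f"AMD at ${amd_price}: {savings:.0%} cheaper than a ${nvidia_price} Nvidia card")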