AMD's datacenter business unit reports record sales, leaves Intel behind.
AMD outsells Intel in the datacenter for the first time in Q4 2024
Isn't that a completely different segment?...
But only if you exclude the additional $1.6 billion in sales Intel did in Network and Edge, which is a big part of datacenter operations...
AMD made more net money from the datacenter than Intel did, but the article says "outsells", not "outprofits".
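To put rough numbers on the two readings of "datacenter" (a quick sketch only; the ~$1.6B Network and Edge figure comes from the comment above, while the other figures are rounded assumptions, not exact reported numbers):

```python
# Rough Q4 2024 comparison, in billions of USD (rounded assumptions for illustration)
amd_data_center = 3.9   # AMD Data Center segment revenue (approximate assumption)
intel_dcai = 3.4        # Intel Data Center and AI segment revenue (approximate assumption)
intel_nex = 1.6         # Intel Network and Edge revenue (figure cited in the comment above)

print(f"AMD DC vs Intel DCAI alone: {amd_data_center:.1f} vs {intel_dcai:.1f}")
print(f"AMD DC vs Intel DCAI + NEX: {amd_data_center:.1f} vs {intel_dcai + intel_nex:.1f}")
# Whether AMD "outsells" Intel in the datacenter hinges on whether NEX counts as datacenter.
```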
"They won't ever beat Intel" is a phrase that was used a lot 10 years ago, and look where we are. Not saying they can pull a Ryzen trick on GPUs, since these are more evolutionary than revolutionary, and Nvidia is in a very good shape unlike quad-core-max-Intel, but one mistake from green team and one hit from red team could sure turn the game around.AMD > Intel
They won't ever beat Nvidia the way things are stacked already, but they could do much better if they stopped playing catch-up. Just give gamers a great rasterization card for a great price and we are good.
> "They won't ever beat Intel" is a phrase that was used a lot 10 years ago, and look where we are. ...

Actually, even right now some are still claiming the Arrow Lakes are the best available and AMD is only second best, and the evidence is that datacenters and corporate buyers still lean heavily toward Intel chips. Now waiting for the "fake news" claimers to chime in.
> Just give gamers a great rasterization card for a great price and we are good.

In the past, sure, but now? That's likely not going to work.
> ...but one mistake from the green team and one hit from the red team could sure turn the game around.

And what exactly would a mistake be?
> They won't ever beat Nvidia the way things are stacked already, but they could do much better if they stopped playing catch-up. Just give gamers a great rasterization card for a great price and we are good.

There are many gamers out there (including me!) who do NOT care about raytracing and other new fancy crap like fake frames and other AI stuff. Make cards for those gamers that won't cost an arm and a leg. There's a reasonable, maybe even HUGE market out there for them!
> To me the real shocker is AMD's gross margins beating out Intel's.

Intel is still cutting labor and restructuring; they've always been a much larger company than AMD (although the gap is the smallest it's ever been), so when revenues are similar between the two, you can bet AMD will have better margins on that revenue. Next fiscal year should show better margins for Intel, assuming they can stop the DCAI hemorrhaging.
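A minimal sketch of the margin point (the function is just the standard gross-margin formula; the numbers are placeholders roughly in the neighborhood of the quarter being discussed, not exact reported figures):

```python
# Gross margin = (revenue - cost of goods sold) / revenue
def gross_margin(revenue: float, cogs: float) -> float:
    return (revenue - cogs) / revenue

# Placeholder quarterly figures, in billions of USD, purely for illustration
amd = gross_margin(revenue=7.7, cogs=3.8)     # roughly 51%
intel = gross_margin(revenue=14.3, cogs=8.6)  # roughly 40%
print(f"AMD ~{amd:.0%} vs Intel ~{intel:.0%}")
```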
> There are many gamers out there (including me!) who do NOT care about raytracing and other new fancy crap like fake frames and other AI stuff. ...

I get where you are coming from, but that is basically what AMD tried with the RDNA2 cards, and they didn't really gain any ground at all against Nvidia. Game devs are pushing more and more mandatory raytracing into their games (and expect the next generation of consoles to want more raytracing as well). AMD's future cards are being unified with the high-end data center stuff again (UDNA is what they are calling the next architecture), so they will have the compute resources to do the AI stuff as well.
"Graphics are good enough, stop developing new functions!" has not once worked in the half-century history of real-time graphics, and it's not going to suddenly change now.There are many gamers out there (including me!) who do NOT care about Raytracing and other new fancy crap like fake frames and other AI stuff. Make cards for those gamers that won't cost a leg and an arm. There's a reasonable, maybe even HUGE market out there for them!
> Isn't that a completely different segment?
> AMD does not sell network devices.

AMD's data center segment, according to AMD:

"The Data Center segment primarily includes server microprocessors (CPUs), graphics processing units (GPUs), accelerated processing units (APUs), data processing units (DPUs), Field Programmable Gate Arrays (FPGAs), Smart Network Interface Cards (SmartNICs), Artificial Intelligence (AI) accelerators and Adaptive System-on-Chip (SoC) products for data centers."
> "Graphics are good enough, stop developing new functions!" has not once worked in the half-century history of real-time graphics, and it's not going to suddenly change now.

Maybe, but he isn't wrong. Quite a few people I work with and know don't care about RT either, at least not yet. What they really don't care about is DLSS and frame gen; those are just crutches that Nvidia is using to make up for the performance they're not giving us by gimping their cards. When RT can be usefully used on a card that isn't an xx90 card and costs $400 or less... then it might be worth it.
> Maybe, but he isn't wrong. Quite a few people I work with and know don't care about RT either, at least not yet. ...

As the game industry continues to transition to ray tracing, it won't matter whether you care or not. If you want to play current releases, you're going to have to have a GPU with solid ray tracing performance.
Why do you think some have been saying that since the 40 series, the naming of their cards has gone up a tier, while the price matches that tier but the specs, and even the performance, seem to be a tier lower?
> AMD's data center segment, according to AMD:

AMD makes SmartNICs?? What?
> As the game industry continues to transition to ray tracing, it won't matter whether you care or not.

And if you don't play games with RT?? Hence why those I work with and know don't care about it... but you also seem to gloss over the "right now" part that I also said...
> If you want to play current releases, you're going to have to have a GPU with solid ray tracing performance.

And that still seems to be an xx90 series card, and maybe an xx80 series... and the xx90 cards are way too overpriced and expensive, and anything else isn't worth the money.
> AMD makes SmartNICs?? What?

Yes, they bought Xilinx (finalized in 2022), which is where a decent amount of that data center income is coming from.
> And if you don't play games with RT?? Hence why those I work with and know don't care about it... but you also seem to gloss over the "right now" part that I also said...
> And that still seems to be an xx90 series card, and maybe an xx80 series... and the xx90 cards are way too overpriced and expensive, and anything else isn't worth the money.

It's going to matter for any GPU that someone buys today unless you're among the select few who replace their GPU every year or two. Three, four, five years down the line, ray tracing is going to become the standard, and games like Indiana Jones will be the norm, not the exception.
Good thing that, other than what was it... Indiana Jones... games have the option to turn that off.
> And what exactly would a mistake be?

They gave up big GPUs, but not top-tier technology.
AMD gave up the high-end market.
Even a "mistake" likely wouldn't do much at this point.
AMD's only real advantage is they don't skimp on VRAM like Nvidia does.
On the CPU side AMD is effectively untouched, as Intel is the one playing catch-up (though Nvidia might enter that field, as they are supposedly working on ARM CPUs).
> "They won't ever beat Intel" is a phrase that was used a lot 10 years ago, and look where we are. ...

Ten years ago, the idea of AMD beating Intel in the CPU space seemed far-fetched, yes. While the GPU market is different, being more iterative and dependent on architectural efficiency, driver optimization, and software ecosystems, it's still vulnerable.
> It won't be easy, but the Ryzen precedent shows it's not impossible.

The Ryzen precedent required Intel to fall on its face and get stuck on the same node for six years. That obviously can't happen to Nvidia. In the most unlikely scenario where it did happen, AMD would be stuck in the same boat, since they both use TSMC, and no advantage would be gained. If anything, Nvidia is in a better position, since they have also used Samsung recently and don't depend on just one producer like AMD does.
> The Ryzen precedent required Intel to fall on its face and get stuck on the same node for six years. That obviously can't happen to Nvidia. ...

For now, my gut feeling is that all AMD needs to do is offer real value: make something sensible yet still fast enough. When Nvidia requires you to get a $1,000+ MSRP card (which is never attainable at MSRP) to play at 4K, AMD needs to do it at $600-700, which is a big psychological gate to cross. There's no use in being just $200 cheaper, since the percentage savings and Nvidia's market dominance will always push people to Nvidia when the price gap is narrow.