News AMD says Intel's 'horrible product' is causing Ryzen 9 9800X3D shortages

Except AMD GPUs are not horrible.
In fact, they are good, just not as good as Nvidia's.
The same applies to Intel CPUs. The 265K is currently considerably cheaper than the MSRP of a 9800X3D. It is faster by double digits in application performance (13%) and 8% slower in 1440p gaming when using the fastest gaming GPU on the planet (not an AMD GPU). If you don't have a 4090, the performance difference at 1440p will be largely negligible. That's not horrible by any definition.

In comparison, AMD's fastest GPU is 17% slower than a 4090 at 1440p and 22% at 4K. So, by Frank Azor's definition of horrible, AMD's GPUs are horrible times two. Whatever that is. Mr. Azor should not have said what he said, but then again, he has a history of this.
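
To put rough numbers on the "horrible times two" remark, here's a minimal sketch using only the percentages quoted in this post (treating the 8% gaming deficit as the "horrible" yardstick), not fresh benchmark data:

```python
# Sketch of the comparison above; the percentages are the ones quoted in this
# post, not new measurements.
cpu_gaming_deficit = 8.0    # 265K vs 9800X3D at 1440p, per the post
gpu_deficit_1440p = 17.0    # AMD's fastest GPU vs RTX 4090 at 1440p, per the post
gpu_deficit_4k = 22.0       # same comparison at 4K, per the post

print(f"1440p: {gpu_deficit_1440p / cpu_gaming_deficit:.1f}x the CPU deficit")  # 2.1x
print(f"4K:    {gpu_deficit_4k / cpu_gaming_deficit:.1f}x the CPU deficit")     # 2.8x
```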
 
From the lack of new reports among the millions of Raptor Lake CPUs out there, and the strong resale prices, it appears the degradation issue has been solved.
Yeah... too bad it's only been what, two or three months? Let's wait eight months... a year... or more, and see if it has actually been fixed.

But hey, if you feel the issues are fixed... then by all means. I still have two coworkers who are looking to upgrade. If you are willing to buy them their board and Intel CPU, I'm sure they would stay with Intel. If not, they will probably go with AMD, as they don't feel comfortable buying Intel right now because of those issues... and quite frankly, they don't trust Intel any more.
 
The same applies to Intel CPUs. The 265K is currently considerably cheaper than the MSRP of a 9800X3D. It is faster by double digits in application performance (13%) and 8% slower in 1440p gaming when using the fastest gaming GPU on the planet (not an AMD GPU). If you don't have a 4090, the performance difference at 1440p will be largely negligible. That's not horrible by any definition.

In comparison, AMD's fastest GPU is 17% slower than a 4090 at 1440p and 22% at 4K. So, by Frank Azor's definition of horrible, AMD's GPUs are horrible times two. Whatever that is. Mr. Azor should not have said what he said, but then again, he has a history of this.

The 265K still kinda sucks, though. At 1440p with a 4090, it's barely faster than my 12700K on average (less than 3% in TechPowerUp's review). Arrow Lake is a disappointment, period. I had hoped Intel would bring competition back to the table, and they simply didn't deliver.
 
We attended a small roundtable session with AMD executives at CES 2025 in Las Vegas and asked the company for details about the continuing shortages of its flagship gaming-optimized CPU, the Ryzen 9 9800X3D, currently the uncontested best CPU for gaming, and when we can expect supply to improve.

AMD says Intel's 'horrible product' is causing Ryzen 9 9800X3D shortages : Read more
Intel needs to swallow its pride and lower the price point accordingly. Nothing wrong with Intel chips, but if they are not in the performance lead... please, Intel, don't price your chips as if you are.
 
The 265K still kinda sucks, though. At 1440p with a 4090, it's barely faster than my 12700K on average (less than 3% in TechPowerUp's review). Arrow Lake is a disappointment, period. I had hoped Intel would bring competition back to the table, and they simply didn't deliver.
There are only four CPUs available that are more than 3% faster than the 265K, and the 265K is the mid-tier option in Intel's K series. Within ±3% of the 265K there are 22 different CPUs, again with the fastest GPU on the planet. Do all those CPUs "kinda suck"? With anything besides a 4090, you should be looking at the 4K results. People don't want to listen, but in all but fringe cases, the CPU is practically irrelevant when it comes to gaming. Anything from the last few generations will deliver roughly the same experience.

For application performance, the 265K is 30% faster than a 12700K. For most of the world, that's what matters.
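
For what it's worth, here is a minimal sketch of that ±3% grouping; the relative scores below are invented placeholders, not TechPowerUp's actual numbers:

```python
# Hypothetical average-fps scores relative to the 265K (= 100); placeholders only.
relative_fps = {
    "9800X3D": 112.0,
    "7800X3D": 108.0,
    "14900K": 102.5,
    "265K": 100.0,
    "12700K": 97.5,
    "5800X3D": 96.0,
}

baseline = relative_fps["265K"]
# Everything the "kinda sucks" label would also have to cover.
within_3pct = [name for name, score in relative_fps.items()
               if abs(score - baseline) / baseline <= 0.03]
print(within_3pct)
```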
 
The 265K still kinda sucks, though. At 1440p with a 4090, it's barely faster than my 12700K on average (less than 3% in TechPowerUp's review). Arrow Lake is a disappointment, period.
The 9950X is less than 3% faster so it must suck too!

The real problem with ARL is the outliers rather than the overall performance. Based on how little Intel's updates have moved the needle (it looks like a lot of them were really just fixes for Win 11 24H2), compared to the update CDPR put out for CP2077 (in HUB's testing, ARL went from the bottom to the top of the non-X3D pack after the update), it seems Intel really should have looked for outliers and contacted developers. Not that everything necessarily could have been fixed, but it's clear Intel was not even remotely proactive with the release of ARL.
 
I highly doubt this for at least three reasons:

One is that at 4K, as we know, the CPU doesn't really matter (using TPU's chart because TH didn't do a 4K test in the 9800X3D review, likely because it'd be a waste of time).

[TechPowerUp chart: relative gaming performance at 3840x2160]


Two is that even at 1920x1080 with an unrestricted GPU (the 4090 in this case), even AMD's first X3D CPU, not to mention Intel's 14700K and Ultra 265K, can average 120 fps, and last-generation X3D chips provide 120 fps minimums, while even the 9000X3D series can't reach 240 fps, and I'm willing to bet the vast majority of monitors are 144 Hz or less.



And three is that the vast majority of people are using GPUs substantially weaker than the RTX 4090 and are far more GPU-limited than CPU-limited, so a CPU upgrade to the 9000X3D would not make a performance difference without a much more expensive GPU upgrade.
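
A minimal sketch of point three, with made-up fps caps chosen only to show why a faster CPU doesn't help when the GPU is the limit:

```python
# The frame rate you actually see is capped by whichever component is slower.
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Return the frame rate the slower of the two components allows."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Mid-range GPU at 1440p: the GPU is the wall, so a faster CPU changes nothing.
print(effective_fps(cpu_fps_cap=180, gpu_fps_cap=90))   # 90
print(effective_fps(cpu_fps_cap=240, gpu_fps_cap=90))   # still 90
# Only with a 4090-class card does the CPU cap start to show.
print(effective_fps(cpu_fps_cap=180, gpu_fps_cap=200))  # 180
```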

Good post, I agree. Also, with the direction Nvidia is going, you'll be able to generate 240 fps without even needing the CPU to prepare more than 60-120. For a "gaming CPU" as opposed to a productivity CPU, it's kind of becoming obsolete for the future already.

For people running the highest-end CPU with a low-end GPU who brag about a theoretical 200+ fps, well, they probably aren't interested in graphics anyway if they turn the GPU settings down that far to hit the CPU's maximum.
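
On the frame-generation point above, a minimal sketch of the arithmetic; the multipliers are assumptions for illustration (2x frame generation, 4x multi-frame generation), not a claim about any specific Nvidia product:

```python
# Each rendered (CPU-prepared) frame yields gen_factor displayed frames.
def displayed_fps(rendered_fps: float, gen_factor: float) -> float:
    """Frames shown per second for a given render rate and generation factor."""
    return rendered_fps * gen_factor

print(displayed_fps(rendered_fps=60, gen_factor=4))    # 240 shown from 60 prepared
print(displayed_fps(rendered_fps=120, gen_factor=2))   # 240 shown from 120 prepared
```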
 
The L3 cache currently draws 1.60 W at idle, at 32.3°C. Package temp is 40.9°C, with an air cooler currently running at low speed.

Temps could be better, I suppose. But that comes down to me not (yet) having upgraded the cooling I used with a 7600X. The 7600X has a TDP of 105 W and the 9800X3D has a TDP of 120 W (with two more cores than the 7600X, and now better power efficiency per core), so that wasn't much of a jump to warrant a cooling upgrade. It might look a bit different with a new GPU at likely higher wattage, if the airflow provided by the current case fans turns out to be a bit too weak.

Which isn't to say that other CPUs run hotter. But this CPU looks quite fine to me, so I'm not sure what some people are on about...

By the way, back some years ago, the main reason I went with an AMD-chipset motherboard was probably that it already offered more than 16 PCIe 5.0 lanes that can all be used at the same time, such as for a 5.0 GPU and a Gen5 x4 M.2 slot. And back then, the prospect of being able to use the same motherboard for a CPU upgrade to the next generation was a strong selling point as well. So... just saying that there are also some metrics not directly about FPS output.

I gladly took the L3 cache along with the 9800X3D, though (after a 6 AM shift, which apparently let me get online early enough to purchase one for curbside pickup). Improved 1% lows are especially nice when the average output of a GPU is below 100 (such as the RX 6700 XT at 1440p for a number of games these days), as an eventual drop in FPS is hardly noticeable when the low is around 60 or so (no visible stutter, as there can be when the 1% low is very low). And I won't mind increased FPS with a new GPU, with likely little to no CPU bottleneck there, at least as far as the soon-to-be-released GPUs are concerned. If a new GPU can churn out some 200 raw FPS at basic 1440p, then that's (simplified) 60+ FPS per screen on a three-screen setup (plus some more from FSR or DLSS), and at that point the question about 1% lows reappears: whether the rig can handle a smooth gaming experience with such a setup (or the option thereof).
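
The per-screen estimate in that last paragraph, as a back-of-the-envelope sketch (the 200 fps figure and the upscaling headroom are this post's assumptions, nothing more):

```python
# Assumed raw 1440p output of a future GPU, split across a triple-screen setup.
raw_fps_single_screen = 200
screens = 3
per_screen = raw_fps_single_screen / screens
print(f"{per_screen:.0f} fps per screen before upscaling")  # ~67 fps
# FSR/DLSS would add some headroom on top, which is why the 1% lows matter again.
```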
 
When I see the words "gaming CPU"... I feel so sad...
Even my calculator can play Doom!
Now we have a "gaming calculator" market, lol.
A CPU is a CPU, and the 9800X3D is a very weak CPU. The king from AMD is the 9950X, a real CPU with lots of cores that can also play games.
 
What do you consider "horrible"?

Market share says they are horrible. 😉
Because Nvidia offers incentives for game devs to use its technology, aka "Nvidia GameWorks." It isn't because the technology is necessarily better than what AMD offers.
I hope this explains why Nvidia became rich from gaming. It consistently creates software to make games more hardware-demanding. This way, Nvidia is guaranteed to always have an easy time selling its latest $750+++ video cards. The close cooperation between Nvidia and game developers goes back many years already; it's so cosy and snuggly that it leaves AMD's attempts to influence game devs hung out to dry.
But if you're happy with the monopoly Nvidia has over game devs, then there's that...
 
Because Nvidia offers incentives for game devs to use its technology, aka "Nvidia GameWorks." It isn't because the technology is necessarily better than what AMD offers.
I hope this explains why Nvidia became rich from gaming. It consistently creates software to make games more hardware-demanding. This way, Nvidia is guaranteed to always have an easy time selling its latest $750+++ video cards. The close cooperation between Nvidia and game developers goes back many years already; it's so cosy and snuggly that it leaves AMD's attempts to influence game devs hung out to dry.
But if you're happy with the monopoly Nvidia has over game devs, then there's that...
This is a wildly inaccurate view of the current gaming landscape. This type of behavior kept Nvidia ahead in the bad times, but most of the market-share gains came simply while they were providing a better product or better perf/price. Currently Nvidia provides a superior overall software suite and better ray tracing than AMD. Unfortunately, AMD's baggage has stuck with them from the ATI days even where it no longer applies.

AMD's console wins have absolutely had a pretty large impact on game design. The limit on how far ray tracing gets implemented is the most obvious one. They also have multiplatform titles they work on with developers, just like Nvidia does.

Quite frankly if AMD wanted more marketshare they'd price aggressively. They haven't done so yet, but I hope with RDNA 4 they'll be delivering equivalent (or better) ray tracing and great pricing.
 
Because Nvidia offers incentives for game devs to use its technology, aka "Nvidia GameWorks." It isn't because the technology is necessarily better than what AMD offers.
I hope this explains why Nvidia became rich from gaming. It consistently creates software to make games more hardware-demanding. This way, Nvidia is guaranteed to always have an easy time selling its latest $750+++ video cards. The close cooperation between Nvidia and game developers goes back many years already; it's so cosy and snuggly that it leaves AMD's attempts to influence game devs hung out to dry.
But if you're happy with the monopoly Nvidia has over game devs, then there's that...
I agree to an extent, not so much about the incentives in today's world, but I digress. However... let's remember: AMD has had 19 years to right the ship that ATI let take on water while Nvidia poured it in.

Nvidia, while it's a love/hate relationship, is where a lot of gamers and creators bend the knee. Until AMD can edge them out or at least get close, and I do not mean just in software development, we are stuck with what we have... like it or not.

The only way AMD is going to be more relevant is by coming in with a very competitive price at launch; maybe this round will be better. We will see, but I am not holding my breath. They always come in priced right under Nvidia and everyone just buys Nvidia; then six months to a year later they drop prices... and by then it's too late.
 
The most hilarious thing is that Intel has four warehouses full of unsold Arrow Lake CPUs. They're trying to take manufacturing capacity away from Arm and AMD because they can't really compete, since Arrow Lake is a bugged-to-hell pile of crap with stability issues and an extremely poor price/performance ratio.
 
AMD is always bragging about how their latest CPU will beat Intel; then when it does, they can't ramp production and they blame Intel. Shouldn't they have been preparing for this? Someone at AMD really dropped the ball.
Or they are having issues with yields and the factory processes and can't produce enough good chips <-- a common issue on modern CPU launches.
But blaming Intel for their own failures? Really?
 
The most hilarious thing is that Intel has four warehouses full of unsold Arrow Lake CPUs. They're trying to take manufacturing capacity away from Arm and AMD because they can't really compete, since Arrow Lake is a bugged-to-hell pile of crap with stability issues and an extremely poor price/performance ratio.
source?
 
I suspect AMD's X3D availability is actually limited right now by their choice to use their limited TSMC advanced-packaging allocation to build $10K+ AI GPUs rather than consumer gaming CPUs. Probably a reasonable gamble at the time ...

In retrospect, AI model sizes have already outgrown their first-gen MI3xx chips, and their AI GPU market share is reportedly single digit.