News Ryzen 7 9800X3D left Core i9-14900K in the dust in Battlefield 6 early streamer tests — both systems included an RTX 5080, but the 3D V-Cache system w...

Admin

Well, the older Intel CPU was chosen because it is the best "gaming" CPU Intel has produced. It would be even less fair to compare the current-gen Intel against the AMD chip. The amount of performance difference is what is newsworthy here.

Not really. The 9800X3D is about 33% faster than the 14900K at 1920x1080 gaming per TH's review, with some games scoring upwards of 80 fps higher on the 9800X3D. The missing "about 110 FPS" is roughly a third of the 330 fps, so it's about in line with existing results. The difference will no doubt drop significantly at 2560x1440 and become negligible as you approach 4K, as is typical, since the importance of the CPU drops compared to the GPU.

[1080p gaming benchmark chart from TH's review]
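For what it's worth, "X% faster" depends on which chip you treat as the baseline. A quick sanity check using only the 330 fps and ~110 fps figures quoted above (everything else is plain arithmetic):

```python
# Only 330 fps and the ~110 fps gap come from the streamer tests;
# the rest is arithmetic.
x3d_fps = 330                  # reported 9800X3D result
gap_fps = 110                  # reported 14900K deficit
intel_fps = x3d_fps - gap_fps  # ~220 fps implied for the 14900K

print(f"gap as a share of the 9800X3D: {gap_fps / x3d_fps:.0%}")        # 33%
print(f"9800X3D over the 14900K:       {x3d_fps / intel_fps - 1:.0%}")  # 50%
```

Measured against the faster chip, the gap is a third; measured against the 14900K's own baseline, it works out to a ~50% lead.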
 
You are missing his point. If they compared it against the latest-gen Intel (which is ironically worse than the previous gens), the gap would be even larger. Frankly, there's hardly any point in comparing Intel to AMD at all on the higher end; they aren't even in the running. X3D chips, even the previous gens, are miles ahead. Intel needs to get it together; they have a couple of years at most before they age into irrelevance. A new CEO was definitely needed, but I'm not sure this new one is going to save them.
 

There's no point in comparing AMD 9000 series or 7000 series X3D to Intel anything when it comes to gaming at 1920x1080; they're always going to come out on top in the benchmarks.

But benchmarks aren't real life. As TechSpot just showed a few days ago, when they compared a high-end gaming-oriented CPU to a mainstream non-gaming CPU across four GPUs ranging from the ultra-high-end 5090 to the mainstream 9060 XT, you're not really losing anything at the mainstream level by using a mainstream non-gaming CPU. A 9060 XT or 9070 is just as happy with a 7600X as with a 9800X3D, even at 1920x1080. It's only with the upper-midrange 5080 and high-end 5090 that the differences show, and there are far more people in the 9070 tier than the 5080 and above.



[TechSpot benchmark chart]


[TechSpot benchmark chart]
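A toy model of why the tiers behave that way: effective frame rate is capped by whichever of the CPU or GPU is the bottleneck. None of the caps below are TechSpot's numbers; they are invented to illustrate the effect:

```python
# Effective fps = min(what the CPU can feed, what the GPU can render).
# All caps are invented placeholders, not TechSpot's measurements.
cpu_caps = {"7600X": 180, "9800X3D": 260}
gpu_caps = {"9060XT": 90, "9070": 130, "5080": 210, "5090": 300}

for gpu, gcap in gpu_caps.items():
    row = ", ".join(f"{cpu}: {min(ccap, gcap)} fps" for cpu, ccap in cpu_caps.items())
    print(f"{gpu:>6} -> {row}")
# The 9060XT and 9070 rows come out identical for both CPUs (GPU-bound);
# only the 5080 and 5090 rows separate the 7600X from the 9800X3D.
```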
 
You are right, there's no point, because Intel is irrelevant outside of low-end use.

They compared upper-mid-end Intel to upper-mid AMD. It's a fair comparison. People running a 5070 or older/lower likely wouldn't be looking at these CPUs anyway; they would be looking at the lower-range ones. These are more 5080-range CPUs.
 
Newer gaming-focused CPU beats older non-gaming-focused CPU in gaming.

In 2025 does anyone really need to be reminded that X3D chips perform better in gaming than any non-X3D chip?
The 9800X3D and the 14900K are currently the best-performing gaming CPUs from AMD and Intel respectively, so yes, it's totally relevant to compare them. Intel users were not complaining when their favorite brand was constantly beating AMD; they never had enough comparison benchmarks. Now that the tables have turned, they are all like "ok ok, we saw it, please can you stop rubbing it in our face". Hilarious.
 
That's not true. The X3D chips run hot when you load all the cores to 100%, but in gaming they run impressively cool. My 9800X3D rarely goes above 60°C when gaming (it's most of the time in the 50-55°C range).
That's my experience as well on my 9950X3D. Only really CPU-intensive games and heavy productivity compute put me over 60°C. In games it's usually 50-55°C with occasional spikes to 60°C. It runs very cool and takes a lot to get it hot. I've yet to see it get anywhere near max.
 
Yeah, I think the extra cache relieves some load on these chips, since they need to request data from memory at a much lower rate, so they run cooler.
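That intuition maps onto the textbook average-memory-access-time formula. The latencies and miss rates below are illustrative assumptions, not measurements of any real chip:

```python
# AMAT = L3 hit time + L3 miss rate * DRAM penalty (all values assumed).
def amat_ns(l3_hit_ns, l3_miss_rate, dram_ns):
    return l3_hit_ns + l3_miss_rate * dram_ns

small = amat_ns(10, 0.20, 80)  # ordinary-sized L3: more trips out to DRAM
big   = amat_ns(10, 0.08, 80)  # X3D-sized L3: assumed lower miss rate
print(f"small L3: {small:.1f} ns, big L3: {big:.1f} ns")  # 26.0 ns vs 16.4 ns
```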
 
The 9800X3D and the 14900K are currently the best-performing gaming CPUs from AMD and Intel respectively, so yes, it's totally relevant to compare them. Intel users were not complaining when their favorite brand was constantly beating AMD; they never had enough comparison benchmarks. Now that the tables have turned, they are all like "ok ok, we saw it, please can you stop rubbing it in our face". Hilarious.

For me, as someone who's used only AMD for the last 20 years, even though it meant suffering through 15 years of sub-par equipment with a couple of exceptions, it's the fact that Tom's Hardware isn't doing the testing themselves. The article authors are going to social media and YouTube and reporting on testing someone else has done, which may or may not be reliable, without even repeating the test to verify it. They didn't even do a CPU/GPU scaling test like TechSpot did (referenced above), all while promoting a few dollars off a 9800X3D on the front page like it's a huge deal, when the real-world result, even with a 5080 at max details at 1920x1080, isn't that great compared to a previous-generation non-gaming CPU.

Maybe it's budget cuts, because Tom's Hardware is fourth in tech in the Future PLC "Key Brands" display, behind TechRadar, PC Gamer, and Tom's Guide, meaning more sponsored and shadow-sponsored posts and less real content. But I for one would love to see more substance and fewer "We saw it on social media/YouTube and here's what THEY said!" articles like this.

According to Future PLC: Tom's Hardware has 8.7M monthly users and 531K social followers, TechRadar has 24.1M users and 2.3M followers, and Tom's Guide has 30M users and 693K followers. Far from the worst tech brand, but well short of the top, and "We saw this on social media" articles aren't helping.

 
Not really. The 9800X3D is about 33% faster than the 14900K at 1920x1080 gaming per TH's review, with some games scoring upwards of 80 fps higher on the 9800X3D. The missing "about 110 FPS" is roughly a third of the 330 fps, so it's about in line with existing results. The difference will no doubt drop significantly at 2560x1440 and become negligible as you approach 4K, as is typical, since the importance of the CPU drops compared to the GPU.

[1080p gaming benchmark chart from TH's review]
Also note that this graph is from the launch review of the Ultra series. There have been many performance improvements (particularly for gaming) for the 200 series since launch. I'd say there may have been a 3-5% performance improvement with all the tweaks, putting the 285K with CUDIMMs firmly ahead of the 14900K.
 
There's no point in comparing AMD 9000 series or 7000 series X3D to Intel anything when it comes to gaming at 1920x1080; they're always going to come out on top in the benchmarks.

But benchmarks aren't real life. As TechSpot just showed a few days ago, when they compared a high-end gaming-oriented CPU to a mainstream non-gaming CPU across four GPUs ranging from the ultra-high-end 5090 to the mainstream 9060 XT, you're not really losing anything at the mainstream level by using a mainstream non-gaming CPU. A 9060 XT or 9070 is just as happy with a 7600X as with a 9800X3D, even at 1920x1080. It's only with the upper-midrange 5080 and high-end 5090 that the differences show, and there are far more people in the 9070 tier than the 5080 and above.

[TechSpot benchmark chart]

[TechSpot benchmark chart]
It's just in general... I always cite flight sims as one of the gaming scenarios where main-thread performance can drag everything down from 60 fps to the sub-20 or even sub-10 fps range due to the main-thread limitation. Ironically, such games are the real reason people shop for a new (gaming) CPU, even at 1080p; otherwise, even an 11th-gen i7 is plenty for most games with a top-of-the-line GPU at 4K.
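A toy per-frame model of that main-thread drag, with invented timings; a frame can't finish faster than its slowest stage:

```python
# fps is limited by whichever of the main thread or the GPU takes longer
# per frame. The millisecond timings are invented for illustration.
def fps(main_thread_ms, gpu_ms):
    return 1000 / max(main_thread_ms, gpu_ms)

print(f"{fps(8, 12):.0f} fps")   # typical game: GPU-bound at ~83 fps
print(f"{fps(55, 12):.0f} fps")  # sim with a choked main thread: ~18 fps
```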
 
"extra L3 cache in the AMD CPU was being utilized very well by the game engine"

That's not how it works: it's not that games utilize the L3 cache well. It's badly coded games that are helped by the existence of 3D cache.

In a nutshell: badly coded games lose assets from the cache when the cache overfills. When those assets need to be accessed, they are pulled from slow RAM instead of fast L3. A larger L3 means those assets are more likely to still be in L3 instead of RAM, decreasing the chance of a performance penalty when they need to be retrieved.

This behavior was explained by some industry expert (I think it might have been a game dev) in one of MLID's videos.
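A minimal sketch of that eviction effect with a toy LRU cache; the capacities and access pattern are invented, not real game data:

```python
from collections import OrderedDict

# Counts how often a requested asset is already resident in an LRU cache.
def hit_rate(capacity, accesses):
    cache, hits = OrderedDict(), 0
    for asset in accesses:
        if asset in cache:
            hits += 1
            cache.move_to_end(asset)       # mark as most recently used
        else:
            cache[asset] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(accesses)

accesses = list(range(12)) * 100  # 12 assets touched round-robin each frame
print(hit_rate(8, accesses))      # cache too small: LRU thrashes, 0% hits
print(hit_rate(16, accesses))     # everything fits: ~99% hits
```

With a working set just over capacity, LRU evicts each asset right before it is needed again, so the small cache gets zero hits while the larger one keeps everything resident; that is the cliff the extra 3D V-Cache pushes out.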
 
Maybe it's budget cuts, because Tom's Hardware is fourth in tech in the Future PLC "Key Brands" display, behind TechRadar, PC Gamer, and Tom's Guide, meaning more sponsored and shadow-sponsored posts and less real content. But I for one would love to see more substance and fewer "We saw it on social media/YouTube and here's what THEY said!" articles like this.

According to Future PLC: Tom's Hardware has 8.7M monthly users and 531K social followers, TechRadar has 24.1M users and 2.3M followers, and Tom's Guide has 30M users and 693K followers. Far from the worst tech brand, but well short of the top, and "We saw this on social media" articles aren't helping.

If this were the good ol' days, a random Tom's Magazine writer would be rushing home and whipping out his typewriter to start a "news piece" for the next edition of their magazine, based on an overheard conversation in a bar with some random drunk dude raving about how he's testing a dev build of Doom and got some 5 fps more on some 3dfx GPU that he can't reveal because of embargoes... even though he is definitely drunk.

If you don't get it: basically no one does this. That's how low TH has fallen.