AMD Ryzen 7 9800X3D Review: Devastating Gaming Performance


YSCCC

Commendable
Dec 10, 2022
566
460
1,260
Everyone mentions that stutter. When or if it happens it's just a microsecond. It's not about to cost you a kill during your COD session. Also that micro stutter will happen with every CPU on the market more than likely.
If you play only COD, sure, but there are games where stuttering makes it feel disconnected, and tbf, there's no harm in choosing a better gaming CPU to minimize, if not eliminate, those stutters when building a gaming PC.

Testing at 1080p or even lower gives you an idea of how much better one CPU is under gaming load vs the other, so unless you change the whole system every gen, a stronger CPU will likely sustain more generations before it starts bottlenecking the GPU at 4K.
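A toy sketch of the bottleneck argument above (all fps numbers are invented for illustration): delivered framerate is roughly the lower of what the CPU and the GPU can each sustain, so the CPU gap that low-res testing exposes reappears at 4K once a future GPU raises the GPU-side limit.

```python
# Toy model: delivered fps is roughly capped by whichever of the CPU or GPU
# is slower in a given scene. All numbers below are invented.

def delivered_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

cpu_fast, cpu_slow = 240.0, 160.0   # CPU-side fps caps (roughly resolution-independent)
gpu_1080p, gpu_4k = 300.0, 110.0    # GPU-side fps caps at each resolution

# At 1080p the GPU is fast enough that the CPU gap shows up in full...
gap_1080p = delivered_fps(cpu_fast, gpu_1080p) - delivered_fps(cpu_slow, gpu_1080p)
# ...while at 4K both CPUs hide behind the same GPU limit, until a future
# GPU raises that limit past the slower CPU's cap.
gap_4k = delivered_fps(cpu_fast, gpu_4k) - delivered_fps(cpu_slow, gpu_4k)
```

With these made-up caps, the 1080p test shows an 80 fps gap between the two CPUs while the 4K test shows none, which is exactly why the low-res numbers matter for longevity.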
 
  • Like
Reactions: atomicWAR and 72jd

Kulkiet

Distinguished
Jun 5, 2015
24
1
18,515
View: https://www.youtube.com/watch?v=98RR0FVQeqs
Compare a Ryzen 3600 to a Ryzen 7th gen? Shouldn't it be 14700K vs 9800X3D? 1080p matters, but at higher resolutions the difference is minimal. Most people with this kind of budget don't play at 1080p, and even esports doesn't need 500 fps. So what is the point of this review and the comments about crushing the opponent? https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/19.html: at 1440p it's 5% and at 4K it's 1%.
 
I have a relatively small one, a Palit GameRock; it fits into a bunch of ITX cases. Problem is my current PSU does not, it's freaking huge (be quiet! Power Pro 12 1200W). But I'm planning a 2nd PC anyways, so I'll see; currently looking at the Jonsbo D41 case, which can fit a bigger cooler anyways.
Let me know if you want some general parts suggestions when the time comes, not that I believe you need it.
 
  • Like
Reactions: TheHerald

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Let me know if you want some general parts suggestions when the time comes, not that I believe you need it.
Well, I'm currently looking for an ITX / mATX mobo that can reliably run RAM over 8000 MT/s, but they just don't exist. The X870 Strix-I supposedly can, but my pals are saying it struggles at 8000; no way you can manually tune timings up there. Quite a bummer.
 
  • Like
Reactions: helper800

squuiid

Distinguished
Jun 6, 2015
4
0
18,510
Paul, very frustrating that you didn’t include the 245K in this review for comparison. From a power perspective this is a good chip for SFF ITX builders. Would have been nice to see how it compared to the new gaming champ in that area specifically.
 

squuiid

Distinguished
Jun 6, 2015
4
0
18,510
Well, I'm currently looking for an ITX / mATX mobo that can reliably run RAM over 8000 MT/s, but they just don't exist. The X870 Strix-I supposedly can, but my pals are saying it struggles at 8000; no way you can manually tune timings up there. Quite a bummer.
The x870 ITX offerings are shockingly poor. Why is ASUS literally the only one making one?! I hate their external module.
I’m having to look at Intel and the 245K to be able to get a decent, modern ITX mobo that is cool enough in an ITX case, such as the Formd T1. Disappointing.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
The x870 ITX offerings are shockingly poor. Why is ASUS literally the only one making one?! I hate their external module.
I’m having to look at Intel and the 245K to be able to get a decent, modern ITX mobo that is cool enough in an ITX case, such as the Formd T1. Disappointing.
The ASUS one is shocking too; it's basically a B650 rebrand at ~70% more money. Doesn't even get NitroPath. I'm kinda disappointed by the AM5 ITX mobos, and I'm probably going for ATX.
 

Phaaze88

Titan
Ambassador
Compare a Ryzen 3600 to a Ryzen 7th gen? Shouldn't it be 14700K vs 9800X3D? 1080p matters, but at higher resolutions the difference is minimal. Most people with this kind of budget don't play at 1080p, and even esports doesn't need 500 fps. So what is the point of this review and the comments about crushing the opponent? https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/19.html: at 1440p it's 5% and at 4K it's 1%.
You just confirmed it: at 1080p, big difference. Who plays games at 1080p with a 4090? If you crank up the resolution the difference is minimal, for gaming at least.
Steve explained it in the final thoughts: it's the framerate - those 1% lows - that's important, not the resolution.
How many frames can one get at 4K? To know that, you need to know the max performance of the CPU... but testing at 4K doesn't really tell one that... Do I have to type out the entire final thoughts section?


I REALLY wish reviewers didn't post fps averages - no, not a game's fps averages, those are fine - it's the 'average across X number of games' results that are a load of crap. [I will admit that I used to eat that crap up too.]
What % of people do those sections accurately represent?
I don't play any of the games HWUB and TPU sample.
What about someone who only plays a few of the titles sampled, who decides to check their own game averages against the review's, only to find their results aren't on par?
How about those who just play one game, like Counter-Strike 2, for example? The difference from that one is going to be far higher than 1-5%.
Reviewers don't benchmark the same games, or bench the same areas of games. Heck, some settle for in-game benchmarks, which may not be an accurate run of the game (some only push the GPU).
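For what it's worth, the "1% lows" the reviews lean on can be sketched roughly like this (a minimal illustration; the function name and frame-time data are made up, and real capture tools differ in exactly how they define the percentile):

```python
# Rough sketch: why '1% lows' expose stutter that a plain average hides.
# frame_times_ms is an assumed list of per-frame render times in milliseconds.

def fps_stats(frame_times_ms):
    """Return (average fps, 1% low fps) for one benchmark run."""
    fps = sorted(1000.0 / t for t in frame_times_ms)  # per-frame fps, ascending
    avg = sum(fps) / len(fps)
    # '1% low' here: mean fps of the slowest 1% of frames (at least one frame).
    worst = fps[: max(1, len(fps) // 100)]
    low_1pct = sum(worst) / len(worst)
    return avg, low_1pct

# A mostly-smooth run with a few 50 ms stutters: the average barely moves,
# but the 1% low drops to the stutter floor.
smooth_with_hitches = [10.0] * 990 + [50.0] * 10
avg, low = fps_stats(smooth_with_hitches)
```

In this invented run the average stays near 99 fps while the 1% low sits at 20 fps, which is the kind of gap the per-game charts are trying to surface.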
 
  • Like
Reactions: YSCCC

salgado18

Distinguished
Feb 12, 2007
977
434
19,370
Steve explained it in the final thoughts: it's the framerate - those 1% lows - that's important, not the resolution.
How many frames can one get at 4K? To know that, you need to know the max performance of the CPU... but testing at 4K doesn't really tell one that... Do I have to type out the entire final thoughts section?


I REALLY wish reviewers didn't post fps averages - no, not a game's fps averages, those are fine - it's the 'average across X number of games' results that are a load of crap. [I will admit that I used to eat that crap up too.]
What % of people do those sections accurately represent?
I don't play any of the games HWUB and TPU sample.
What about someone who only plays a few of the titles sampled, who decides to check their own game averages against the review's, only to find their results aren't on par?
How about those who just play one game, like Counter-Strike 2, for example? The difference from that one is going to be far higher than 1-5%.
Reviewers don't benchmark the same games, or bench the same areas of games. Heck, some settle for in-game benchmarks, which may not be an accurate run of the game (some only push the GPU).
They can't just test every game out there, it has to be a varied but fixed selection. It will happen that they chose a game you don't play, but you can extrapolate by game engine, for example (no one will benchmark Satisfactory in a CPU review (maybe they should), but I can look at the closest Unreal game).

The percentage is a way to have a single score. How else would we compare all these CPUs? If you only play one game, look for the closest one, or even search for people playing it on YouTube. That's also a good way to find modern games tested on old or uncommon hardware.
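That single score is typically built as a geometric mean rather than a plain average, so one outlier title can't dominate the result. A rough sketch (all game names and fps numbers are invented):

```python
# Sketch: geometric vs arithmetic mean for a cross-game CPU comparison.
# A geometric mean damps the influence of a single outlier title.
import math

def geo_mean(values):
    """Geometric mean via logs, to avoid overflow on many large values."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

cpu_a = {"Game 1": 200.0, "Game 2": 120.0, "Game 3": 90.0}
cpu_b = {"Game 1": 400.0, "Game 2": 110.0, "Game 3": 85.0}  # one outlier title

# Arithmetic comparison: the 400 fps outlier dominates the ratio...
arith_gap = sum(cpu_b.values()) / sum(cpu_a.values())
# ...while the geometric comparison weighs each title's relative change equally.
geo_gap = geo_mean(list(cpu_b.values())) / geo_mean(list(cpu_a.values()))
```

With these made-up numbers the arithmetic ratio suggests ~45% faster while the geometric ratio lands around 20%, which is why review outlets favor the latter for their summary charts.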
 

ilukey77

Reputable
Jan 30, 2021
808
332
5,290
One review I saw offered 1440p and 4K results; obviously there were basically no gains at 4K because it's GPU limited.

There were small gains across some games at 1440p vs the 7800X3D, and 1% lows were also improved!!

Personally I'm going to wait for the 9950X3D this time, as some of the games seemed to surprise the reviewers, who didn't think the fps could go that high.

So if the 9950X3D has more cache and they can sort out the stacking and scheduling issues the 7950X3D had (being virtually a higher-core-count 7800X3D), then I think the 9950X3D will be the king of both gaming and production.

If not, I lose nothing; my 7800X3D matched with a 5090 (ditching AMD GPUs next gen, I think) should play everything at 1440p/4K with no issues!!

Then I might look at a 9800X3D when the price has dropped!!

But we need Intel to fight back, and SOON!!

Intel needs 3D V-Cache of its own and to be trading blows with AMD in performance and efficiency!!

You think that $30 USD price increase is because it costs more? HELL NO, it's because AMD will become the next Nvidia if left unchallenged!!

They know full well Intel has nothing to compete, so watch the incoming price hike on all their CPUs!!
 

Phaaze88

Titan
Ambassador
They can't just test every game out there, it has to be a varied but fixed selection.
I'm well aware they can't. I'm saying that the 'fps average across number of games' shouldn't be there, as it's misleading information.
There are several channels not posting that - Gamers' Nexus, KitGuru, der8auer, and Level1Techs, to name some.
 

YSCCC

Commendable
Dec 10, 2022
566
460
1,260
One review I saw offered 1440p and 4K results; obviously there were basically no gains at 4K because it's GPU limited.

There were small gains across some games at 1440p vs the 7800X3D, and 1% lows were also improved!!

Personally I'm going to wait for the 9950X3D this time, as some of the games seemed to surprise the reviewers, who didn't think the fps could go that high.

So if the 9950X3D has more cache and they can sort out the stacking and scheduling issues the 7950X3D had (being virtually a higher-core-count 7800X3D), then I think the 9950X3D will be the king of both gaming and production.

If not, I lose nothing; my 7800X3D matched with a 5090 (ditching AMD GPUs next gen, I think) should play everything at 1440p/4K with no issues!!

Then I might look at a 9800X3D when the price has dropped!!

But we need Intel to fight back, and SOON!!

Intel needs 3D V-Cache of its own and to be trading blows with AMD in performance and efficiency!!

You think that $30 USD price increase is because it costs more? HELL NO, it's because AMD will become the next Nvidia if left unchallenged!!

They know full well Intel has nothing to compete, so watch the incoming price hike on all their CPUs!!
Personally I think the 1% lows are what's really important, and you can also reference the lower-resolution results when choosing your GPU. The specific titles tested matter less, to a point, unless you only play one game or one genre on the same rig and change games when you change rigs. Otherwise a good gaming CPU, besides getting the most out of the current hardware, can let the platform sustain considerably more generations of games to come. For a top-end GPU, PCIe 5.0 x16 will likely not bottleneck until the 6090, 7080, or even 7090, given the current gen-on-gen GPU performance improvements; and if the platform doesn't bottleneck the GPU at 720p or 1080p today, then in 4 years' time a 7090 can likely be fully utilized to play games perfectly at 4K, while some lesser-performing CPUs might bottleneck it.

And testing at stock XMP or even JEDEC RAM speeds makes the system a useful reference for everyone doing a build, rather than claiming some uber, ultra, max-optimized RAM setup that isn't guaranteed to be replicable.
 

abufrejoval

Reputable
Jun 19, 2020
582
422
5,260
Appreciate the review. I literally just bought a Ryzen 9 7950X3D. I will be using it primarily for flight simulation at a flight school, with a secondary use as an office PC. Did I make a mistake? Should I return the Ryzen 9 7950X3D for the Ryzen 7 9800X3D?

I bought mine in early Summer, because I needed a "new" system for one of my kids, which in this case meant that I needed to replace the 5800X3D, I'd pass on :)

So I no longer have the option to return it for free and selling it now is not only a bother, but perhaps not the greatest trade.

I'd say that at least a Ryzen 9 7950X3D won't put you at a noticeable disadvantage vs a 9800X3D. 8% better gaming performance isn't a giant gap, and it might actually be somewhat less with a 7950X3D, since the CCDs on those tend to be better bins than on single-CCD CPUs.

And in terms of productivity workloads, the non-V-cache CCD on the 7950X3D might match the 9800X3D for up to 8 cores, because it can clock higher, and then it has another 8 cores to throw at the job. I observe both CCDs running at the very same clocks under full loads, so the V-cache CCD on the 7950X3D doesn't really suffer from lower clocks at that point, because they all have to give up some Watts to stay within socket limits. That's the official reason I chose the 7950X3D over the 7950X. The unofficial one was that I sometimes use my RTX 4090 CUDA workstation for gaming after hours (and the theoretical ability to test some HPC workloads that profit from the bigger cache).

The real kicker might be a 9950X3D, though, but there I'm not even sure that AMD will actually produce it.
Nor am I sure which form it might take.

They could make it with dual V-cache CCDs, because with the new cache placement productivity won't suffer as much any more, but that wouldn't fly as a gaming chip: nobody is going to create bespoke games to match a dual CCD V-cache CPU and that's what it would take to come out on top. Games that are completely unaware of the dual CCD topology are more likely to suffer than profit.

It could fill a nice niche in some HPC workloads (e.g. EDA or genomics), but AMD wouldn't want to fill a premium niche with economy parts, when there is no competitor biting its heels. And real HPC isn't done on desktop systems.

And while I'd buy another hybrid variant even for the 9950X3D (if dual V-cache wasn't available and I had to buy new), AMD might still decide not to produce that... a) because they don't need to and b) because they might get more bad press than happy customers.

The hybrid CCD chips require matching workloads and intelligence to use optimally, and both can be scarce, while press and social network vitriol is generated in abundance.

I'd really just advise you to relax (which I also tell myself): real-life gaming performance isn't typically bottlenecked by CPU performance these days. Nearly everything you can buy today, and quite a few far older CPUs, are quite capable of handling any game currently out there, while games with far higher CPU demands would face a customer niche too small to make them worth designing: mainline is still defined by the consoles, and those concentrate their effort on the GPU side of things, where the actual gains are.

I bought a CPU monster, because it earns its money doing CPU things. The V-cache was just a bit of a cream topping that didn't sacrifice CPU headroom significantly.

And if AMD won't do 9950X3D chips of either kind, the 7950X3D might just see a bit of a bloom on the used parts market later!
 
Mar 10, 2020
414
376
5,070
A thought … AMD going forward.
With the 5000x3d we had access to a chip that ran hotter due to the assembly but was relatively easy to assemble. Slap the cache on the top…. It worked.
Bring it out later, people go WOW… it’s so much faster.

7000X3D… more of the same, and then came cache v2 with the 9000X3D. It is still clocked a little lower than its non-X3D counterparts, but it's closer. The big thing is that the "testers", enthusiasts, now have access to overclocking; SkatterBencher has a video claiming in excess of 5.7GHz…. The overclockers will be able to prove the reliability of the parts (and ruin their warranties).

If the 9000X3D parts prove reliable at the same settings as the non-X3D processors over time, would AMD consider making 3D V-Cache standard?
They have experience over 3 generations of processors; they know its base characteristics and where the low-hanging fruit is for improvements (get the clocks up).

Just a thought
 

abufrejoval

Reputable
Jun 19, 2020
582
422
5,260
"devastating game performance"?

Intel might be devastated, but I'd say it looks jolly good for everyone on the lookout for a new gaming rig.

Please dial back those headlines; you have fewer and fewer competitors to outshout these days, and even the elections are over.
 

abufrejoval

Reputable
Jun 19, 2020
582
422
5,260
If the 9000X3D parts prove reliable at the same settings as the non-X3D processors over time, would AMD consider making 3D V-Cache standard?
They have experience over 3 generations of processors; they know its base characteristics and where the low-hanging fruit is for improvements (get the clocks up).
I'm sure they are considering all options.

I'm pretty sure that making it standard is a very unlikely outcome any time soon.

V-cache was motivated from the server side: some very specific HPC/EDA workloads profit to the point where a niche market was willing to pay enough to make it worthwhile.

And then they managed to make it rather cheap: I've seen estimates that the V-cache adds an extra €20 per CCD in manufacturing, not the markup they are charging for V-cache EPYCs, I reckon.

That they then allowed this bit of "premium tech" to be sold on the desktop at consumer prices was probably a tough decision; without Intel as the tough competitor it still was at that point, it might never have come to pass.

AMD is in the money-making business. V-cache as a standard would be a freebie, and that just doesn't happen unless it helps make money (e.g. open source software).

So even if the V-cache is even less effort to make with the new cache placement, forgoing €20 of manufacturing cost per CCD just to make customers happy could stoke the ire of stockholders, and AMD would rather keep that weapon in reserve against competitors as long as it's exclusive.
 
Nov 7, 2024
1
0
10
Devastating performance? You can always tell an Intel fan when he's pissed off. Bro, this CPU is the best out there, just admit it and stop spreading fake news.
 

ilukey77

Reputable
Jan 30, 2021
808
332
5,290
If the 9000X3D parts prove reliable at the same settings as the non-X3D processors over time, would AMD consider making 3D V-Cache standard?
They have experience over 3 generations of processors; they know its base characteristics and where the low-hanging fruit is for improvements (get the clocks up).
This is what I don't understand with AMD (while many stated that the AM5 platform was the reason for poor sales of the early 7000 series).

I personally believe that anyone who knew/used the 5800X3D (which I did, and realized it was a monster in gaming) was always going to buy the next-gen X3D, being the 7800X3D, so I held off buying AM5 and DDR5 at release.

I bought my Asus ROG Crosshair Gene mATX mobo and my DDR5 and sat on them for months, only to buckle under pressure to build my system by buying a 7600X as a placeholder CPU while I waited for the 7800X3D!!

My point is, AMD announced X3D was coming before the release of the 7000 series, so a lot of people waited. So why, with these X3D CPUs basically owning everything in gaming (sort out the 9950X3D for both), even release the other rubbish??

I'm not sure what else you use CPUs for: gaming or production :) or both.

If I was AMD I'd do away with all the stock CPUs and just sell the better-binned 8-core cache CPUs as the 9800X3D and the lesser-quality bins as a 9600X3D. Maybe drop in a budget 9500, but the lineup should consist of a 9600X3D, a 9800X3D, a 9900X3D (if they sort the cache) for better production with the same gaming performance as the 9800X3D, and the granddaddy 9950X3D as your ultimate gaming and production CPU!!
 
Mar 10, 2020
414
376
5,070
I'm sure they are considering all options.

I'm pretty sure that making it standard is a very unlikely outcome any time soon.

V-cache was motivated from the server side: some very specific HPC/EDA workloads profit to the point where a niche market was willing to pay enough to make it worthwhile.

And then they managed to make it rather cheap: I've seen estimates that the V-cache adds an extra €20 per CCD in manufacturing, not the markup they are charging for V-cache EPYCs, I reckon.

That they then allowed this bit of "premium tech" to be sold on the desktop at consumer prices was probably a tough decision; without Intel as the tough competitor it still was at that point, it might never have come to pass.

AMD is in the money-making business. V-cache as a standard would be a freebie, and that just doesn't happen unless it helps make money (e.g. open source software).

So even if the V-cache is even less effort to make with the new cache placement, forgoing €20 of manufacturing cost per CCD just to make customers happy could stoke the ire of stockholders, and AMD would rather keep that weapon in reserve against competitors as long as it's exclusive.
My feeling is that if people don't buy, or fewer people buy, the non-X3D parts, then the impetus to produce X3D parts will be stronger.
At the same clocks, X3D makes the non-X3D parts semi-redundant (price dependent).
It’s akin to Athlon vs Duron or Pentium vs Celeron.
 

ilukey77

Reputable
Jan 30, 2021
808
332
5,290
AMD is in the money-making business. V-cache as a standard would be a freebie, and that just doesn't happen unless it helps make money (e.g. open source software).

So even if the V-cache is even less effort to make with the new cache placement, forgoing €20 of manufacturing cost per CCD just to make customers happy could stoke the ire of stockholders, and AMD would rather keep that weapon in reserve against competitors as long as it's exclusive.
My feeling is that if people don't buy, or fewer people buy, the non-X3D parts, then the impetus to produce X3D parts will be stronger.
At the same clocks, X3D makes the non-X3D parts semi-redundant (price dependent).
It’s akin to Athlon vs Duron or Pentium vs Celeron.
I guess, per my post:
I bought my Asus ROG Crosshair Gene mATX mobo and my DDR5 and sat on them for months, only to buckle under pressure to build my system by buying a 7600X as a placeholder CPU while I waited for the 7800X3D!
Then I see the incentive to sell the rubbish, a bit like Nvidia and their 3-5 versions of the same GPU: sell the rubbish to the FOMO people, then sell them the better stuff later on!!

While making $$$$