AMD Ryzen 7 9800X3D Review: Devastating Gaming Performance

Page 7

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
So there are some games that are an exception to the rule, like this game, Cyberpunk, Spider-Man, et cetera. That's interesting, though not as relevant to me at 4K 240Hz. But we shall see, when I get a 5090, whether I become CPU limited with my 5800X3D in some games.
Exception in what way? I think all games with RT behave similarly (not to the same extent, obviously); RT makes the game heavier for the CPU. Most people don't notice, since it's also much heavier for the GPU, which is what will hit a bottleneck first in most cases, unless you have a 4090 running at low resolution.

But for example, below is Cyberpunk in the heaviest area of the game.
View: https://www.youtube.com/watch?v=n_iIeNtN6yY


Without mega-tuning the RAM I am nowhere near 100 fps in this area. The above run is with tight, manually tuned 7200C32 RAM. If you have the game you can try that area with your 5800X3D; with PT (path tracing) on, it will be very heavy.

Keep in mind reviewers have every CPU hitting 150 to 200 fps in this game. That's because they don't use RT to stretch the CPU, and they are testing relatively light areas. That's also why the 3D chips appear 50% faster than everything else. According to HUB, the 9800X3D is 45% faster in this game compared to the 12900K. Well, I'll test it next week, we shall see :cool:

EDIT 1: Tom's Hardware is an exception; they are testing a heavy area (I don't know which one), but their framerates are low, so...
 
Last edited:
  • Like
Reactions: helper800

Moobear

Reputable
Dec 9, 2019
11
6
4,515
I think a 30% FPS increase for team red over blue would make it so that you'd need some serious productivity reasons to even consider Intel at this point. Furthermore, the 9950X exists... and that will outperform Intel in gaming as well (not by nearly as much, but the productivity will be on par).
Agreed. I got the 9800X3D a few days ago and it's pretty good for productivity. I came from the 7800X3D, but yeah, it feels snappy, and for regular/non-gaming use it's a step up also. It's also much cooler, at least mine is (maybe good silicon?).
 

Ogotai

Reputable
Feb 2, 2021
391
247
5,060
RT makes the game heavier for the CPU
Do you have any proof of this claim? I have called you out on this earlier in this thread, and you pretty much just dismissed me. So either post proof, or this is just false info meant to mislead others...

That's also why the 3D chips appear 50% faster than everything else.
And yet you are claiming to get one of these, even though all of your posts have praised Intel across the board, saying AMD can't and won't touch them on power, efficiency, and performance... Yeah, OK... It seems strange that you would praise Intel across the board, only to go against those praises and buy AMD...
 
Do you have any proof of this claim? I have called you out on this earlier in this thread, and you pretty much just dismissed me. So either post proof, or this is just false info meant to mislead others...
No, you didn't understand what was being said, which is why you were dismissed. In some games the RT implementations require more CPU power than straight raster does. You cannot see this difference if you're GPU bound to the point that the CPU is waiting on the GPU.
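
To make that concrete, here's a rough sketch of the kind of per-frame CPU-side work an RT path adds. This is my own illustration assuming a DXR-style API (the names are made up), not any engine's actual code:

C++:
#include <cstdint>
#include <vector>

struct Mat3x4 { float m[3][4]; };

struct SceneObject {
    Mat3x4        worldTransform;  // updated by gameplay/animation every frame
    std::uint32_t blasIndex;       // which per-mesh bottom-level BVH this instance uses
};

// Stand-in for a descriptor like D3D12_RAYTRACING_INSTANCE_DESC
struct InstanceDesc {
    Mat3x4        transform;
    std::uint32_t instanceId;
    std::uint32_t blasIndex;
};

// CPU-side, every frame, O(instances). None of this work exists with RT off,
// which is part of why enabling RT raises CPU load even though the BVH build
// and traversal themselves run on the GPU.
std::vector<InstanceDesc> gatherTlasInstances(const std::vector<SceneObject>& scene) {
    std::vector<InstanceDesc> descs;
    descs.reserve(scene.size());
    for (std::size_t i = 0; i < scene.size(); ++i)
        descs.push_back({scene[i].worldTransform,
                         static_cast<std::uint32_t>(i),
                         scene[i].blasIndex});
    return descs;  // uploaded so the GPU can rebuild/refit the top-level structure
}

The more dynamic objects a scene has (crowds, physics debris), the more of this the CPU chews through per frame.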
 

Ogotai

Reputable
Feb 2, 2021
391
247
5,060
No, you didn't understand what was being said, which is why you were dismissed. In some games the RT implementations require more CPU power
The way he worded it was that RT uses CPU cores to render... hence why I asked. And to be frank, it still sounds like that is what he is saying, hence why I ask for a source for his claim, unless you can provide one?

I haven't read or heard anywhere that RT uses any CPU power; it's all done on the GPU, which is why if you don't have a GPU that can do RT, you can't use RT...
 
The way he worded it was that RT uses CPU cores to render... hence why I asked. And to be frank, it still sounds like that is what he is saying, hence why I ask for a source for his claim, unless you can provide one?

I haven't read or heard anywhere that RT uses any CPU power; it's all done on the GPU, which is why if you don't have a GPU that can do RT, you can't use RT...
No, it was worded completely accurately; you just leapt to your own conclusion. Dragon Age: The Veilguard is the title in question, and the evidence was already shown. There are very few places that do CPU benchmarks for new titles, so I'm unaware of any in English. TPU's reviewer mentioned the game being CPU heavy, but they only did a GPU test and used a 14900K.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Do you have any proof of this claim? I have called you out on this earlier in this thread, and you pretty much just dismissed me. So either post proof, or this is just false info meant to mislead others...
Are you okay? I've already posted plenty. You just don't get it; I dismissed you because I feel like I'm talking to a brick wall.

And yet you are claiming to get one of these, even though all of your posts have praised Intel across the board, saying AMD can't and won't touch them on power, efficiency, and performance... Yeah, OK... It seems strange that you would praise Intel across the board, only to go against those praises and buy AMD...

I didn't expect you to be interested in what I'm buying, but hey, thanks I guess. I buy stuff I find interesting just to test it. I'm benching. I've bought almost every Ryzen generation bar Zen 3, and every Intel generation starting from Coffee Lake (8700, 10900K, 11600K, 12900K, 13900K, 14900K). I just like hardware; what's the issue?
 
Last edited:

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
And I am sure some on here feel the exact same about you...

The fact that, according to you, AMD isn't worth the money, to test or not...
Bud, you can take your fanboying somewhere else, my man; I'm not interested. You feel personally attacked when I say something bad about an AMD CPU. Chill, I'm not insulting you.

Did I say anything bad about the 9800X3D? It looks great according to reviews, so I got it to test.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
“BVH and BSP have been around for years, why is it suddenly so compute intensive?”

Any programmers care to explain why? Both are supposed to be efficient at accelerating scene layout and visibility queries (as I understand BSP), and they are related.
So I did some testing. Not that it stops the game from being CPU bound, but Windows 10 runs the game much better on Intel. I tried clean installs of Win 11 22H2 through 24H2, and Win 10. Look at this; just the first 5 seconds are enough to see the difference. Something broke on Intel with Win 11.

Win 10


Win 11

 
  • Like
Reactions: helper800

Elusive Ruse

Estimable
Nov 17, 2022
450
582
3,220
Do you have any proof of this claim? I have called you out on this earlier in this thread, and you pretty much just dismissed me. So either post proof, or this is just false info meant to mislead others...


And yet you are claiming to get one of these, even though all of your posts have praised Intel across the board, saying AMD can't and won't touch them on power, efficiency, and performance... Yeah, OK... It seems strange that you would praise Intel across the board, only to go against those praises and buy AMD...
You have been baited by the resident Intel apologists into hijacking a thread about the 9800X3D blowing every Intel CPU out of the water to discuss an off-topic subject that distracts from this "devastating" loss.

Let me put us back on topic a tad. Here are the results of a game with a proper RT implementation, where TPU managed to showcase a scenario in which the GPU is not such a bottleneck, and voila, the 9800X3D trumps all as expected.

[Chart: cyberpunk-2077-rt-1280-720.png (TPU's Cyberpunk 2077 RT results at 1280x720)]
 
  • Like
Reactions: YSCCC
Mar 10, 2020
414
376
5,070
So I did some testing. Not that it stops the game from being CPU bound, but Windows 10 runs the game much better on Intel. I tried clean installs of Win 11 22H2 through 24H2, and Win 10. Look at this; just the first 5 seconds are enough to see the difference. Something broke on Intel with Win 11.

Win 10


Win 11

That's fine, but it doesn't explain why a method for minimising complexity is loading the CPU to the point where it's tanking.

BVH is a space-division method for describing a 3D scene; it and BSP have been around for many years, BSP probably most famously for its use in Quake. BVH improves on BSP by reducing the memory footprint required for complex scenes where objects overlap.

BVH scales depending on the quality required. For a dynamic scene, build a lower-quality BVH; for a more static scene, use a higher-quality one. The former will be discarded more often; the latter will stay relevant for longer.
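
For a concrete picture (a toy sketch of my own, not real engine code): even the cheap maintenance path, refitting an existing BVH after objects move, walks every node every frame, so "optimised for ray queries" does not mean free on the CPU.

C++:
#include <algorithm>
#include <vector>

struct Aabb { float min[3], max[3]; };

struct BvhNode {
    Aabb aabb;
    int  left = -1, right = -1;  // child indices; -1 marks a leaf
    int  primitive = -1;         // leaf only: index into the scene's primitives
};

// Recompute bounds bottom-up after objects move: O(node count), every frame,
// for every dynamic BVH. A full rebuild (the higher-quality option) costs far
// more, which is the dynamic-vs-static trade-off described above.
void refit(std::vector<BvhNode>& nodes, int idx, const std::vector<Aabb>& primBounds) {
    BvhNode& n = nodes[idx];
    if (n.left < 0) { n.aabb = primBounds[n.primitive]; return; }
    refit(nodes, n.left, primBounds);
    refit(nodes, n.right, primBounds);
    for (int a = 0; a < 3; ++a) {
        n.aabb.min[a] = std::min(nodes[n.left].aabb.min[a], nodes[n.right].aabb.min[a]);
        n.aabb.max[a] = std::max(nodes[n.left].aabb.max[a], nodes[n.right].aabb.max[a]);
    }
}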

My question, “why does a method for optimising 3D model space make the CPU tank?”, still stands.

There is also a move afoot to shift BVH generation to the GPU, reducing the CPU load. Can any software people explain the details (preferably in plain English, high-level, understandable terms)?
 
Look at my game using software renderers (Quake 2 and 1, for example)! Intel with the higher core count beats the 8-core in embarrassingly parallel workloads! Shocker.

Also, CryEngine 3 and UE's software Lumen do put extra load on the CPU, but you have to ask yourself: do we really want that to be the case? See how that turned out for CryEngine 3. Lumen mostly uses SM5.0 to run the "non-accelerated" part, but it still puts a bit of extra load on the CPU, IIRC.

Regards.
 

Ogotai

Reputable
Feb 2, 2021
391
247
5,060
you can take your fanboying somewhere else, my man
Sorry, but you first, with your Intel fanboying. Look at all your posts claiming Intel has the better efficiency compared to AMD, when others show you are wrong.

You feel personally attacked when I say something bad about an AMD CPU
No, you didn't say anything bad about an AMD CPU; you said it to me:
I'm talking to a brick wall.
Did I say anything bad about the 9800X3D?
No, but considering how much you have praised Intel up till now, it's... surprising you aren't bashing it in some way...
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
your posts claiming Intel has the better efficiency compared to AMD, when others show you are wrong.
OK, I'm wrong, AMD doesn't have better efficiency. Got me.

No, but considering how much you have praised Intel up till now, it's... surprising you aren't bashing it in some way...
My bad; I tend to praise good products regardless of the brand. I apologize.
 

Ogotai

Reputable
Feb 2, 2021
391
247
5,060
I'm wrong, AMD doesn't have better efficiency. Got me.
Actually, from what others have posted, which shows you are wrong, AMD does have the better efficiency; Intel doesn't.

I tend to praise good products regardless of the brand.
In this case, and going from your praise of Intel in all your other posts, I'm sure you will find something to bash the X3D chips with after you jump off the X3D bandwagon 🤣🤣🤣🤣🤣
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Actually, from what others have posted, which shows you are wrong, AMD does have the better efficiency; Intel doesn't.


In this case, and going from your praise of Intel in all your other posts, I'm sure you will find something to bash the X3D chips with after you jump off the X3D bandwagon 🤣🤣🤣🤣🤣
Ok bud
 

ilukey77

Reputable
Jan 30, 2021
808
332
5,290
As I've said in a different thread, the 3D chips are way ahead of everything else in lighter scenes with non-CPU-demanding settings. Once the ***t hits the fan, they faceplant straight into a wall.
Still based on cache sensitivity.
What made the 13900K, and to an extent the 14900K, the king (excluding production) was that in non-cache-sensitive games it beat the 7800X3D.

The downside now (and we need Intel) is that a lot of games are leveraging cache, so the X3D CPUs are crushing it.

Intel needs 3D cache, and fast, before AMD becomes the next Nvidia!!
(The current price hike of the 9800X3D is already looking like that.)
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Still based on cache sensitivity.
What made the 13900K, and to an extent the 14900K, the king (excluding production) was that in non-cache-sensitive games it beat the 7800X3D.

The downside now (and we need Intel) is that a lot of games are leveraging cache, so the X3D CPUs are crushing it.

Intel needs 3D cache, and fast, before AMD becomes the next Nvidia!!
(The current price hike of the 9800X3D is already looking like that.)
It's not about a game being sensitive to cache; all games are sensitive to cache. I think it's about the cache miss rate. Intel has a better memory subsystem, so in games where the cache misses a lot, Intel can go out to memory faster. This is also why tuning memory on Intel makes those chips super fast.

3D chips don't care about slow memory as much, but they also don't scale as well with fast memory.
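
To illustrate what I mean by miss rate, here's a toy pointer-chase (not a real game workload, just the memory behaviour): every load depends on the previous one, so once the working set spills past L3 you're running at DRAM latency. A big 96 MB X3D cache keeps more of the chain on-die; tuned memory shrinks the penalty when it spills.

C++:
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const std::size_t n = std::size_t{1} << 25;  // 32M entries * 4 B = 128 MB, bigger than any L3
    std::vector<std::uint32_t> next(n);
    std::iota(next.begin(), next.end(), 0u);

    // Sattolo's algorithm: one big cycle, so the walk really visits every slot
    std::mt19937_64 rng{42};
    for (std::size_t k = n - 1; k > 0; --k) {
        std::uniform_int_distribution<std::size_t> pick(0, k - 1);
        std::swap(next[k], next[pick(rng)]);
    }

    std::uint32_t i = 0;
    for (std::size_t step = 0; step < n; ++step)
        i = next[i];                  // each load depends on the last: pure latency
    std::printf("%u\n", i);           // print so the chase isn't optimized away
}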
 
Nov 11, 2024
1
0
10
AMD’s $480 Ryzen 7 9800X3D comes armed with eight cores and 16 threads paired with a new version of the company’s game-boosting 3D V-Cache tech that delivers impressive performance, taking the throne as the fastest gaming CPU on the market.

AMD Ryzen 7 9800X3D Review: Devastating Gaming Performance : Read more
This is probably one of the most ignorant fluff articles around. Oh, yeah. Cores and clock speeds. It completely ignores all the extra PCIe lanes needed to get the data from an M.2 drive loaded and to where it can be used... So much was missed here...
 
This is probably one of the most ignorant fluff articles around. Oh, yeah. Cores and clock speeds. It completely ignores all the extra PCIe lanes needed to get the data from an M.2 drive loaded and to where it can be used... So much was missed here...
You do realize most AMD motherboards already support PCIe 5.0 speeds on their NVMe drives, right? That is the same as Intel. More or fewer lanes have nothing to do with it, unless you're talking about Z890 having more lanes in general? In which case, that's a platform evaluation and not a CPU one.

For the average consumer, the fact that Z890 has more I/O capability won't be of practical, real-world use outside of Thunderbolt.

Regards.
 
  • Like
Reactions: Lucky_SLS
Exception in what way? I think all games with RT behave similarly (not to the same extent, obviously); RT makes the game heavier for the CPU. Most people don't notice, since it's also much heavier for the GPU, which is what will hit a bottleneck first in most cases, unless you have a 4090 running at low resolution.

But for example, below is Cyberpunk in the heaviest area of the game.
View: https://www.youtube.com/watch?v=n_iIeNtN6yY


Without mega-tuning the RAM I am nowhere near 100 fps in this area. The above run is with tight, manually tuned 7200C32 RAM. If you have the game you can try that area with your 5800X3D; with PT (path tracing) on, it will be very heavy.

Keep in mind reviewers have every CPU hitting 150 to 200 fps in this game. That's because they don't use RT to stretch the CPU, and they are testing relatively light areas. That's also why the 3D chips appear 50% faster than everything else. According to HUB, the 9800X3D is 45% faster in this game compared to the 12900K. Well, I'll test it next week, we shall see :cool:

EDIT 1: Tom's Hardware is an exception; they are testing a heavy area (I don't know which one), but their framerates are low, so...
Most places use the built-in benchmark for CP77, because it's easier, and the FPS is definitely higher — more GPU limited, less CPU limited. We are using a manual run through a portion of Night City right outside one of your apartments, the number 10 building I think? And we run across a crowded street toward Tom's Diner (no relation... LOL)

But it is still very much a GPU limited game, just like Dragon Age. What you're showing proves that DAVG's performance with RT drops a ton, and that the gap between various CPUs similarly narrows a lot. That is the opposite of a CPU limited game. It is in fact the very definition of a GPU limited game. Why is your CPU maxed out, then? Probably because the programming behind DAVG with RT is less than optimal, would be my guess.

FWIW, I'm seeing 84% GPU utilization on a 4090 at 1080p medium (no upscaling), 89% at 1080p ultra (without the "selective" RT effects), and 98% at 1440p ultra. The first of those indicates at least a modest CPU bottleneck — and not surprisingly, I see the 4090 and 4080 Super only separated by 5%. At 1080p ultra, the separation between those two GPUs grows to 10%, so still very CPU limited. 1440p ultra, there's a 19% gap, and 4K ultra gives a 25% gap. 4K ultra with RT maxed out, the 4090 ends up 28% faster. That's basically GPU limited. The 4080 Super is 92% utilized at 1080p medium, and 97% at 1080p ultra, with 99% at everything above that.

To prove DAVG is not GPU limited, as others noted, would require testing with a slower GPU like a 4080 and showing similar numbers to the 4090, across a variety of CPUs.
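
If it helps, the logic boils down to frame time being roughly max(CPU time, GPU time). A toy model with made-up numbers shows why the gap between two GPUs of different speeds collapses under a CPU limit and reappears under a GPU limit:

C++:
#include <algorithm>
#include <cstdio>

// Frame time is set by whichever side finishes last
double fps(double cpuMs, double gpuMs) { return 1000.0 / std::max(cpuMs, gpuMs); }

int main() {
    const double cpuMs = 8.0;                       // hypothetical CPU cost per frame
    const double fastGpuMs = 7.0, slowGpuMs = 8.5;  // faster GPU ~21% quicker

    // Light settings: the 21% faster GPU shows only a ~6% FPS lead (CPU limited)
    std::printf("light: %.0f vs %.0f fps\n", fps(cpuMs, fastGpuMs), fps(cpuMs, slowGpuMs));

    // 3x the GPU load (4K/RT): the full ~21% gap reappears (GPU limited)
    std::printf("heavy: %.0f vs %.0f fps\n", fps(cpuMs, 3 * fastGpuMs), fps(cpuMs, 3 * slowGpuMs));
}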

Incidentally, one of the things that can make RT games CPU heavy (heavier at least) is if they do more complex BVH builds on the CPU using a single thread (or maybe two?), which is what Diablo IV does. Diablo IV has a terrible RT implementation, though, and no one should look to it as an example of how to do RT properly.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Most places use the built-in benchmark for CP77, because it's easier, and the FPS is definitely higher — more GPU limited, less CPU limited. We are using a manual run through a portion of Night City right outside one of your apartments, the number 10 building I think? And we run across a crowded street toward Tom's Diner (no relation... LOL)

But it is still very much a GPU limited game, just like Dragon Age. What you're showing proves that DAVG's performance with RT drops a ton, and that the gap between various CPUs similarly narrows a lot. That is the opposite of a CPU limited game. It is in fact the very definition of a GPU limited game. Why is your CPU maxed out, then? Probably because the programming behind DAVG with RT is less than optimal, would be my guess.

FWIW, I'm seeing 84% GPU utilization on a 4090 at 1080p medium (no upscaling), 89% at 1080p ultra (without the "selective" RT effects), and 98% at 1440p ultra. The first of those indicates at least a modest CPU bottleneck — and not surprisingly, I see the 4090 and 4080 Super only separated by 5%. At 1080p ultra, the separation between those two GPUs grows to 10%, so still very CPU limited. 1440p ultra, there's a 19% gap, and 4K ultra gives a 25% gap. 4K ultra with RT maxed out, the 4090 ends up 28% faster. That's basically GPU limited. The 4080 Super is 92% utilized at 1080p medium, and 97% at 1080p ultra, with 99% at everything above that.

To prove DAVG is not GPU limited, as others noted, would require testing with a slower GPU like a 4080 and showing similar numbers to the 4090, across a variety of CPUs.

Incidentally, one of the things that can make RT games CPU heavy (heavier at least) is if they do more complex BVH builds on the CPU using a single thread (or maybe two?), which is what Diablo IV does. Diablo IV has a terrible RT implementation, though, and no one should look to it as an example of how to do RT properly.
I already said that you, unlike other reviewers, are not using super light scenes in Cyberpunk.

I don't think the gap between CPUs narrows, at least not percentage-wise. I'll know next week when I receive my 9800X3D; I expect it to be a good 15-20% faster than the 12900K in that area of Cyberpunk. In Veilguard, it won't surprise me if it's actually slower 😁
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Most places use the built-in benchmark for CP77, because it's easier, and the FPS is definitely higher — more GPU limited, less CPU limited. We are using a manual run through a portion of Night City right outside one of your apartments, the number 10 building I think? And we run across a crowded street toward Tom's Diner (no relation... LOL)
[...]
Completely unrelated (somewhat, I guess), but it would be nice if you could share the save files from your tests, like PCGH does. Testing the same area isn't the same thing: your save file may behave differently (different time of day, NPCs, etc.). It's just a suggestion for future reviews.
 
  • Like
Reactions: helper800