News AMD Ryzen 7 5800X3D Review: 3D V-Cache Powers a New Gaming Champion

Apr 14, 2022
3
1
15
My counter-arguments to this are:
  • This example has too small of a sample size to be useful. I'm nitpicking here, sure, but if the upward spike was intermittent, then it doesn't matter over the long run.
    • Consider this: the average benchmark run is about 60 seconds. If the performance average is 100 FPS, that's a sample size of 6,000 frames. Even if one of those seconds ran at 200 FPS, the overall average would only increase by about 1.67 FPS (see the sketch after this post).
  • Unless there's a blip from looking at an empty skybox, most games won't suddenly shoot up in FPS. Also, I can't imagine a scenario where one CPU would have such a blip and another wouldn't.
  • Practically all benchmarks report an average, which is the number most people will use because it's right there. If you have a problem with that, then go tell benchmark developers to stop doing it.
However, I will say that the data set would be better if they added a frame time graph.
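A minimal sketch of the arithmetic in the nested bullet above, using the same assumed numbers (a 60-second run averaging 100 FPS, with one second spiking to 200 FPS):

```python
# Toy illustration of the point above: how much a single one-second spike
# moves the average FPS of a 60-second benchmark run (assumed numbers).
run_seconds = 60
baseline_fps = 100   # steady-state frame rate
spike_fps = 200      # one anomalous second

total_frames = baseline_fps * (run_seconds - 1) + spike_fps
average_fps = total_frames / run_seconds

print(f"average without spike: {baseline_fps:.2f} FPS")
print(f"average with spike:    {average_fps:.2f} FPS")  # ~101.67 FPS, i.e. about +1.67
```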


Textures don't reside in CPU cache. Also, calibrating to some arbitrary FPS and seeing what quality settings you can reach is not really a useful metric when benchmarking the processor. The goal is to see how much performance you can get out of the processor, period, not some combination of performance and image quality.

As an example, if I'm getting 100 FPS, I've identified that my CPU is limiting performance, and I want to know which CPU gets me, say, 240 FPS in a game (because I happen to own a 240 Hz monitor), then if everything is "calibrated" to 144, how do I know which CPU to get?


They're using a geometric mean for the specific purpose of lessening the effect of those outliers (see https://sciencing.com/differences-arithmetic-geometric-mean-6009565.html).
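As a quick, hypothetical illustration (the FPS numbers below are made up, not taken from the review), a single outlier pulls a geometric mean far less than an arithmetic mean:

```python
from math import prod

# Hypothetical per-game FPS results; the last entry is an outlier.
fps = [95, 102, 110, 98, 340]

arithmetic = sum(fps) / len(fps)
geometric = prod(fps) ** (1 / len(fps))

print(f"arithmetic mean: {arithmetic:.1f} FPS")  # 149.0, dragged up by the 340 outlier
print(f"geometric mean:  {geometric:.1f} FPS")   # ~128.8, far less sensitive to it
```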

The previous post was already too long and I can't quote everything, so I'll just put my points one by one here.

The only reason I put up a few suggestions on how to benchmark properly is because there was no L3 cache this large before, but since we have the 5800X3D here, we DO need to change the way we benchmark it to reflect the CPU's real performance instead of just the L3 cache's performance.



1. For the average FPS or 99th-percentile FPS.

For example, in Tom's test here:

[benchmark chart from the review]



Since the 5800X3D has a large L3 cache, its peak FPS is much higher than its 99th-percentile FPS. That's because while the working set fits in L3, FPS gains a lot, but when L3 is being refilled and cache lines miss, FPS also drops a lot. That's why the 5800X3D has a low 99th-percentile FPS and a higher average FPS. But from a player's point of view, the 99th-percentile FPS is the real one, and a peak FPS that's too high is not a good thing, because it's unstable and fluctuates too much, especially for FPS shooter gamers; they hate jitter.

2. For the game graphics settings and the calibration to 144 Hz, my explanation is here:

The game texture loading path is: disk texture file -> CPU -> RAM -> CPU -> PCIe -> GPU RAM -> CU's unified cache (L1).
Before we had the AMD 5800X3D, every CPU benchmark was testing how fast the CPU can process rendering requests from the DX API, so uncapped FPS made sense and it was a fair game.
But since the 5800X3D has the extra L3 cache, things have changed a lot. Uncapped FPS no longer tests only how fast the CPU can process rendering requests from the API; the 5800X3D also caches data along the loading path mentioned above. So if you still test at uncapped FPS, you would need to test every graphics setting instead of just one like Tom's did here, otherwise the results won't show how the CPU really performs when textures are larger (extreme graphics levels), and gamers still have no idea which CPU is best for their own settings. Even after reading Tom's benchmarks, I still don't know which CPU I should pick if I play at 4K with extreme settings on a 144 Hz 32" monitor. The reason is that FPS does not scale linearly across graphics settings: if the working set overflows the L3 cache, the 5800X3D's FPS will be slower than the 12900K's, but if the data is small and everything hits L3, the 5800X3D gains a lot.
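A rough toy model of the effect described above; the cache size, the latencies, and the linear hit-rate assumption are illustrative guesses, not measurements of the 5800X3D:

```python
# Toy model of why FPS may not scale linearly across graphics settings once the
# working set outgrows a big L3. All numbers below are illustrative assumptions.
L3_SIZE_MB = 96          # 5800X3D's total L3 capacity
L3_LATENCY_NS = 12       # assumed effective latency of an L3 hit
DRAM_LATENCY_NS = 75     # assumed latency of a miss that goes to DRAM

def avg_access_ns(working_set_mb: float) -> float:
    """Crude model: hit rate falls off once the working set exceeds the L3 size."""
    hit_rate = min(1.0, L3_SIZE_MB / working_set_mb)
    return hit_rate * L3_LATENCY_NS + (1 - hit_rate) * DRAM_LATENCY_NS

for ws in (32, 96, 192, 384):   # roughly "low" through "extreme" settings
    print(f"working set {ws:>3} MB -> average access ~{avg_access_ns(ws):.1f} ns")
```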

3. How to synthesize the multi-game result.
If Tom's test already uses an arithmetic mean, that's great, but people buy new hardware for new games. All of the tested games, as far as I know, are several years old and NONE of them are from the Steam top 10; they don't include Elden Ring, no New World, etc. Old games and new games really perform totally differently in favor of different CPUs. My story is that when I picked my CPU for New World based on Tom's testing, I went with the AMD 5600X, but after I got an Intel 11700K, I found the 11700K did 30% higher FPS than the 5600X.
So please update Tom's benchmark games; the games tested are really too old to have many players nowadays.
Another thought about the multi-game result: even if it's already an arithmetic mean, could it also add weights to the games? If, say, 1M gamers play Elden Ring but only 1K play Watch Dogs, giving these two games the same weight when computing the mean FPS is misleading to gamers.
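A minimal sketch of that weighting idea, with made-up player counts and FPS figures:

```python
# Player-count-weighted average FPS versus a plain arithmetic mean.
# Game names, player counts, and FPS figures are made-up placeholders.
results = {
    "popular_game": {"players": 1_000_000, "fps": 90},
    "niche_game":   {"players": 1_000,     "fps": 160},
}

plain_mean = sum(r["fps"] for r in results.values()) / len(results)
weighted_mean = (
    sum(r["fps"] * r["players"] for r in results.values())
    / sum(r["players"] for r in results.values())
)

print(f"plain mean:    {plain_mean:.1f} FPS")    # 125.0, half driven by the niche title
print(f"weighted mean: {weighted_mean:.1f} FPS") # ~90.1, dominated by what people actually play
```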
 
It's not on par in price with the standard 5800X, and that is a critical distinction, because the 5900X is currently selling for $71 less than the MSRP of the 5800X3D. You really need to be sure gaming at lower resolutions is all you care about if you jump for the X3D.
You're missing the context and nuance of the rest of the post. If you're looking at the 5800X3D only for games, you still won't miss much of the rest of the performance due to the lower clocks compared to the 5800X/5700X, even using PBO.

You're barking up the wrong tree, as I went with a 5900X and not the 5800X3D for my main rig*, but I do have a 5600X in my HTPC and the 5800X3D makes perfect sense there.

Also, the comparison is against the 12900K/KS because they are just faster for games, like it or not. There's a reason Intel is not making a 12600KS. Still, I think the comparison is not moot, and trying to push the 12600K or 12700K as hard as possible to see how they fare against both the 5800X3D and the 12900KS is a worthwhile exercise. One that I'm pretty sure will end up shocking people xD

Regards.
 

ConfusedCounsel

Prominent
Jun 10, 2021
91
49
560
Looking at the chip comparisons from the viewpoint of the traditional MS Office 365 PC user who wants to game after work or after completing their homework - the average home user - gaming is the only CPU-intensive workload their computers see. Accordingly, this chip makes sense for the vast majority of users. I see a larger market segment for this chip than for the 12900K. Look at it this way: the Corvette is faster than the Honda Civic, yet the Civic is one of the most commonly sold cars in America.
 

jacob249358

Commendable
Sep 8, 2021
636
215
1,290
Don't point me to entertainers like LTT, I watch people that do real reviews and rip companies apart when they make mistakes instead of lauding them, as GN and HUB do.
I suggest you watch those, if you want professional testing.
I watched all three, and they all gave similar results. I wasn't saying to watch LTT in a mean way, like "go get better information, loser" or something like that; I was just saying Anthony had some good points in the video.
 

Phaaze88

Titan
Ambassador
For those already on AM4, it's ok, but I feel that...
Content creators and pros, where Threadripper isn't a necessity or is out of budget: There's already the 5900X for cheaper.
For gaming only: When it wins, it bloody WINS. When it doesn't, most of the time it's like a slightly bigger fart over the original.
All purpose: There's the 5700X and 5600X. I guess if one just wants to throw a little more money at it, because they can - or it's on sale - there's the OG 5800X.

For those not on AM4...
Content creators and pros: There's 12th gen i7 and i9 (with its caveat).
Gaming only: 12th gen i5 or better. I imagine a few people will specifically hunt down the 5800X3D anyway.
All purpose: Put a 12400 on it, or a 12600K, if one wants some E-cores.


Being able to upgrade a CPU on the same socket is irrelevant for me, as my upgrade intervals would warrant a completely different platform anyway. I prefer an already mature platform because of this, and AM4 is there, so that's a plus for the 5800X3D.
I'm neutral towards the gaming focus of this CPU, since I fall in the all-purpose crowd.
It wins some, and loses some... bland/10.
 
It's a good chip for the price, but this is like if somebody body-slams you (Alder Lake) and then you kick them in the nuts while you're on the ground. This is just a last-ditch effort for AMD to maintain the gaming crown before moving to Zen 4. AM4 is a dead platform, so no one will be suggesting this in a new build; I think this is for upgraders. This whole 3D cache approach needs some rethinking, because the thermal limits mean it can only go so far. I suggest watching the video from LTT. It was impressive, but only in games that can take advantage of the cache. The cheaper 12700K is a better option all around.
That's true, but then you might as well go with the i5. There are better value options, but lower-cost parts will for the most part always be better value than higher-tier products. However, I agree with you and think the main point of this chip is that it's the last hurrah for AM4. It's never going to be a better option than most things on LGA1700 (Alder Lake or the upcoming Raptor Lake) or the upcoming AM5 CPUs, but if you don't want to replace your motherboard + RAM, then for existing AM4 users this is the best option (for gaming).
 
  • Like
Reactions: jacob249358

ConfusedCounsel

Prominent
Jun 10, 2021
91
49
560
I am curious if I am in the minority, but I only use my PC for office work and gaming. Office 365 doesn't take much to run. So, do the productivity benchmarks mean much to the average gamer?

Not complaining that they are there, as I know they are useful for some; I am just curious how many others only use their PC for Office 365, writing email, Zoom/Teams calls, running TurboTax every April, and gaming.
 

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
I am baffled at the stupidity of people, or the intentional denigration of AMD, take your pick, whichever is the case, about this CPU.

AMD said themselves this is for GAMERS, yet I see many people who do a lot more than just gaming (or no gaming at all) complaining about its performance in non-gaming scenarios...

Yeah, that's where we're at: people don't even know how to read, and then they complain just for the sake of complaining. Pathetic.

Compared to every other Intel CPU over $450, it has better value and even better performance (in lots of cases) as a GAMING CPU. Simple as that and nothing more.

As for new PC builds, I would not build a new PC now when in 3-4 months we have both Intel's and AMD's new generations coming. Likewise, I would not buy a "new" GPU now when next-gen GPUs are also coming so soon with 2x the performance, even if the current ones hit MSRP.

Alder Lake makes no sense for a new PC build now (unless your old one died) when next gen is so close, and the 5800X3D is best for GAMERS only, who don't want to wait for AM5 and Zen 4.
 
  • Like
Reactions: KananX
I am baffled at the stupidity of people, or the intentional denigration of AMD, take your pick, whichever is the case, about this CPU.

AMD said themselves this is for GAMERS, yet I see many people who do a lot more than just gaming (or no gaming at all) complaining about its performance in non-gaming scenarios...

Yeah, that's where we're at: people don't even know how to read, and then they complain just for the sake of complaining. Pathetic.
Yeah, and Intel tells people that they should run their CPUs at PL2 for tau seconds and then return to PL1; how stupid are the people that don't understand that?!

That's what normal people call a discussion; it doesn't matter what the company says, people are going to discuss every aspect of the CPU and not only what the company would like them to discuss.
Pathetic indeed!
 
  • Like
Reactions: KyaraM
If you refer to the reviews of the broken AMD chips from both the Xbone and the PS5, you see the unified memory isn't quite as good as could be expected due to latency.
But the broken ones don't have the iGPU, or am I wrong about that?!
The unified RAM works well on consoles because you only have to copy the data once and it's available to both the CPU and the GPU; without the GPU, the unified RAM is pretty much useless.
 
Apr 14, 2022
3
1
15
The 5800X3D is NOT the CPU for gamers, and Tom's test is absolutely misleading to readers.

After reading all the feedback on this post and re-thinking Tom's traditional testing methodology, I have to point this out.
Why is that? Because this testing is so outdated that it never reflects what gamers actually need nowadays.

What do gamers do in the 2020s?

They play MMORPGs or FPS e-sports!
They play with their teammates over Discord voice chat!
They record their games for after-match analysis or to show off on YouTube!
They stream their skills on Twitch for the whole community!
They have Chrome open to post screenshots and read game guides.

There are fewer solo gamers who only run the game on their PC anymore; they run everything alongside the game!

Why is Tom's way of testing so misleading? Because it doesn't show the reader the real performance while you actually game!

The extra 64MB of L3 cache helps a lot when you only run the game, but it helps less when you are multi-tasking and doing your real gaming things.

When you are gaming while recording, streaming on Twitch, reading guides in the browser, and watching skill videos on YouTube, the 64MB L3 cache might as well not exist, because the L3 cache just overflows all the time!

And you can't trust this testing at all, because it NEVER includes any of the scenarios above!

Using traditional benchmark methodology for the 5800X3D is absolutely wrong! The extra 64MB of L3 cache only shines in a solo gaming app, not in the real way you game!

This is like a heavily made-up girl with extra layers of face powder who makes you feel you've found Miss Right, but when you take her to the beach, to the sports court, or to the bathroom for follow-up activities, you will be DISAPPOINTED!
 

Specter0420

Distinguished
Apr 8, 2010
114
35
18,710
No DCS VR? Microsoft Flight Simulator VR wasn't even tested... Tom's has really gone downhill over the years.
Oh look, you get 343 FPS in some game, how useful...
Meanwhile, the best CPU you can currently buy and overclock is still limited by single-core performance in VR flight simulators like MSFS and DCS. We struggle to maintain 60 FPS because of high CPU frame times. Just create a mission in DCS with a bunch of moving AI units while the autopilot flies over a city; it'll be highly repeatable for accurate benchmarking. Do the same using one of the landing challenges in MSFS...
 
  • Like
Reactions: dbgk
Apr 15, 2022
1
3
15
In TechSpot's review, in 4 out of the 8 games (Cyberpunk 2077, Watch Dogs: Legion, Riftbreaker, Hitman), the new Ryzen chip does get beaten handily by Alder Lake with fast DDR5...

This is not to denounce its performance - it can still claim the best performance per watt, per dollar, and on DDR4 - but the pure performance crown it hasn't got.

It was also disappointing to see bigger regressions than expected in the productivity suite, TBH. I was looking at what to replace my 3700X with, but with the recent price drops, the 5950X seems more enticing/well-rounded - especially if you don't have a 3090 Ti hooked up to a 1080p screen.
 
Last edited:

logainofhades

Titan
Moderator
The 5800X3D is NOT the CPU for gamers, and Tom's test is absolutely misleading to readers.

After reading all the feedback on this post and re-thinking Tom's traditional testing methodology, I have to point this out.
Why is that? Because this testing is so outdated that it never reflects what gamers actually need nowadays.

What do gamers do in the 2020s?

They play MMORPGs or FPS e-sports!
They play with their teammates over Discord voice chat!
They record their games for after-match analysis or to show off on YouTube!
They stream their skills on Twitch for the whole community!
They have Chrome open to post screenshots and read game guides.

There are fewer solo gamers who only run the game on their PC anymore; they run everything alongside the game!

Why is Tom's way of testing so misleading? Because it doesn't show the reader the real performance while you actually game!

The extra 64MB of L3 cache helps a lot when you only run the game, but it helps less when you are multi-tasking and doing your real gaming things.

When you are gaming while recording, streaming on Twitch, reading guides in the browser, and watching skill videos on YouTube, the 64MB L3 cache might as well not exist, because the L3 cache just overflows all the time!

And you can't trust this testing at all, because it NEVER includes any of the scenarios above!

Using traditional benchmark methodology for the 5800X3D is absolutely wrong! The extra 64MB of L3 cache only shines in a solo gaming app, not in the real way you game!

This is like a heavily made-up girl with extra layers of face powder who makes you feel you've found Miss Right, but when you take her to the beach, to the sports court, or to the bathroom for follow-up activities, you will be DISAPPOINTED!

MMORPGs and FPS e-sports tend to be poorly threaded. There is something wrong with your system if it cannot handle such games, with an 8c/16t chip, while doing other tasks at the same time. I play WoW, and sometimes FFXIV, with a regular 5800X, and experience zero system-related slowdowns. I run Discord, as I am in a guild; I play YouTube videos or other streaming platforms; and I often hit up WoWhead and Raidbots for information. My only issues are game-engine related, due to WoW being so old and not the greatest at multithreading.
 
  • Like
Reactions: King_V
MMORPGs and FPS e-sports tend to be poorly threaded. There is something wrong with your system if it cannot handle such games, with an 8c/16t chip, while doing other tasks at the same time. I play WoW, and sometimes FFXIV, with a regular 5800X, and experience zero system-related slowdowns. I run Discord, as I am in a guild; I play YouTube videos or other streaming platforms; and I often hit up WoWhead and Raidbots for information. My only issues are game-engine related, due to WoW being so old and not the greatest at multithreading.
What he's asking is valid. It's not about the cores being able to handle it; it's whether the performance benefit the cache provides stays the same when multiple programs are hitting the cache at the same time, and especially when they exceed the amount of available cache, forcing the data in it to be swapped in and out.
If you have to play games alone with nothing else running, it loses much of the value proposition.
 

logainofhades

Titan
Moderator
You do not have to play such titles with nothing else running. Said titles are not that demanding with regard to core utilization. Another friend of mine does the same things I do on an R5 3600, with zero complaints. Said titles are not going to all of a sudden run worse, and force you to play games alone, just because said CPU has more cache.
 
You do not have to play such titles with nothing else running. Said titles are not that demanding with regard to core utilization. Another friend of mine does the same things I do on an R5 3600, with zero complaints. Said titles are not going to all of a sudden run worse, and force you to play games alone, just because said CPU has more cache.
It's the other way around: when the cache only has to cater to the game, you get a lot of performance; if the cache gets split between multiple things, there might not be enough of it left for the game to get the same level of performance, or there might be too much data being swapped around for it to provide the same performance.
 
  • Like
Reactions: KyaraM

emitfudd

Distinguished
Apr 9, 2017
571
82
18,990
Why didn't they use this technology on the 5900X? It kinda makes me mad that I just built my computer at the end of 2021. A lot of people steered me towards Ryzen while I was contemplating Intel. I bought a 5900X, which was the gaming king, and then came the Alder Lake chips. Now the 5800X3D is better than the 5900X. I wouldn't mind a new build being obsolete in a few years, but not in a few months.
 

jacob249358

Commendable
Sep 8, 2021
636
215
1,290
Why didn't they use this technology on the 5900X? It kinda makes me mad that I just built my computer at the end of 2021. A lot of people steered me towards Ryzen while I was contemplating Intel. I bought a 5900X, which was the gaming king, and then came the Alder Lake chips. Now the 5800X3D is better than the 5900X. I wouldn't mind a new build being obsolete in a few years, but not in a few months.
The 5900X is nowhere near obsolete. It's still a total beast and will be great for a few years. Unless you are at 1080p or playing MSFS, the 5900X won't be your system's limit.
 
  • Like
Reactions: KyaraM

logainofhades

Titan
Moderator
Why didn't they use this technology on the 5900X? It kinda makes me mad that I just built my computer at the end of 2021. A lot of people steered me towards Ryzen while I was contemplating Intel. I bought a 5900X, which was the gaming king, and then came the Alder Lake chips. Now the 5800X3D is better than the 5900X. I wouldn't mind a new build being obsolete in a few years, but not in a few months.

Just because something new and fancy comes out, doesn't make your system obsolete.
 
I think folks are getting hung up on the details without seeing the bigger picture. This is a very interesting use of a newer technology: stacking ridiculously high-speed memory directly onto a CPU die as a way to expand its capacity. I can see this starting to be used in other applications where adding a large chunk of low-latency LLC would result in big performance benefits. GPUs come to mind, as those are heavily limited by memory bandwidth, and having a large buffer could alleviate bottlenecks.

As for the CPU, it's a cheaper gaming CPU that also works well for general workloads. No normal person is counting the seconds to open an Excel file or send an email; most desktop application benchmarks are almost as bad as synthetics in this regard: good for highlighting a strength or weakness in a design, terrible at communicating day-to-day experience. I can see this type of technology becoming more common in future generations of processors.
 
Another interesting CPU tech.

Yet unless the AM4 user is rocking an RTX 3090/Ti (or a similar-tier Radeon RX GPU), I don't really see the point in spending money on such a product, just like the 12900K and KS for gaming.

The 12700K should be plenty for most people.

But hey, that's my view; if you have the money and wanna have the newest thing, go ahead!

(That extra cache is probably running pretty hot; too bad there's no easy way to measure it, right?)
 

pdegan2814

Distinguished
May 29, 2014
20
17
18,515
That was kind of underwhelming: some decent gains in gaming, almost no difference in anything else. At least the 2ns (20%) worse average cache latency isn't hurting anything particularly badly.

Underwhelming because it didn't show performance improvements in the things it wasn't trying to improve performance in? That's an odd takeaway.
 

JamesJones44

Reputable
Jan 22, 2021
856
790
5,760
If you refer to the reviews of the broken AMD chips from both the Xbone and the PS5, you see the unified memory isn't quite as good as could be expected due to latency.

Latency is always an issue with larger memory structures. This is no different from V-Cache, which also has higher latency, but for games/FPS that usually isn't a significant enough hit to outweigh the advantage of being able to hold more objects in memory or of removing the need to transfer them over an external bus.