News AMD Ryzen 9 7950X3D vs Intel Core i9-13900K Faceoff

Don't trust limited tests for these.

Look for reviews and review videos covering the specific things relevant to your needs.


When the cache matters, it is very good.
When it doesn't, though, the difference is negligible.


Personally, I would use a 3D cache chip, because for me it's as simple as losing maybe 15% when the cache doesn't matter, but gaining up to 25% when it does.

You have more to gain than to lose on average.
 

aberkae

Distinguished
Oct 30, 2009
99
28
18,560
Paper launch vs. a product you can actually purchase, though. It's been 18 days post-launch and no restock since the initial batch. Update: my local Micro Center still has 18 of the 24 launch units of the 7900X3D in stock, FYI.
 
  • Like
Reactions: rtoaht

rluker5

Distinguished
Jun 23, 2014
598
366
19,260
There is still the issue of XMP vs. tuned memory. Because the X3D is less sensitive to low-latency memory, tuning both would favor Intel. Not as much as game selection, or maybe disabling core isolation (an optional security feature, so you can't really argue against leaving it enabled), and likely not enough to have the 13900K beat the X3D in cache-sensitive games, but it does seem on par with all of the Intel overclocking done in this review combined and is likely additive to its results.
Here's a video that just came out on that: RAM/Memory Tuning & Scaling: Intel 13th gen Core Series [Raptor Lake] - YouTube
The video clearly uses Hynix ICs, but the tuning seems reasonable for sticks that use Hynix.
 

atomicWAR

Glorious
Ambassador
There is still the issue of XMP vs. tuned memory. Because the X3D is less sensitive to low-latency memory, tuning both would favor Intel. Not as much as game selection, or maybe disabling core isolation (an optional security feature, so you can't really argue against leaving it enabled), and likely not enough to have the 13900K beat the X3D in cache-sensitive games, but it does seem on par with all of the Intel overclocking done in this review combined and is likely additive to its results.
Here's a video that just came out on that: RAM/Memory Tuning & Scaling: Intel 13th gen Core Series [Raptor Lake] - YouTube
The video clearly uses Hynix ICs, but the tuning seems reasonable for sticks that use Hynix.

The 7950X3D actually scales decently with RAM speeds and latencies, depending on the title. It's not as steep an improvement as on the non-X3D chips, but it's still noticeable and arguably worthwhile. Some games, like Horizon Zero Dawn, are mostly flat (i.e., very little to be gained); in others, like Watch Dogs: Legion, the gains are pretty linear (the 7700X gains 33 fps where the 7950X3D gained 23 fps going from DDR5-4800 CL40 to DDR5-6000 CL30).

https://www.techspot.com/review/2635-ryzen-7950x3d-memory-scaling/#2023-02-28-image

But much like the V-Cache, your mileage may vary by title. If it were me, I'd still pair DDR5-6000 CL30 RAM with tight subtimings with an X3D chip. Yes, the overall speed increase across their test suite is only 4%, and Intel's increase is only 3%, and neither is as starved as the 7700X at 17%, but if you're buying a high-end platform, why cripple it, even if only by a few percent?
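
As a rough sanity check on those numbers, here is a minimal sketch in Python; the fps deltas are the ones quoted above, while the baseline fps values are hypothetical placeholders just to show the math:

Code:
# Rough memory-scaling check using the Watch Dogs: Legion deltas quoted above
# (DDR5-4800 CL40 -> DDR5-6000 CL30). The baselines are made-up placeholders.
def pct_gain(base_fps: float, delta_fps: float) -> float:
    """Percentage uplift implied by a raw fps delta over a baseline."""
    return 100.0 * delta_fps / base_fps

print(f"7700X:   +{pct_gain(120.0, 33.0):.1f}%  (+33 fps on a hypothetical 120 fps base)")
print(f"7950X3D: +{pct_gain(140.0, 23.0):.1f}%  (+23 fps on a hypothetical 140 fps base)")

The point being that a smaller raw delta on top of a higher baseline shrinks the percentage gain twice over, which is why the X3D looks flatter in scaling charts.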
 
Last edited:
  • Like
Reactions: rluker5

rluker5

Distinguished
Jun 23, 2014
598
366
19,260
The 7950X3D actually scales decently with RAM speeds and latencies, depending on the title. It's not as steep an improvement as on the non-X3D chips, but it's still noticeable and arguably worthwhile. Some games, like Horizon Zero Dawn, are mostly flat (i.e., very little to be gained); in others, like Watch Dogs: Legion, the gains are pretty linear (the 7700X gains 33 fps where the 7950X3D gained 23 fps going from DDR5-4800 CL40 to DDR5-6000 CL30).

https://www.techspot.com/review/2635-ryzen-7950x3d-memory-scaling/#2023-02-28-image

But much like the V-Cache, your mileage may vary by title. If it were me, I'd still pair DDR5-6000 CL30 RAM with tight subtimings with an X3D chip. Yes, the overall speed increase across their test suite is only 4%, and Intel's increase is only 3%, and neither is as starved as the 7700X at 17%, but if you're buying a high-end platform, why cripple it, even if only by a few percent?
Just going by the average frame rates from the 7-game average at 1080p in that article, I saw a 12% increase for Intel vs. a bit below 7% for the X3D. Mind you, this was also with faster RAM for Intel, but that is a reality in today's market: Intel can and does use faster RAM. And this still doesn't include tuned timings like I mentioned.
So that article supports my point. I never said the X3D would see no improvement, just less.
 
  • Like
Reactions: atomicWAR

Endymio

Reputable
BANNED
Aug 3, 2020
725
264
5,270
Below you can see the geometric mean of our gaming tests ...
Why a geometric mean? Your base values are frame rates, no? Generally speaking:

Normal data: arithmetic mean.
Rates: harmonic mean.
Ratios/percentages/scaled values: geometric mean.
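
To make the difference concrete, here is a small sketch using Python's statistics module; the per-game fps values are made up purely for illustration:

Code:
import statistics

# Hypothetical per-game average frame rates for one CPU (illustration only).
fps = [299, 142, 87, 210, 63]

print(f"arithmetic mean: {statistics.mean(fps):.1f} fps")
print(f"geometric mean:  {statistics.geometric_mean(fps):.1f} fps")
print(f"harmonic mean:   {statistics.harmonic_mean(fps):.1f} fps")

# Frame rate is a rate, so the harmonic mean weights each game by time per
# frame, while the geometric mean is the conventional choice when averaging
# relative performance (ratios) across a suite of games.

The arithmetic mean lets a single 300-fps outlier dominate, the harmonic mean corresponds to total frames over total time, and the geometric mean is what you implicitly use when averaging speedup ratios, which is presumably why reviews report it.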
 

atomicWAR

Glorious
Ambassador
Just going by the average frame rates from the 7-game average at 1080p in that article, I saw a 12% increase for Intel vs. a bit below 7% for the X3D. Mind you, this was also with faster RAM for Intel, but that is a reality in today's market: Intel can and does use faster RAM. And this still doesn't include tuned timings like I mentioned.
So that article supports my point. I never said the X3D would see no improvement, just less.

It supports both our points, but yes, the improvement is indeed smaller, no question; you are certainly not wrong there. It's just that the performance bump is big enough that I wouldn't ignore it on the high end. Unless you're running a full 128 GB of RAM and are forced into slower speeds by the current IMC limitations on Ryzen 7000, it just seems unwise to leave that extra performance on the table when building a high-end rig. I figure if you're spending $699 on a CPU, you'll be spending a grand or more on a GPU, and a fair amount on RAM as well. So why cripple yourself?

Now, with a 7800X3D (when it launches), where budgets become more realistic, then by all means cheap out on the RAM and put that money toward more GPU power. Bang for your buck will be much better, so that would make the most sense if going AMD on a budget. 7950X3D buyers' budgets are likely to be more flexible, and when purchasing RAM I see little reason not to get a decent CL30 DDR5-6000 kit.

Edit: I saw the upvote after posting, thanks. My post is a little pointless now, but my comments on the 7800X3D are new, so I'll be lazy and leave the post with an edit! Thanks for taking the time to read, respond to, and like my post(s).
 
Last edited:

ottonis

Reputable
Jun 10, 2020
166
133
4,760
Thanks for the thorough comparative testing!
While being the new king of gaming CPUs may be the essence of all those tests, I fail to see the relevance.

Back in the day, a new CPU - e.g. a Pentium 120 vs. a Pentium 60 - could make the difference between a game being playable at 40 fps and unplayable at 18 fps.

Nowadays, where are the popular games that can be played with the new 3D-cached CPU but not with a standard CPU?
I only see games at 350-ish fps vs. 299-ish fps - a difference in fps that has zero relevance for the gaming experience whatsoever.

So the new 3D chips only make sense if they allow for a perceivable improvement in the gaming experience - something AMD should keep in mind if they want to rule the gaming market segment.

They would really need to collaborate with a few of the big game studios and publishers and ask them to create enhanced versions of existing blockbuster games optimized for their 3D cache CPUs.
Such optimizations would tremendously beef up physics, visual effects, or AI when played on a large-cache CPU, and would thus be a real incentive for anyone pondering whether or not to get one of these new gaming CPUs.

Back in the day, "unplayable" games were the main reason to upgrade the CPU or graphics card, or both.
Last edited:

btmedic04

Distinguished
Mar 12, 2015
472
361
19,190
A $37 difference between a 32GB kit of DDR4-3200 and a 32GB kit of DDR5-5200 (the official RAM speeds supported by Raptor Lake and Zen 4) is negligible at the price points of these CPUs. Your argument that DDR5 pricing is a negative no longer holds water, especially in this performance class.


Newegg DDR4:
Silicon Power 32GB (2 x 16GB) DDR4 3200 (PC4 25600) 288-Pin XPOWER Turbine DDR4 SDRAM Desktop Memory Model SP032GXLZU320FDA - Newegg.com

Newegg DDR5:
Team Elite 32GB (2 x 16GB) DDR5 5200 (PC5 41600) Desktop Memory Model TED532G5200C42DC01 - Newegg.com
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
556
759
1,760
These "gaming" CPU just aren't very useful considering how PC gaming today is a straight up disaster.

So many games on PC have been absolute trash lately. Every game based on Unreal-engine using DX12 has major shader compilation stuttering.

These CPU don't fix that problem. And they're overpriced to boot.

Consoles cost a fraction of the price and don't have compilation stuttering. Nintendo will probably have a new Switch out next year, I'll just save some money for that instead.
 

doughillman

Distinguished
Feb 13, 2012
41
52
18,610
These "gaming" CPU just aren't very useful considering how PC gaming today is a straight up disaster.

So many games on PC have been absolute trash lately. Every game based on Unreal-engine using DX12 has major shader compilation stuttering.

These CPU don't fix that problem. And they're overpriced to boot.

Consoles cost a fraction of the price and don't have compilation stuttering. Nintendo will probably have a new Switch out next year, I'll just save some money for that instead.

Cool story, bro! Does that mean you will no longer be commenting on EVERY. SINGLE. THREAD. about PC gaming? By now most of us understand that mid- to high-end PC components are out of your financial reach. There's no need to dump on anyone who can afford such things.
 

UWguy

Commendable
Jan 13, 2021
68
54
1,610
Kinda silly to talk about Flight Simulator results but not include them.

Additionally, if you must have a top-of-the-line system, you're probably gaming at 4K, in which case you're not going to see any difference.
 
Here's my issue in a nutshell: basically no one who's purchasing a $600+ CPU is gaming at 1080p, so the results are simply misinformation; by that logic you could make an argument for testing at 360p. Further, Intel can easily support DDR5-8000 or faster RAM, so tell me why Intel should be handicapped to match slower AMD RAM speeds, or for that matter be neutered and penalized by other limiting BIOS settings and power limits? It makes no sense other than to deceive people.

However, the most glaring real-world use case is the huge number of streamers who purchase these 16-, 24-, etc.-core CPUs so they can save money and eliminate a second dedicated streaming PC. They're not only running a game that can fully saturate 6-8 cores, but also using the remaining cores for applications like OBS, mixing apps, and, you name it, uploading HD in real time to multiple services.

AMD's lobotomized line of 3D cache CPUs is a problem, and AMD relies on Microsoft to manage CPU usage. The only version of these new 3D cache gaming CPUs that makes any sense whatsoever is the unreleased 8-core AMD Ryzen 7 7800X3D.
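
On the scheduling point: the 7950X3D currently relies on Windows and Xbox Game Bar game detection to steer games onto the V-cache CCD. As a purely hypothetical illustration of the manual alternative, here is a sketch using the psutil library (the core numbering is an assumption; verify the CCD mapping on your own system):

Code:
import psutil

# Hypothetical sketch: pin a game to the V-cache CCD of a 7950X3D (commonly
# logical CPUs 0-15 with SMT; confirm the mapping on your system), leaving
# the frequency CCD free for OBS and other streaming applications.
VCACHE_CPUS = list(range(16))

def pin_to_vcache(pid: int) -> None:
    proc = psutil.Process(pid)
    proc.cpu_affinity(VCACHE_CPUS)  # restrict the process to these logical CPUs
    print(f"{proc.name()} (pid {pid}) pinned to CPUs {VCACHE_CPUS}")

# Example usage (replace with the game's actual pid):
# pin_to_vcache(12345)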
 
  • Like
Reactions: rtoaht and UWguy
Kinda silly to talk about Flight Simulator results but not include them.

Additionally, if you must have a top-of-the-line system, you're probably gaming at 4K, in which case you're not going to see any difference.
The most glaring real-world use case, and it boggles my mind that apparently no one wants to test it, is streamers using a single-PC solution: 6-8 cores are used by the game, and the remaining cores, at least on Intel's monolithic die, can be used for applications like OBS and mixing, plus simultaneously uploading multiple HD streams, chat, communication, etc.

Also, the i9-13900K can easily run DDR5-8000+ speeds, so please tell me why hobble its performance needlessly just to make AMD look better and brush AMD's glaring limitations under the rug? At higher resolutions like 1440p and especially 4K, those higher RAM speeds easily add 10 fps or more. There's a lot of misinformation because of these artificial, misleading, useless benchmarks and tests.
 
  • Like
Reactions: rtoaht and UWguy

willgart

Distinguished
Nov 27, 2007
139
9
18,685
These "gaming" CPU just aren't very useful considering how PC gaming today is a straight up disaster.

So many games on PC have been absolute trash lately. Every game based on Unreal-engine using DX12 has major shader compilation stuttering.

These CPU don't fix that problem. And they're overpriced to boot.

Consoles cost a fraction of the price and don't have compilation stuttering. Nintendo will probably have a new Switch out next year, I'll just save some money for that instead.
Console don't have to compile shaders, they are precompiled because there is only 1 (limited and outdated) hardware.
I can say console are ridiculous to navigate the web and write emails...
you'll save money without having a console + a computer, while 1 computer can do gaming + office or other activities more efficiently. ;-)
 
Aug 3, 2022
2
0
10
The article mentions how Intel still supports DDR4 so many times (I even see it in the test setup list), but I don't see any performance numbers for it. I have a 3600 kit. How much performance would I be losing if I used that instead of the kit they're using for testing?
 

Elusive Ruse

Commendable
Nov 17, 2022
375
492
1,220
The article mentions how Intel still supports DDR4 so many times (I even see it in the test setup list), but I don't see any performance numbers for it. I have a 3600 kit. How much performance would I be losing if I used that instead of the kit they're using for testing?
View: https://youtu.be/e5X0RuduRdU

The video is from 3 months ago, so the DDR5 prices listed are no longer applicable. The performance difference, even for 12th gen equipped with DDR5, is substantial.
 

SunMaster

Prominent
Apr 19, 2022
156
133
760
You know which site you're reading when the title of the article is "AMD Ryzen 9 7950X3D vs Intel Core i9-13900K Faceoff: Battle of the Gaming Flagships" and the CPU that wins the gaming tests isn't declared the winner, but instead it's called "a draw".

The title implies, at least for most people, that the chips' gaming capabilities are what is being tested.
 