News Ryzen 9 7950X3D Beats Intel Flagship By 11% In Leaked Gaming Benchmarks

Much better than the last leak, which showed only 6% over the 13900K, though that one came from AMD's own review guide. 18% over the 7950X is also a good showing. That could mostly wipe away the scenarios where the 13900K/13900KS is the leader (often with expensive DDR5).

Next up is the 7800X3D. If it maintains similar performance and beats the 13900K/13900KS in gaming, that would look even better for AMD. But it should fall back a little in any games that correctly prioritize the non-3D chiplet.

The 7900X3D is priced too close to the 7950X3D.
 
And why would anyone be surprised? This is almost like AMD leaked this benchmark themselves. They chose games like Tomb Raider and Far Cry, where the 5800X3D is already close to the 13900K. If the new CPUs did not do a lot better than a 5800X3D, why would you even consider them?
I am sure someone from Intel could come up with a list of games where these chips perform worse than both the non-X3D AMD parts and a 13900K.

We need to see a much larger list of games, and games of different types. Since I don't tend to play many of the older games that the benchmarks use, I would really like to see a way to predict whether a new game will benefit or not.
 
I never understood those kinds of benchmarks. While I get the concept of dropping down to 1080p to take the GPU bottleneck out of the equation, it really doesn't. No one is buying this kind of hardware to play these games at 1080p. I'm more interested in how it would perform in a real-world gaming situation. Does it still average an 11% increase at 4K? Maybe, maybe not. But 11% could easily mean going from 54 fps to 60 fps, where it would make a much bigger practical difference.
 
I never understood those kinds of benchmarks. While I get the concept of dropping down to 1080p to take the GPU bottleneck out of the equation, it really doesn't. No one is buying this kind of hardware to play these games at 1080p. I'm more interested in how it would perform in a real-world gaming situation. Does it still average an 11% increase at 4K? Maybe, maybe not. But 11% could easily mean going from 54 fps to 60 fps, where it would make a much bigger practical difference.

This is exactly my thinking. Few people actually aim to play games at 600 fps to shoot the other guy first; you are better off spending your money upgrading your router and ISP and running at a mere 300 fps in your 1080p shoot-em-up. In those scenarios, lower LAN/Internet latency is king.

For the rest of us, 4K gaming or a high-res ultrawide setup is what people want to see tested, and it will be tested for sure. I can honestly wait a couple of days for real-world examples. Not that I am about to upgrade my CPU anytime soon: first I need to upgrade my 4K monitor to one that does 120+ Hz, plus a GPU that can keep up, and I need to rob a bank to pay for it.

I just want to see what this new tech can do, and the limitations of two different compute dies in the same CPU. This will give some foresight into future AMD designs, one of which has long been rumoured to use a Zen 5 compute die (probably with 3D V-Cache) and a "Zen 4c" die without, thus being a more comparable competitor to Intel's big.LITTLE-style CPU design.
 
This is exactly my thinking. Few people actually aim to play games at 600 fps to shoot the other guy first; you are better off spending your money upgrading your router and ISP and running at a mere 300 fps in your 1080p shoot-em-up. In those scenarios, lower LAN/Internet latency is king.

For the rest of us, 4K gaming or a high-res ultrawide setup is what people want to see tested, and it will be tested for sure. I can honestly wait a couple of days for real-world examples. Not that I am about to upgrade my CPU anytime soon: first I need to upgrade my 4K monitor to one that does 120+ Hz, plus a GPU that can keep up, and I need to rob a bank to pay for it.
If you're playing multiplayer first-person shooter games at 4K, then save your money for a 4090.
 
If you're playing multiplayer first-person shooter games at 4K, then save your money for a 4090.

I am not even considering such a thing. My internet is shockingly bad and there is no upgrade option yet, so I am well out of the multiplayer scene, especially the "fast" games. That will be something to look into when I can upgrade my internet. The only online game I play now is WoWS, which runs at a flat 60 Hz at 4K with everything dialled up, and then my ping jumps from 35 to 150, 300, 600, and either recovers or kicks me! Decent internet is currently a pipedream 🙁
 
I am not even considering such a thing. My internet is shockingly bad and there is no upgrade option yet, so I am well out of the multiplayer scene, especially the "fast" games. That will be something to look into when I can upgrade my internet. The only online game I play now is WoWS, which runs at a flat 60 Hz at 4K with everything dialled up, and then my ping jumps from 35 to 150, 300, 600, and either recovers or kicks me! Decent internet is currently a pipedream 🙁
And I thought we had it bad here in Alaska.
 
This is exactly my thinking. Few people actually aim to play games at 600 fps to shoot the other guy first; you are better off spending your money upgrading your router and ISP and running at a mere 300 fps in your 1080p shoot-em-up. In those scenarios, lower LAN/Internet latency is king.

For most people, upgrading the components in their computer is much easier than switching from a cable ISP to a fiber one.

And upgrading a router won't do anything for you if your ISP is crap.
 
The problem is price. If you have to pay $100 more for the 7800X3D vs a 13900K and another $100 (or more) for the motherboard compared to a DDR5 Intel system, then you're talking the price difference of a whole GPU tier, which would buy a far greater performance gain.

Did you mean 7950X3D? The 7800X3D's MSRP is about $100 less than what 13900K(F) is selling for.

Maybe all of these X3D prices need to come down, or maybe gamers will pay a premium for small leads over 13900K(S).

The 7000X3D system could save some money by using slower DDR5 memory. The speed matters less with 3D cache.
 
The 7000X3D system could save some money by using slower DDR5 memory. The speed matters less with 3D cache.

Does it basically boil down to a larger "buffer" in cache = fewer delays waiting for RAM? Do I have that correct? It'll be very interesting to see how different RAM speeds affect the X3D lineup in terms of game performance.
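
For what it's worth, the mental model I'm using (my own back-of-the-envelope, not anything from a review): average memory access time is roughly hit_rate × cache_latency + miss_rate × RAM_latency, so a bigger cache raises the hit rate and fewer accesses ever wait on RAM. A toy sketch, with every latency and hit-rate number invented purely for illustration:

```python
# Toy model of average memory access time (AMAT); every number here is
# an invented illustration, not a measured figure for any real CPU.
def amat(hit_rate: float, cache_ns: float, ram_ns: float) -> float:
    """AMAT = hit_rate * cache_latency + miss_rate * RAM_latency."""
    return hit_rate * cache_ns + (1.0 - hit_rate) * ram_ns

CACHE_NS = 10.0                    # assumed L3 hit latency
for hit_rate in (0.90, 0.97):      # regular cache vs. big 3D V-Cache (assumed)
    for ram_ns in (70.0, 90.0):    # "fast" vs. "slow" DDR5 (assumed)
        print(f"hit={hit_rate:.0%}  ram={ram_ns:.0f} ns"
              f"  ->  AMAT {amat(hit_rate, CACHE_NS, ram_ns):.1f} ns")
```

In that toy model, going from 70 ns to 90 ns RAM costs 2.0 ns per access at a 90% hit rate but only 0.6 ns at 97%, which is the intuition behind "speed matters less with 3D cache".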
 
Do you have any satellite internet options? 5G is pretty fast, and considering your ping spikes, it might even offer better latency.
The cost is insane, literally 5-15x as much for satellite or 5G, with no unlimited 5G option, and in my area not much of an improvement... Both got a pass a long time ago! When fibre arrives I will have it installed. I used to have 100 Mb internet when most people had 5-30, then I dropped down to 5 to live in a nice area.

A decade without gaming-quality broadband, and I wouldn't even know where to start with the games that require low latency / fat bandwidth, as I have been happy with single-player games (which I prefer to multiplayer anyway). What I look forward to the most is being able to download a new game in a sensible time frame without annoying everyone else. As it is, in realistic terms, Hogwarts Legacy would take a literal MONTH to download with my current parameters, set so that I don't disturb other users 😱

With those same parameters I could download it in 2 days or less at 100 Mb, and I can get 500 Mb 😱 😀 Can't wait!
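
For anyone curious about the arithmetic, here is a quick sketch. The game size is my assumption (something in the 80-90 GB range sounds about right for Hogwarts Legacy), and these are best-case times at full line speed, before the self-imposed throttling mentioned above:

```python
# Rough best-case download times; the 85 GB size and the line speeds
# are assumptions for illustration.
GAME_GB = 85
for mbps in (5, 100, 500):              # line speed in megabits per second
    seconds = GAME_GB * 8_000 / mbps    # 1 GB ~= 8,000 megabits (decimal units)
    hours = seconds / 3600
    print(f"{mbps:>3} Mb/s -> {hours:6.1f} h ({hours / 24:.1f} days)")
```

At full tilt a 5 Mb line needs about a day and a half; throttle it to a small slice of that so nobody else suffers and a month is believable, while 100 Mb under the same limits lands around the 2-day mark.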
 
I never understood those kinds of benchmarks. While I get the concept of dropping down to 1080p to take the GPU bottleneck out of the equation, it really doesn't. No one is buying this kind of hardware to play these games at 1080p.

How do you propose to benchmark just the CPU without the GPU influencing the results?
 
This debate is as old as benchmarking itself. Sure, you don't need more than 60 fps to play smoothly, but high-refresh monitors are here. Can your GPU deliver 120 fps at 4K in the latest games? Probably not, but it could do it at 1440p or 1080p, IF the CPU can too. That's why it is so important to isolate CPU performance at 1080p: just how fast can the CPU go? If you only need 60 fps, look at the same 1080p table and choose a lower-tier CPU; it won't be any slower at 4K. But those who need it faster must see how fast it can go.
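
A toy way to picture it (my own illustration, with invented numbers): the frame rate you actually see is roughly the slower of what the CPU and the GPU can each deliver, and lowering the resolution mainly raises the GPU's ceiling:

```python
# Toy bottleneck model: delivered fps is limited by the slower component.
# All numbers are invented for illustration.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

CPU_FPS = 160                                         # what the CPU could push (assumed)
GPU_CEILING = {"1080p": 300, "1440p": 170, "4K": 80}  # GPU limits per resolution (assumed)
for res, gpu_fps in GPU_CEILING.items():
    bound = "CPU" if CPU_FPS < gpu_fps else "GPU"
    print(f"{res:>5}: {delivered_fps(CPU_FPS, gpu_fps):.0f} fps ({bound}-bound)")
```

At 1080p the CPU is the limit, so the benchmark is actually measuring the CPU; at 4K the GPU ceiling hides any CPU difference, which is exactly why the 1080p table is the one that ranks CPUs.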
 
I think I'm still going to wait until I see some benchmarks someone is willing to put their name to.

Leaks are nice and all to create a buzz about something, but I'm not buying based on them!
 
I never understood those kinds of benchmarks. While I get the concept of dropping down to 1080p to take the GPU bottleneck out of the equation, it really doesn't. No one is buying this kind of hardware to play these games at 1080p. I'm more interested in how it would perform in a real-world gaming situation. Does it still average an 11% increase at 4K? Maybe, maybe not. But 11% could easily mean going from 54 fps to 60 fps, where it would make a much bigger practical difference.
Sorry, but it seems you haven't understood what a bottleneck means.
If you're playing at 4K Ultra settings, then the choice of CPU makes little to no difference, as long as it is not some relic or similar. The reason is that 4K gaming pushes the limits of the GPU to a point where most CPUs can keep up just fine.
When CPUs are tested at 1080p with fast graphics cards, it is in order to compare the different CPUs. The reason games are used for benchmarking is that many PC enthusiasts are more familiar with games than with productivity benchmarks, plus there are a few players for whom 300 fps rather than 200 fps makes a difference (and a few more who tell themselves it does).
 
Did you mean 7950X3D? The 7800X3D's MSRP is about $100 less than what 13900K(F) is selling for.

Maybe all of these X3D prices need to come down, or maybe gamers will pay a premium for small leads over 13900K(S).

The 7000X3D system could save some money by using slower DDR5 memory. The speed matters less with 3D cache.
Being able to buy AMD and have the fastest option possible has several upsides over the Intel offerings.
  • AMD CPUs use less power, so they are easier to cool: less noise and lower power costs (especially if you live somewhere AC is needed to cool the room).
  • You will be able to upgrade the CPU on an AM5 board down the line, so less e-waste, less cost and much less hassle when upgrading than with Intel (no need to re-install Windows...).
As for what memory is needed, best to wait for benchmarks, as DDR5 speed could still matter; cache isn't everything.
 
Did you mean 7950X3D? The 7800X3D's MSRP is about $100 less than what 13900K(F) is selling for.

Maybe all of these X3D prices need to come down, or maybe gamers will pay a premium for small leads over 13900K(S).

The 7000X3D system could save some money by using slower DDR5 memory. The speed matters less with 3D cache.

There will be some, like myself, who upgrade right away; I have other systems my current 7950X will end up in, as planned. But when the 3D V-Cache chips hit holiday sales next fall, my guess is that price cuts will be ample, so budget buyers will swarm to these chips (cough, 7800X3D, cough cough). A lot of folks buy last-gen stuff in this manner, much like the 5800X3D sold like hot cakes this past fall/winter. I suspect we'll see much of the same here.