News: Ryzen 9 7950X3D Beats Intel Flagship By 11% In Leaked Gaming Benchmarks

Being able to buy AMD and have the fastest option possible has several upsides over the Intel offerings.
  • AMD CPUs use less power, so they are easier to cool = less noise and lower electricity costs (especially if you live where AC is needed to cool the room).
That's nonsense for gaming.
The 13900K uses about 25W more than the 7950X for gaming, even with power limits lifted and even when using DDR5-7400; that makes no real difference for cooling.
And the difference in gaming performance means you can run the Intel CPU at lower power and still get the same performance as the 7950X.
The 13600K uses 10W less than the 7950X and is still faster.

If your argument was about the 3D cache models: the 5800X3D uses 47W, 10W less than the 12600K, and they are both on the same level of performance, so again no difference for cooling.

If your argument was about high resolutions, then it was wrong from the get-go, since any CPU will be sitting on its thumbs when the GPU can't push out enough frames.

(The power measurements presumably weren't done at 720p, but that's just a guess; they don't specify.)
https://www.techpowerup.com/review/intel-core-i9-13900k/17.html
 

bit_user

That's nonsense for gaming.
The 13900K uses about 25W more than the 7950X for gaming, even with power limits lifted and even when using DDR5-7400; that makes no real difference for cooling.

...

(The power measurements presumably weren't done at 720p, but that's just a guess; they don't specify.)
https://www.techpowerup.com/review/intel-core-i9-13900k/17.html
The key point about the data you cited is that it's a 12-game average. That's per-game average power (which can hide spikes that happen only some of the time), averaged again over multiple games. We all know that some games hit the CPU harder than others. If they play games at settings that aren't CPU-limited, then the additional power could be negligible. However, what games they play tomorrow are likely to differ from what they play today. And games that are GPU-limited today won't be, if they upgrade their GPU to one that's much faster. So, the concern is relevant from a forward-looking perspective (which most buyers will have).

The relevance of this depends a lot on why someone is concerned about power. If you're concerned about electricity and air conditioning costs, then average power (for the specific games you play) is a good metric to look at. However, if you're concerned about what kind of cooling solution you'll need to avoid throttling during CPU-heavy periods or how high your CPU fan will spin up, then you really do care about the spikes, and averages obscure that.
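To make the point concrete, here's a toy sketch (invented numbers, not TechPowerUp's data) of how a tame-looking average can hide the spikes your cooler actually has to handle:
```python
# Toy example with invented numbers, not measured data.
# Per-second CPU package power (W) over a 60-second gameplay slice:
# mostly light load, plus a few heavy moments.
power_samples = [95] * 50 + [210, 225, 240, 230, 215, 205, 220, 235, 210, 200]

mean_power = sum(power_samples) / len(power_samples)
peak_power = max(power_samples)

print(f"average power: {mean_power:.0f} W")   # ~116 W, looks tame
print(f"peak power:    {peak_power} W")       # 240 W, what the cooler actually sees
```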

Anandtech wrote an interesting article in which they compared performance at different power limits. For the most part, gaming was unaffected - very good news. However, there were definitely some outliers. The most extreme was Total War: Warhammer 3, where the i9-13900K bogged down badly in 95th-percentile performance at any power limit below stock. Interestingly, the 7950X was unfazed.
[Chart from the Anandtech article: Total War: Warhammer 3, 95th-percentile framerate at different CPU power limits]

Again, this is an outlier. Check the article for the rest of the games they tested, but we should keep in mind that as CPUs get faster, games tend to lean on them harder. So, the situation is only likely to get worse, on this front.

Also, from the same TechPowerUp article that you cited, we can see further details on the power variations per-game. These are the 12 games included in their average. Note how some CPU-lite games like CSGO, Borderlands 3, and AOE4 can compensate for CPU-heavy games like Civ 6, Cyberpunk 2077, and BF5:
[Chart from TechPowerUp: CPU power consumption per game]
Also, don't forget that these are still averaged over their entire benchmark. So, there will be spikes within each game that are higher than these figures.

IMO, it's malpractice for these sites to publish bare averages. They ought to convey the distribution, like with box-and-whisker plots or something.
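Even a quick matplotlib sketch like the one below (placeholder numbers, obviously, not the review's data) would convey the spread at a glance instead of collapsing it into a single bar:
```python
# Sketch only: the per-game samples below are placeholders, not measured data.
import matplotlib.pyplot as plt

# Hypothetical per-sample CPU power draw (W) for three games with very different loads
samples = {
    "CS:GO":          [55, 60, 58, 62, 65, 59, 61],
    "Cyberpunk 2077": [120, 135, 150, 160, 145, 170, 155],
    "Civ 6":          [110, 125, 140, 180, 130, 150, 200],
}

fig, ax = plt.subplots()
ax.boxplot(list(samples.values()))        # one box-and-whisker per game
ax.set_xticklabels(list(samples.keys()))
ax.set_ylabel("CPU package power (W)")
ax.set_title("Per-game power distribution (hypothetical data)")
plt.show()
```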
 
Also, from the same TechPowerUp article that you cited, we can see further details on the power variations per-game. These are the 12 games included in their average. Note how some CPU-lite games like CSGO, Borderlands 3, and AOE4 can compensate for CPU-heavy games like Civ 6, Cyberpunk 2077, and BF5:
[Chart from TechPowerUp: CPU power consumption per game]
Yes, on a CPU with a 125W processor base power, 4 out of 12 games go above that level, pushing the average up.
There are two sides to every coin.

And as far as the future goes, we have no idea how well or badly games are coded for the hybrid nature of the Intel CPUs. We don't know whether the small cores are being loaded with work whose results get discarded because they finish slower than the rest, or whether the games would run faster at the same power if the small cores weren't used at all, because then the rest could boost higher.
Devs might find tricks to make games run better on future CPUs or they might not.
 

msroadkill612

Yep, the 7800X3D will be the killer product for gamers, but there is a market for productivity workstations that double as gaming rigs.

Even employers like to incentivize talented workers. That expensive rig and GPU is only going to waste after hours.

A 12- or 16-core AMD is an attractive workstation, and there is no real downside to the trifling cost of the 3D cache option.
 
This debate is as old as benchmarks themselves. Sure, you don't need more than 60 fps to play smoothly, but high-refresh monitors are here. Can your GPU deliver 120 fps at 4K in the latest games? Probably not, but it could do it at 1440p or 1080p, IF the CPU can too. That's why it is so important to isolate CPU performance at 1080p: just how fast can the CPU go? If you only need 60 fps, just look at the same 1080p table and choose a lower-tier CPU; it won't be slower at 4K. But those who need it faster must see how fast it can go.
That's a myth perpetrated by console gamers and/or pseudo PC gamers who've never played an online multiplayer first person shooter game.
 

msroadkill612

I've never seen nor heard of a company that's cool with its employees gaming on a corporate machine, but maybe in really small companies?
OK, I will retract and bow out of that argument. The context was "incentivize", i.e. it's the boss's idea, not a sneaky employee goofing off; that's cheating.
Let's say self-employed consultants then: engineers, architects, coders, etc.
 
That's a myth perpetrated by console gamers and/or pseudo PC gamers who've never played an online multiplayer first person shooter game.
Nah, they are all about 30 FPS if not 24... because 'cinematic'.
Anything above 60 is a 'nice to have', not a 'need to have': it makes a visual difference that is noticeable if you switch between them, but it doesn't make much of a difference otherwise.
Especially in online FPS shooters, because they run on a tick rate that is independent of your FPS, and most run at a tick rate of 60; you have to search and make an effort to find one with a higher tick rate.
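Roughly, the server simulation advances on its own fixed tick while each client renders at whatever FPS it can manage. A simplified sketch of that kind of loop (toy Python, not any particular engine's code):
```python
import time

TICK_RATE = 60                  # simulation ticks per second (typical for many shooters)
TICK_DT = 1.0 / TICK_RATE       # fixed timestep: ~16.7 ms per tick

def run_server(duration_s=1.0):
    """Toy fixed-timestep loop: the game state advances once per tick,
    no matter how fast (or slow) any client happens to render."""
    ticks = 0
    accumulator = 0.0
    previous = start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= TICK_DT:
            ticks += 1              # one simulation step: movement, hit checks, etc.
            accumulator -= TICK_DT
        time.sleep(0.001)           # meanwhile, clients draw frames at their own pace
    print(f"simulated {ticks} ticks in {duration_s:.1f} s")  # ~60 per second

run_server()
```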
 
Nah, they are all about 30 FPS if not 24... because 'cinematic'.
Anything above 60 is a 'nice to have', not a 'need to have': it makes a visual difference that is noticeable if you switch between them, but it doesn't make much of a difference otherwise.
Especially in online FPS shooters, because they run on a tick rate that is independent of your FPS, and most run at a tick rate of 60; you have to search and make an effort to find one with a higher tick rate.
I know this much: when my FPS drops below 70 I can feel it, as in laggy. Then again, all I play is first-person shooter games.
 

mac_angel

Sorry, but it seems you haven't understood what a bottleneck means.
If you're playing at 4K Ultra settings, then the choice of CPU makes little to no difference, as long as it is not some relic or similar. The reason is that 4K gaming pushes the limits of the GPU in a way where most CPUs can keep up just fine.
When CPUs are tested at 1080p with fast graphics cards, it is in order to compare the different CPUs. The reason games are used for benchmarking is that many PC enthusiasts are more familiar with games than with productivity benchmarks, plus there are also a few for whom 300 fps rather than 200 fps makes a difference (and a few more who tell themselves it does).
I understand the bottleneck. But someone saying it in the forums is different from someone posting a proper review. A Core i9-10940X @ 5.1GHz all-core does not match a Core i9-11900K @ 5.1GHz all-core, even when using the same RAM and going from a dual-channel to a quad-channel system, playing at 4K or higher.
 

mac_angel

How do you propose to benchmark just the CPU without the GPU influencing the results?
The system as a whole. They are benchmarking different CPUs, but with the same GPU: at 4K or higher, how much difference do these CPUs make? As I mentioned just before, I understand the bottleneck. But someone saying it in the forums is different from someone posting a proper review. A Core i9-10940X @ 5.1GHz all-core does not match a Core i9-11900K @ 5.1GHz all-core, even when using the same RAM and going from a dual-channel to a quad-channel system, playing at 4K or higher.
 

bit_user

If you're playing at 4K Ultra settings, then the choice of CPU makes little to no difference, as long as it is not some relic or similar.
In a later post, I included a graph which shows that CPU speed can make a difference in 4K gaming, in 95th percentile scores. And the reason we care about those is that when the scene gets really complex, the framerate drops, and the game feels generally laggy - that's your 95th percentile scenario. Having the game bog down for even 5% of the time can get you killed. If you're a non-gamer, perhaps you can now appreciate why the 95th percentile scores matter so much to gamers.

To put it another way, in realtime system design, you mostly want to focus on optimizing worst-case latency. That's what can kill playability. A high average framerate might not matter much, if the mean is being skewed by some stratospheric framerates in sections that were already fast.
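As a toy illustration (made-up frame times, not data from any review), the same run can show a healthy average FPS while the tail is what you actually feel:
```python
# Invented frame times (ms) for a 100-frame slice: mostly fast, a few nasty hitches.
frame_times_ms = [7.0] * 90 + [8.0] * 5 + [25.0, 30.0, 28.0, 33.0, 27.0]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 95th-percentile frame time: 95% of frames render at least this quickly.
p95_ms = sorted(frame_times_ms)[int(0.95 * len(frame_times_ms))]

print(f"average:             {avg_fps:.0f} FPS")                           # ~123 FPS, looks great
print(f"95th pct frame time: {p95_ms:.0f} ms (~{1000 / p95_ms:.0f} FPS)")  # 25 ms, ~40 FPS
```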

there are also a few for whom 300 fps rather than 200 fps makes a difference (and a few more who tell themselves it does).
Perceptually, I think framerate is roughly on log-scale. Put another way, you probably need an exponential increase in fps to make a linear perceptual difference. Going from 60 to 120 should feel similar to going from 144 to 280, assuming your monitor is fast enough to display all the frames.
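Quick sanity check on that claim, treating perceived difference as the log of the framerate ratio (just arithmetic, nothing measured):
```python
from math import log2

# If perception tracks the ratio between framerates, these two jumps are comparable:
print(log2(120 / 60))    # 1.00 "doubling" going from 60 to 120 fps
print(log2(280 / 144))   # ~0.96 of a doubling going from 144 to 280 fps
```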
 
Perceptually, I think framerate is roughly on log-scale. Put another way, you probably need an exponential increase in fps to make a linear perceptual difference. Going from 60 to 120 should feel similar to going from 144 to 280, assuming your monitor is fast enough to display all the frames.
Personally, 30 => 60 is a world of difference. 60 => 120 is game-changing. 120 => 165 is noticeable. 165 => 360 is perceptually indistinct to me.
 

bit_user

Personally, 30 => 60 is a world of difference. 60 => 120 is game-changing. 120 => 165 is noticeable. 165 => 360 is perceptually indistinct to me.
There's another effect going on, where each bump up in framerate is introducing less and less additional information. You'd need tons of fast motion - like stuff flying left & right across the screen - and a really responsive monitor, to notice improvements at higher framerates.
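Put in frame-time terms, using the steps from the post above, each jump buys you a smaller slice of time per frame (just arithmetic):
```python
# Frame time (ms) at each refresh rate, and how much each jump shaves off per frame.
rates = [30, 60, 120, 165, 360]
frame_ms = [1000.0 / r for r in rates]
for lo, hi, t_lo, t_hi in zip(rates, rates[1:], frame_ms, frame_ms[1:]):
    print(f"{lo} -> {hi} Hz: {t_lo - t_hi:.1f} ms shorter per frame")
# 30 -> 60 saves 16.7 ms, 60 -> 120 saves 8.3 ms, 120 -> 165 only 2.3 ms,
# and 165 -> 360 saves just 3.3 ms despite more than doubling the rate.
```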

At some point, I think we could say a really good motion blur would be pretty indistinguishable from yet higher framerates.
 
There's another effect going on, where each bump up in framerate is introducing less and less additional information. You'd need tons of fast motion - like stuff flying left & right across the screen - and a really responsive monitor, to notice improvements at higher framerates.

At some point, I think we could say a really good motion blur would be pretty indistinguishable from yet higher framerates.
I believe that basic premise is the reason why 120Hz OLEDs look like 240Hz TN/IPS/VA panels. OLEDs have 0.2ms GtG response times, which gives better perceived motion handling compared to other panel technologies.
 
Sorry, but it seems you haven't understood what a bottleneck means.
If you're playing at 4K Ultra settings, then the choice of CPU makes little to no difference, as long as it is not some relic or similar. The reason is that 4K gaming pushes the limits of the GPU in a way where most CPUs can keep up just fine.
When CPUs are tested at 1080p with fast graphics cards, it is in order to compare the different CPUs. The reason games are used for benchmarking is that many PC enthusiasts are more familiar with games than with productivity benchmarks, plus there are also a few for whom 300 fps rather than 200 fps makes a difference (and a few more who tell themselves it does).

Well explained.