Intel Core i5-11400 vs AMD Ryzen 5 3600: Budget Gaming CPU Face-off


InvalidError

Titan
Moderator
thermals could have been improved with a $30 HSF.
From the contact paper test, thermals could have been greatly improved simply by telling CoolerMaster to fix their existing HSF design.

If the copper slug surface were slightly convex instead of concave, it would both fix the poor contact in the center area and increase contact pressure without changing anything else. Total cost: a few minutes of CAD time to re-program the CNC lathe that turns those slugs, and maybe a few more minutes to CAD a replacement press tool to make sure it doesn't cave the slug face in during press-fitting, if the original press tool isn't already designed that way.
 

InvalidError

Titan
Moderator
Ya because people are going to purchase a $90 entry level budget B560 board for a $550 unlocked 11900K. Seriously that guy is about the last person on this planet I'd ask for PC advice.
Steve's beef isn't that a $90 motherboard fails to run an unlocked CPU within specs. It is that a $90 motherboard that EXPLICITLY lists the high-end CPU in its supported CPU list can't support baseline spec for that CPU.

He would have been perfectly fine with a crappy board being crap with a high-end CPU if the crap motherboard didn't imply support for high-end CPUs by including them in its supported CPU list.
 
Steve's beef isn't that a $90 motherboard fails to run an unlocked CPU within specs. It is that a $90 motherboard that EXPLICITLY lists the high-end CPU in its supported CPU list can't support baseline spec for that CPU.

He would have been perfectly fine with a crappy board being crap with a high-end CPU if the crap motherboard didn't imply support for high-end CPUs by including them in its supported CPU list.
Agreed, ASRock shouldn't list unlocked CPUs as supported on that board, but he acts like he just discovered what happened to Amelia Earhart.
 
Steve's beef isn't that a $90 motherboard fails to run an unlocked CPU within specs. It is that a $90 motherboard that EXPLICITLY lists the high-end CPU in its supported CPU list can't support baseline spec for that CPU.

He would have been perfectly fine with a crappy board being crap with a high-end CPU if the crap motherboard didn't imply support for high-end CPUs by including them in its supported CPU list.
Within specs would mean Intel specs and not whatever he comes up with to support his claims.

Rendering consumes considerably more power than what Intel considers spec.
He would have to use the Intel XTU stress test for this before he could claim that the boards are not living up to spec.


 
Ya because people are going to purchase a $90 entry level budget B560 board for a $550 unlocked 11900K. Seriously that guy is about the last person on this planet I'd ask for PC advice.
But you can do that with AMD if you go for a cheap 3600 and then want a 5800X or a 5900X! Not sure about a 5950X, but maybe even then you can get away with it using a cheap B550, but maybe not quite the bottom of the barrel.

Plus, InvalidError already pointed out the obvious part of the video, so I won't repeat it.

On another note, I find it funny how Intel seems to need unpaid supporters for what is clearly a kerfuffle of their own making. I wish ASRock and Gigabyte would fix their stuff so less knowledgeable consumers don't fall into the trap, and that Intel would stop playing that TDP double standard with the power. Same with AMD, mind you.

Regards.
 

InvalidError

Titan
Moderator
Within specs would mean Intel specs and not whatever he comes up with to support his claims.
The Intel spec is extremely broad and the thing Steve was ranting about is that the ASRock board claims to support the 11900k yet cannot even handle BASE CLOCK - the lowest clock the CPU is supposed to still be able to run at under worst-case conditions. The board cannot handle the bare minimum spec for that CPU.
 
The Intel spec is extremely broad and the thing Steve was ranting about is that the ASRock board claims to support the 11900k yet cannot even handle BASE CLOCK - the lowest clock the CPU is supposed to still be able to run at under worst-case conditions. The board cannot handle the bare minimum spec for that CPU.
Sure, absolutely right. *

Still, it only affects people that will be using their PC extensively for rendering since anything else will use less power and as such will reach base clocks and even higher.

*Depending on how you see it, because if you look at the video, he used a low-airflow scenario, possibly choosing the worst positioning for cooling the VRMs, and he gives no info on what ASRock suggests as proper cooling for the case or VRMs.
It's entirely possible that ASRock states the cooling requirements somewhere, and that would be enough for this not to be false advertising anymore.
 

InvalidError

Titan
Moderator
Still, it only affects people that will be using their PC extensively for rendering since anything else will use less power and as such will reach base clocks and even higher.
If the board is already VRM-throttling before hitting the all-cores base clock, then it will also have severe VRM-throttling in everything else too and game performance won't be anywhere near expected either.
 
If the board is already VRM-throttling before hitting the all-cores base clock, then it will also have severe VRM-throttling in everything else too and game performance won't be anywhere near expected either.
If everything had the same power draw...
3D rendering is easily 20-30% above anything else.

Here Cinebench has about 30% higher power draw than Prime95, which is a stability/stress test... lower power draw means lower VRM temps and higher clocks.
https://www.techpowerup.com/review/intel-core-i9-11900k/21.html
 
But you can do that with AMD if you go for a cheap 3600 and then want a 5800X or a 5900X! Not sure about a 5950X, but maybe even then you can get away with it using a cheap B550, but maybe not quite the bottom of the barrel.

Plus, InvalidError already pointed out the obvious part of the video, so I won't repeat it.

On another note, I find it funny how Intel seems to need unpaid supporters for what is clearly a kerfuffle of their own making. I wish ASRock and Gigabyte would fix their stuff so less knowledgeable consumers don't fall into the trap, and that Intel would stop playing that TDP double standard with the power. Same with AMD, mind you.

Regards.
Not sure what Intel has to do with the board manufacturers setting their own specs.
 
Not sure what Intel has to do with the board manufacturers setting their own specs.
Nothing; the video was just beating the already dead horse of power usage on Intel CPUs nowadays. That is completely ASRock's and Gigabyte's fault for advertising support for CPUs that can't even meet the minimum spec/speeds advertised on the CPUs' boxes.

What I could say about this, as mentioned above, is that Intel facilitates this kerfuffle by having the "65W" moniker on CPUs that you just shouldn't say are 65W; period. If you offer a way for the motherboard to spike power consumption for the sake of looking good against the competition, then they should just come clean and, more importantly, demand that AIBs and OEMs meet the spec a bit more rigorously. I know for a fact AMD has been chased about this in the past as well, but I don't see this happening with Intel.

Also, HUB* makes this statement very eloquently as well (and better worded). Also, as already mentioned, AMD is also guilty of this, but luckily for them their CPUs are now using less power than what their Intel counterparts need to in order to look good.

The 11400*/F siblings are not free of these asterisks either.

Regards.
 
What I could say about this, as mentioned above, is that Intel facilitates this kerfuffle by having the "65W" moniker on CPUs that you just shouldn't say are 65W; period. If you offer a way for the motherboard to spike power consumption for the sake of looking good against the competition,
Intel's 65W includes the heaviest thing you could run on the CPU and will guarantee you at least base clocks at safe temps using a 65W cooler.
Which is why a 65W TDP CPU can run 30-35% higher than base clocks even with a heavy workload like 3d rendering, which itself is about 30% higher load than what a normal person would ever run.
[attached image]

AMD's 65W TDP means 65W at lighter loads, while heavier things will use up to 88W.
Package Power Tracking (“PPT”): The PPT threshold is the allowed socket power consumption permitted across the voltage rails supplying the socket. Applications with high thread counts, and/or “heavy” threads, can encounter PPT limits that can be alleviated with a raised PPT limit.

  1. Default for Socket AM4 is at least 142W on motherboards rated for 105W TDP processors.
  2. Default for Socket AM4 is at least 88W on motherboards rated for 65W TDP processors.
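For what it's worth, those two defaults line up with the ~1.35x PPT-to-TDP ratio AMD has described for AM4. A quick back-of-the-envelope check (rough sketch of my own, not an official AMD formula):

# Rough sketch only: AM4 socket power (PPT) is commonly described as ~1.35x TDP.
# That ratio reproduces the two defaults quoted above; it is not an official AMD formula.
def am4_default_ppt(tdp_watts: float) -> int:
    """Approximate default socket power limit (PPT) for a given AM4 TDP rating."""
    return round(tdp_watts * 1.35)

print(am4_default_ppt(65))   # -> 88  (65W TDP parts)
print(am4_default_ppt(105))  # -> 142 (105W TDP parts)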
I don't know if there's any horse left, but... let's just keep hitting the ground where it was dead!

View: https://www.youtube.com/watch?v=A7qthhhP-ho


Cheers!
Great video... /s
The only reason he hates on Intel is that it crashed with Tomb Raider, and that's the game he doesn't show running on AMD... it's not a double standard at all.
 
Intel's 65W includes the heaviest thing you could run on the CPU and will guarantee you at least base clocks at safe temps using a 65W cooler.
Which is why a 65W TDP CPU can run 30-35% higher than base clocks even with a heavy workload like 3d rendering, which itself is about 30% higher load than what a normal person would ever run.
[attached image]

AMD's 65W TDP means 65W at lighter loads, while heavier things will use up to 88W.
I'm not saying AMD doesn't do it; you omitted that from the quote (hence why I hate it when people don't quote the whole post!) and they have been criticized for it, but when they became more efficient than Intel, suddenly no one cared anymore. Talk about double standards! Also, at least those power limits are known for AMD. Does Intel list a power ceiling for their CPUs anywhere?

Just like HUB made the point, very eloquently, I'll repeat it for you as you may have ignored it completely (paraphrasing though): "you can slap any TDP you want on the box if you lower the clocks enough". This is what Intel is doing and NO ONE is calling them out on that. WHY?

Great video... /s
The only reason he hates on Intel is that it crashed with Tomb Raider, and that's the game he doesn't show running on AMD... it's not a double standard at all.
No, he actually complained about the whole premise of Intel putting out a half-baked product. Which, you know, is totally fair. The expectation is for something to work, no? Also, Tomb Raider is not a game from an indie dev or a small publisher; it's well known and used by several reviewers (maybe not that same exact version? IDK), so the expectation is for it to "just work". Yes, it was an unfortunate coincidence, but it's not the only complaint. As I said, there's no horse left, so let's hit the ground where it was instead!

Regards.
 
I'm not saying AMD doesn't do it; you omitted that from the quote (hence why I hate it when people don't quote the whole post!) and they have been criticized for it, but when they became more efficient than Intel, suddenly no one cared anymore. Talk about double standards! Also, at least those power limits are known for AMD. Does Intel list a power ceiling for their CPUs anywhere?
Yes, Intel publishes the power limits in their data sheets.
Here is a list that's easier to look through.
We know what Intel allows for the maximum boost TDP (PL2), and we also know the Tau duration of how long Intel allows it; if a mobo pushes harder than that, then that is overclocking, and no company has ever given out specs for overclocking.
Do we have boost TDP numbers from AMD?!
Which numbers do you mean when you say "Also, at least those power limits are known for AMD"? Do you mean PPT? Because that's not for boosting; it's just a reserve so that harder-to-run software can hold base clocks.
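To make the PL1/PL2/Tau relationship concrete, here's a toy sketch (my own simplified model, not Intel's exact algorithm; the numbers are only illustrative 11400-class defaults, and boards can program different values):

import math

# Toy model only: the package may draw up to PL2 while a running (exponentially
# weighted) average of package power stays below PL1; TAU is the averaging time
# constant. Illustrative values, not guaranteed defaults for any specific board.
PL1 = 65.0    # sustained limit in watts (the "TDP" number on the box)
PL2 = 154.0   # short-term boost limit in watts
TAU = 28.0    # averaging time constant in seconds

def boost_window_s(pl1, pl2, tau, start_avg=0.0):
    """Seconds the package can sit at PL2 (from idle) before the average hits PL1."""
    # EWMA driven by a constant PL2 input: avg(t) = pl2 + (start_avg - pl2) * exp(-t / tau)
    # Solve avg(t) = pl1 for t.
    return tau * math.log((pl2 - start_avg) / (pl2 - pl1))

print(f"~{boost_window_s(PL1, PL2, TAU):.0f} s at PL2 from idle, then it falls back to PL1")

After that window the sustained draw is held at the PL1/"65W" number, which is why all-core clocks settle lower in a long render.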
Just like HUB made the point, very eloquently, I'll repeat it for you as you may have ignored it completely (paraphrasing though): "you can slap any TDP you want on the box if you lower the clocks enough". This is what Intel is doing and NO ONE is calling them out on that. WHY?
Well, let me repeat the picture in case you didn't get it.
A 65W TDP CPU clocks 30-35% higher than base when locked to 65W in 3D rendering.
How much higher than base does Ryzen boost if PPT is locked down to 65W in 3D rendering?
No, he actually complained about the whole premise of Intel putting out a half-baked product. Which, you know, is totally fair. The expectation is for something to work, no? Also, Tomb Raider is not a game from an indie dev or a small publisher; it's well known and used by several reviewers (maybe not that same exact version? IDK), so the expectation is for it to "just work". Yes, it was an unfortunate coincidence, but it's not the only complaint. As I said, there's no horse left, so let's hit the ground where it was instead!

Regards.
So if I go through my Steam library and find a single AAA game that doesn't run straight away on Ryzen, but find 4 different games that run fine on Intel, and make a video about it, you are going to support it fully... got it.

Also, he probably pulled Tomb Raider from his main rig, already configured for his high-end GPU, and instead of running the configuration option first he just hit the play button and went 'omg I can't be bothered having a brain, game no running'.
 
Yes, Intel publishes the power limits in their data sheets.
Here is a list that's easier to look through.
We know what Intel allows for the maximum boost TDP (PL2), and we also know the Tau duration of how long Intel allows it; if a mobo pushes harder than that, then that is overclocking, and no company has ever given out specs for overclocking.
Do we have boost TDP numbers from AMD?!
Which numbers do you mean when you say "Also, at least those power limits are known for AMD"? Do you mean PPT? Because that's not for boosting; it's just a reserve so that harder-to-run software can hold base clocks.
That's a good point. I know it's 142W because of the motherboard and tech review sites, but I don't think I've ever seen an AMD equivalent of Intel's Ark where they list all that information about their CPUs. It would be good if they had something like that indeed.
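(Side note: on the Intel side you can at least read back what limits a given board actually programmed. On Linux, the intel_rapl powercap interface exposes them; a rough sketch, assuming the usual intel-rapl:0 package domain is present on the system:)

from pathlib import Path

# Rough sketch: read the package power limits (PL1/PL2) the firmware actually set,
# via Linux's intel_rapl powercap interface. Paths vary per system; this assumes
# the common "intel-rapl:0" package domain exists.
pkg = Path("/sys/class/powercap/intel-rapl:0")

print((pkg / "name").read_text().strip())    # usually "package-0"
for c in ("constraint_0", "constraint_1"):   # long_term (PL1) / short_term (PL2)
    label = (pkg / f"{c}_name").read_text().strip()
    watts = int((pkg / f"{c}_power_limit_uw").read_text()) / 1_000_000
    print(f"{label}: {watts:.0f} W")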
Well, let me repeat the picture in case you didn't get it.
A 65W TDP CPU clocks 30-35% higher than base when locked to 65W in 3D rendering.
How much higher than base does Ryzen boost if PPT is locked down to 65W in 3D rendering?
I think the TDP locks the base clocks on Ryzen CPUs if you don't run PBO, so it goes up to the TDP power. They're running close to their power limit already, so their TDP is usually reflective of the all-core boost (usually). The exception is the 3700X, which is a 65W TDP part and can still go all the way to 142W with PBO, or higher if you override the PPT. Like I said, AMD is also guilty of it, so no surprises there?

Also, I strongly* believe you're not getting it?
So if I go through my Steam library and find a single AAA game that doesn't run straight away on Ryzen, but find 4 different games that run fine on Intel, and make a video about it, you are going to support it fully... got it.

Also, he probably pulled Tomb Raider from his main rig, already configured for his high-end GPU, and instead of running the configuration option first he just hit the play button and went 'omg I can't be bothered having a brain, game no running'.
On your first part, yes, I would. Why would I be happy with a product that doesn't work as advertised? I'm not a stupid fanboi.

On your second part, no. That is unjustly assuming things about the person that aren't reflected in his video. As they say "that says more about you than them".

Regards.