News Despite Low-Power Cores, Intel's Alder Lake Mobile Could Push Up to 115W TDP

It's even worse than this article makes it out to be. Intel's current 11th-gen laptop parts are nowhere even close to Ryzen for power efficiency. Look at the base clocks for comparison:
Intel's i7-11800H at 45W runs at a base clock of 2.3GHz, while AMD's Ryzen 7 5800H has a base clock of 3.2GHz at the same 45W. That's almost 1GHz faster, using the same power.

Intel has no choice but to boost their cores as fast as possible just to compete with AMD. This is why the boost clocks are so high and so extreme.

This is finally coming to the surface with Intel's desktop chips on B560 motherboards. Most can't run the high-power chips, because they can only put out approximately 100W while the boost behaviour of a nominal 65W chip draws more than 125W. They can't run any of Intel's high-power chips within specification, and they throttle hard.
 
PL2 for Tiger Lake H is 109W for 8 cores. 115W for a 6+8 configuration isn't that much more. The performance will determine whether it is worth it or not. If it brings a new level of performance to laptops, ehh .. whatever. If it needs that much power just to compete with AMD, that's a problem. Either way, I doubt there are that many people in the market for that many cores in a laptop in the first place.
 
It's even worse than this article makes it out to be. Intel's current 11th-gen laptop parts are nowhere even close to Ryzen for power efficiency. Look at the base clocks for comparison:
Intel's i7-11800H at 45W runs at a base clock of 2.3GHz, while AMD's Ryzen 7 5800H has a base clock of 3.2GHz at the same 45W. That's almost 1GHz faster, using the same power.

Intel has no choice but to boost their cores as fast as possible just to compete with AMD. This is why the boost clocks are so high and so extreme.

This is finally coming to the surface with Intel's desktop chips on B560 motherboards. Most can't run the high-power chips, because they can only put out approximately 100W while the boost behaviour of a nominal 65W chip draws more than 125W. They can't run any of Intel's high-power chips within specification, and they throttle hard.
Most entry-level B560 boards can't run Intel's unlocked CPUs without throttling down due to poor VRMs. Those boards were made for locked CPUs. I've yet to meet anyone who's purchased a $90-$100 B560 board and paired it with a $560 i9-11900K.
 
Intel's i7-11800H at 45W runs at a base clock of 2.3GHz, while AMD's Ryzen 7 5800H has a base clock of 3.2GHz at the same 45W. That's almost 1GHz faster, using the same power.
Because Intel has to account for AVX-512 power draw, while AMD uses an inferior and much less power-hungry version of AVX.

Also, AMD hides a big amount of power draw by making it a motherboard thing.
As Steve found out, 'Applications with high thread counts, and/or “heavy” threads, can encounter PPT limits', so AMD throttles as well if it isn't provided with enough power from the motherboard (a quick sketch of the default limits follows the quote below):
https://www.gamersnexus.net/guides/3491-explaining-precision-boost-overdrive-benchmarks-auto-oc
Package Power Tracking (“PPT”):
The PPT threshold is the allowed socket power consumption permitted across the voltage rails supplying the socket. Applications with high thread counts, and/or “heavy” threads, can encounter PPT limits that can be alleviated with a raised PPT limit.

  1. Default for Socket AM4 is at least 142W on motherboards rated for 105W TDP processors.
  2. Default for Socket AM4 is at least 88W on motherboards rated for 65W TDP processors.
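For what it's worth, those two defaults line up with a roughly 1.35x ratio between rated TDP and the default socket PPT limit. Here is a minimal sketch of that relationship, with the 1.35 factor inferred from the two quoted pairs rather than taken from any official AMD formula:

```python
# Rough sketch: AMD's default AM4 PPT limit vs. rated TDP.
# The ~1.35x factor is inferred from the 142W/105W and 88W/65W pairs
# quoted above; it is not an official AMD formula.
def default_ppt(tdp_watts: float, factor: float = 1.35) -> int:
    """Approximate default socket power (PPT) for a given rated TDP."""
    return round(tdp_watts * factor)

for tdp in (105, 65):
    print(f"{tdp}W TDP -> ~{default_ppt(tdp)}W PPT")
# 105W TDP -> ~142W PPT, 65W TDP -> ~88W PPT
```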
 
Up to 115W TDP (Intel) on mobile chips. Wow. Those chips will be a big challenge to cool, especially with a GPU on the side. It could make thin laptops even more expensive, with so much more effort needing to go into cooling. I hope, then, that the 2+8+2/4+8+2 parts are great performers.
Given that the Atom cores will run cooler, what matters in the end is the wattage, right?
Imagine the desktop versions in desktop-replacement laptops; they're going to be behemoths. I know there isn't a big market for desktop-replacement laptops, but I am one of those in that category, requiring a desktop and a desktop replacement, sadly.
 
Because Intel has to account for AVX-512 power draw, while AMD uses an inferior and much less power-hungry version of AVX.

Also, AMD hides a big amount of power draw by making it a motherboard thing.
As Steve found out, 'Applications with high thread counts, and/or “heavy” threads, can encounter PPT limits', so AMD throttles as well if it isn't provided with enough power from the motherboard:
https://www.gamersnexus.net/guides/3491-explaining-precision-boost-overdrive-benchmarks-auto-oc

Yeah, there should be a standard for how a CPU's TDP is presented to the consumer. AMD uses 'heat' to measure its TDP, which is frankly stupid.
But yeah, AMD doesn't outright say that 105W or 65W is its maximum draw, and neither does Intel. I do, though, prefer the fact that AMD only jumps to 142W with PBO overclocking, for the performance it gives, compared to >190W (up to what, 230W for the 11900K?) for Intel's 10th and 11th gen while showing a 125W TDP. That's a huge gap for Intel. Though Intel running power-hungry or hot is not new info.
I don't get why both don't just publish the maximum power they can draw. It's going to be published by third parties and discussed by consumers anyway.
 
I do, though, prefer the fact that AMD only jumps to 142W with PBO overclocking, for the performance it gives, compared to >190W (up to what, 230W for the 11900K?) for Intel's 10th and 11th gen while showing a 125W TDP. That's a huge gap for Intel. Though Intel running power-hungry or hot is not new info.
You have to completely disable every safety feature for the motherboard to even give you that much power, and then you need to run special software whose sole purpose is to draw as much power as possible. It has nothing to do with how any CPU is supposed to be used, and that's why they aren't giving you a completely crazy and useless number, while the media tries their hardest to make you believe that somehow this is relevant to you.

I don't get why both don't just publish the maximum power they can draw. It's going to be published by third parties and discussed by consumers anyway.
Because that's just a hardware spec that would be, and maybe even is, buried somewhere next to the operating humidity and the min/max temperatures.
 
AMD uses 'heat' to measure its TDP, which is frankly stupid.
TDP, or "Thermal Design Power", is intended as a measure of the heat output of a processor, to help give system builders an idea of what sort of cooling will be required. But since power is being converted to heat, it can also be used as an approximation of how much power a processor will be drawing under load, assuming the provided numbers are reasonably accurate.

I would say AMD's TDP numbers are fairly indicative of how much power their processors will draw under a typical all-core load. Not so much Intel's numbers though, since they switched to reporting their TDP at base clocks some years back, which is not particularly meaningful for modern systems, which by design will boost above base clocks even with all cores loaded. They do have a more meaningful PL2 number as well to cover power draw when boosting, but it's generally not advertised, undoubtedly to give the false impression that their processors are more efficient than they really are.
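For anyone unfamiliar with how that PL1/PL2 split plays out, here is a very simplified sketch of the boost-budget behaviour; real silicon tracks an exponentially weighted average of package power against PL1 over tau rather than a hard timer, and the wattages and tau below are illustrative values, not taken from any particular SKU:

```python
# Very simplified model of Intel's PL1/PL2/tau power limiting.
# Real hardware uses an exponentially weighted moving average of package
# power rather than this hard cutover; the numbers are illustrative only.
PL1_W = 65    # long-term limit, i.e. the advertised "TDP"
PL2_W = 154   # short-term boost limit
TAU_S = 28    # seconds the chip may run above PL1

def package_power_cap(t_seconds: float) -> int:
    """Power cap in watts at time t under a sustained all-core load."""
    return PL2_W if t_seconds < TAU_S else PL1_W

for t in (0, 10, 27, 28, 60):
    print(f"t = {t:>2}s -> cap = {package_power_cap(t)}W")
# The chip boosts near PL2 at first, then settles back to PL1 (the
# advertised TDP) once tau expires, which is why sustained all-core
# clocks end up much closer to the base frequency.
```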
 
I would say AMD's TDP numbers are fairly indicative of how much power their processors will draw under a typical all-core load. Not so much Intel's numbers though, since they switched to reporting their TDP at base clocks some years back, which is not particularly meaningful for modern systems, which by design will boost above base clocks even with all cores loaded. They do have a more meaningful PL2 number as well to cover power draw when boosting, but it's generally not advertised, undoubtedly to give the false impression that their processors are more efficient than they really are.
You are missing the largest part of Intel's definition of TDP.
It's not 'TDP at base clocks'; it is 'TDP at base clocks with all cores active under an Intel-defined, high-complexity workload'.
Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at Base Frequency with all cores active under an Intel-defined, high-complexity workload. Refer to Datasheet for thermal solution requirements.
And that workload must contain AVX-512 at full blast, because it's way higher than the power draw any useful software reaches.
Hardware Unboxed showed this beautifully in a recent video.
The 11400 is a 65W TDP CPU with a base clock of 2.6GHz. Running Blender, which is one of the apps with the highest power draw that still produces something more than just heat, you get 3.3 to 3.5GHz, which is roughly 27 to 35% above base clock depending on the board.

The 11900K is a 125W TDP CPU with a base clock of 3.5GHz; running at -50% TDP, it loses only about 10% of its base clock, which is still 40% better than it should run by your logic.
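To make those percentages explicit, here is a quick arithmetic check using only the clock and TDP figures quoted above; the observed clocks are as reported in the post, nothing here is measured independently:

```python
# Quick arithmetic check of the two claims above, using the quoted numbers.
base_11400 = 2.6                 # GHz, rated base clock
blender_clocks = (3.3, 3.5)      # GHz observed at the 65W limit, per the charts cited
for f in blender_clocks:
    uplift = 100 * (f / base_11400 - 1)
    print(f"11400: {f}GHz is about {uplift:.0f}% above base at its rated TDP")

base_11900k = 3.5                # GHz, rated base clock
at_half_tdp = base_11900k * 0.9  # "-10% of base clock" at -50% TDP
print(f"11900K: ~{at_half_tdp:.2f}GHz at half of the rated 125W, only a 10% clock deficit")
```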


And again, 3D rendering is basically the highest power draw you will encounter in the wild; even apps used for stress testing and stability, like Prime95, will use much less power, allowing the CPUs to run even higher clocks even when locked down to TDP.

And with Intel, 65W TDP means 65W TDP, while for Ryzen, 65W TDP means 88W at the motherboard.
The PPT threshold is the allowed socket power consumption permitted across the voltage rails supplying the socket. Applications with high thread counts, and/or “heavy” threads, can encounter PPT limits that can be alleviated with a raised PPT limit.

  1. Default for Socket AM4 is at least 142W on motherboards rated for 105W TDP processors.
  2. Default for Socket AM4 is at least 88W on motherboards rated for 65W TDP processors.
 
Well, it does indeed look bad. But we also need to see the performance levels and how negative the effect of a lower TDP will be. It could look worse than it is. Or it could be that Intel found themselves in the same situation as Radeon did with Vega, where they made a good architecture but pushed it way past its efficiency curve to meet performance targets, so efficiency is out the window, but the chips undervolt pretty well. Or it might be that Intel didn't really fix all the issues with 10nm and is using an overkill amount of power to improve yields. Who knows; it's pure guessing on my side.

But I do think the performance level will also be important here. Yeah, it doesn't look good.
 
PPT is the maximum amount of power that can be delivered to the socket. So while you can argue a 65W TDP CPU with an 88W PPT limit is "hiding power draw", it's really no different than Intel saying an i7-10700K is a 95W CPU but it'll happily go up to 147W.
With Intel, you (or the motherboard maker) have to disable the stock settings by removing the power limits to reach this power draw for more than a few seconds. With Ryzen, 30% more at the motherboard is the default setting; that isn't something the motherboard makers add, it is the defined stock setting, and motherboard makers can and do increase PPT even more.
 
With Intel, you (or the motherboard maker) have to disable the stock settings by removing the power limits to reach this power draw for more than a few seconds. With Ryzen, 30% more at the motherboard is the default setting; that isn't something the motherboard makers add, it is the defined stock setting, and motherboard makers can and do increase PPT even more.

Just as Intel has its power limits, tau control, etc. as part of how its chips fundamentally function with regard to power (and in Intel's case it allows motherboard companies to freely manage those and build the motherboard power phases and so on accordingly), PPT in AMD's case is just how the chip is made to function, and the motherboards are built to those specifications. Both CPUs have to request and draw power from the motherboard. I just learned that enabling PBO means you have an indefinite PPT, but in the end that still isn't much different from some Z590 boards already shipping with PL and tau set to indefinite.

Why are you going on about 'power' use being hidden, though, lol... If one here is not hidden, neither is the other's.
In the end, with regard to power, what matters to the consumer is whether their cooler is good enough to get good performance, and whether that solution is going to cost more compared to another processor with lower cooling requirements. Most people would think about investing $30+ in a better cooler, but not take into account the roughly $10 a month (depending on use) that the more power-hungry processor might end up costing. And that is why power use is even discussed, isn't it? It's not a veil being put over consumers, lol. The common guy doesn't care; it's just some geeks who like to spend their spare time reading up on tech.
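To put a rough number on that running-cost point, here is a back-of-the-envelope sketch; the extra wattage, daily hours of load, and electricity price are all assumed values, chosen only to show the shape of the calculation:

```python
# Back-of-the-envelope monthly running cost of a more power-hungry CPU.
# Every input here is an assumption chosen purely for illustration.
extra_watts = 100        # additional draw under load vs. a more efficient chip
hours_per_day = 8        # hours of heavy load per day
days_per_month = 30
price_per_kwh = 0.15     # USD; varies a lot by region

extra_kwh = extra_watts * hours_per_day * days_per_month / 1000
print(f"~{extra_kwh:.0f} kWh extra per month -> about ${extra_kwh * price_per_kwh:.2f}")
# ~24 kWh extra per month -> about $3.60; heavier use or pricier electricity
# pushes it toward the ~$10/month figure mentioned above.
```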
 
And again, 3D rendering is basically the highest power draw you will encounter in the wild; even apps used for stress testing and stability, like Prime95, will use much less power, allowing the CPUs to run even higher clocks even when locked down to TDP.

And with Intel, 65W TDP means 65W TDP, while for Ryzen, 65W TDP means 88W at the motherboard.
So, you are trying to convince us that Intel is actually reporting their TDP numbers more accurately than AMD, but do so using a chart that shows the whole system power draw of a "105 watt" 5950X being 81 watts lower than that of a "125 watt" 11900K at stock settings in a rendering workload? >_>

And of course, keep in mind that the 5950X has twice the cores and threads, and completed that rendering workload nearly twice as fast, despite the 11900K system drawing 45% more power at stock, though that's more related to the inefficiency of the 11900K rather than the inaccuracy of its advertised TDP. It doesn't make for a particularly good piece of evidence, in any case.

And even when comparing the 11900K to the "105 watt" 5800X with the same number of cores, the 11900K system drew 85 watts more power at stock, while the 5800X was still over 16% faster. Intel's 20 watt higher TDP equates to 85 watts more power being drawn by the system under that all-core workload at stock settings.

And then there's those adaptive boost numbers. Ouch. Especially considering that even with adaptive boost enabled, the stock 5950X system still performed that rendering workload over 60% faster, despite the 11900K system drawing over 2.3 times as much power. I guess it managed to roughly tie the stock 5800X in terms of performance with that enabled, but while drawing an additional 244 watts. Again, that may be less about TDP, but it's probably not a good idea to use 11900K data when trying to convince people that Intel's current processors are not bad compared to AMD's in terms of power draw and conforming to their advertised TDPs.

And checking Techpowerup's 11400F review, their test system with that processor installed drew more power at stock than their 5600X system at both single-threaded and multi-threaded workloads, as well as at idle, despite both having a "65 watt" TDP. Those are all whole-system power draw measurements, so I don't see where you get the idea that AMD is somehow "hiding" CPU power use on the motherboard. And again, the stock 5600X offered over 1.5 times the performance of the stock 11400F at that particular rendering workload, despite drawing less power, so the 11400F system didn't come anywhere remotely close to it in terms of efficiency.
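To make that efficiency gap concrete: multiplying the two approximate factors described above (about 45% more system power, and roughly twice the render time) gives the energy ratio directly. A minimal sketch using those rounded factors rather than exact chart readings:

```python
# Rough energy-per-render comparison built only from the approximate
# relationships described above (~45% more system power, ~2x the render time).
power_ratio = 1.45   # 11900K system power / 5950X system power (approx.)
time_ratio = 2.0     # 11900K render time / 5950X render time (approx.)

energy_ratio = power_ratio * time_ratio   # energy = power x time
print(f"11900K system energy for the same render: ~{energy_ratio:.1f}x the 5950X's")
# ~2.9x the energy, i.e. well under half the efficiency on that workload.
```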
 
You are missing the largest part of Intel's definition of TDP.
It's not 'TDP at base clocks'; it is 'TDP at base clocks with all cores active under an Intel-defined, high-complexity workload'.
A link is needed for this. Intel sets its TDP at base clocks, NOT at base clocks with all cores active: 'For any given processor, Intel will guarantee both a rated frequency to run at (known as the base frequency) for a given power, which is the rated TDP. This means that a processor like the 65W Core i7-8700, which has a base frequency of 3.2 GHz and a turbo of 4.7 GHz, is only guaranteed to be at or below 65W when the processor is running at 3.2 GHz. Intel does not guarantee any level of performance above this 3.2 GHz / 65W value.'

That is from this AnandTech article; while it is a couple of years old, I doubt Intel has changed the behavior of its CPUs much, if at all:
https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo
I would trust AT over anything terrylaze says, as most of the time he cherry-picks specific graphs and tests that 'show' he is right.

Intel uses more power overall than AMD, plain and simple.

Cryoburner, that's just him cherry-picking benchmarks and graphs that he then uses to prove his points; after all, he loves Intel to no end and believes Intel's PR BS.
 
So, you are trying to convince us that Intel is actually reporting their TDP numbers more accurately than AMD, but do so using a chart that shows the whole system power draw of a "105 watt" 5950X being 81 watts lower than that of a "125 watt" 11900K at stock settings in a rendering workload? >_>
The 11900K's TDP during boost is stated as 251W at stock, and the whole system draws 260W, so either the rest of the system uses just 9W or the 11900K uses much less power than stated when boosting.
The 5950X, a 105W TDP CPU, uses 179W whole-system for rendering.
In Prime95 the 11900K uses 203W (251W rated) and the 5950X uses 194W (105W rated?? Does AMD state a boost TDP anywhere?). It's not about the difference between the two, but about how far away from the stated numbers they are.
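To frame the 'how far from the stated numbers' point as ratios, here is a quick sketch using only the wattages quoted in this post; as quoted, some are whole-system readings, so the CPU-only ratios would be somewhat lower:

```python
# "Measured vs. stated" ratios, using only the wattages quoted above.
quoted = [
    ("11900K, Blender", 260, 251),   # whole-system draw vs. 251W boost (PL2) rating
    ("5950X,  Blender", 179, 105),   # whole-system draw vs. 105W TDP rating
    ("11900K, Prime95", 203, 251),
    ("5950X,  Prime95", 194, 105),
]
for name, measured, rated in quoted:
    print(f"{name}: {measured}W vs. {rated}W stated -> {measured / rated:.2f}x")
```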
And checking Techpowerup's 11400F review, their test system with that processor installed drew more power at stock than their 5600X system at both single-threaded and multi-threaded workloads, as well as at idle, despite both having a "65 watt" TDP. Those are all whole-system power draw measurements, so I don't see where you get the idea that AMD is somehow "hiding" CPU power use on the motherboard. And again, the stock 5600X offered over 1.5 times the performance of the stock 11400F at that particular rendering workload, despite drawing less power, so the 11400F system didn't come anywhere remotely close to it in terms of efficiency.
Yeah, I looked at it, and the highest difference is 4W, with one of them in favor of the 11400.
Sure, in all the other benches the 11400 is behind, but again by a maximum difference of 4W (quick check of the listed pairs below).
If Intel really uses so much more power than the stated 65W, how can it possibly be so close in total system power?!
https://www.techpowerup.com/review/intel-core-i5-11400f/20.html
50W vs. 53W
75W vs. 79W
126W vs. 130W
134W vs. 130W
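A quick check of those four pairs as listed; which CPU sits on which side of each pair is left as given in the review, since the point here is only the size of the gaps:

```python
# Gaps within the four whole-system pairs listed above (watts).
pairs = [(50, 53), (75, 79), (126, 130), (134, 130)]
gaps = [b - a for a, b in pairs]
print("per-pair gaps:", gaps)                           # [3, 4, 4, -4]
print("largest gap:", max(abs(g) for g in gaps), "W")   # 4W, with one pair reversed
```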
A link is needed for this. Intel sets its TDP at base clocks, NOT at base clocks with all cores active: 'For any given processor, Intel will guarantee both a rated frequency to run at (known as the base frequency) for a given power, which is the rated TDP. This means that a processor like the 65W Core i7-8700, which has a base frequency of 3.2 GHz and a turbo of 4.7 GHz, is only guaranteed to be at or below 65W when the processor is running at 3.2 GHz. Intel does not guarantee any level of performance above this 3.2 GHz / 65W value.'

That is from this AnandTech article; while it is a couple of years old, I doubt Intel has changed the behavior of its CPUs much, if at all:
https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo
I would trust AT over anything terrylaze says, as most of the time he cherry-picks specific graphs and tests that 'show' he is right.

Intel uses more power overall than AMD, plain and simple.

Cryoburner, that's just him cherry-picking benchmarks and graphs that he then uses to prove his points; after all, he loves Intel to no end and believes Intel's PR BS.
Yeah, that's really cute: you accuse me of cherry-picking, but then, instead of looking up what Intel states as TDP, you cherry-pick the definition from a third-party, two-year-old article because that's more to your liking...
I didn't give a link because that's what Intel states on their site; it would be like linking to the ARK page every time you quote any Intel CPU.

Also, the definition that AnandTech gives is exactly the same as Intel's, just worded in the most confusing way possible to make it sound bad.

Both definitions mean that with the most demanding software possible, you will only get base clocks at the rated TDP.
 
Yeah, that's really cute: you accuse me of cherry-picking, but then, instead of looking up what Intel states
I think it's even cuter that you believe Intel over an unbiased third party.
you cherry-pick the definition from a third-party, two-year-old article because that's more to your liking...
You mean the same thing you do?

And I will STILL believe AT over anything you say or post, as you tend to twist and cherry-pick a graph or phrase that shows you are right in that one use case, and you have shown you worship Intel to no end and continue to believe what they say. Your understanding of Intel's TDP pretty much goes against what most others say; as the AT article shows, it's at base clocks. Most other sites all say the same thing: Intel uses more power overall than AMD. You seem to be pretty much the only one who says otherwise. Where do you think 'waste of sand/silicon', 'power hungry', 'pathetic', etc. come from when talking about Intel's CPUs as of late?

From AT's review of the 11700K: "Rocket Lake also gets you PCIe 4.0, however users might feel that is a small add-in when AMD has PCIe 4.0, lower power, and better general performance for the same price."

Even Tom's own review says the same: "While the Core i5-11600K may not claim outright supremacy in all benchmarks, its mixture of price and performance makes it a solid buy if you're willing to overlook the higher power consumption." And: "For gamers, the Core i9-11900K would have to show a more appreciable advantage to justify its price tag and power consumption."

Against

  • High pricing
  • High power consumption
  • Needs CPU cooler
  • Eight cores
  • Lackluster threaded performance
  • Gear 2 Memory Mode
  • Limited PCIe 4.0 Support
Even GN's reviews of the 11700K and 11900K were NOT very nice towards Intel, and said the opposite of what you keep claiming, that Intel uses less power overall than AMD, which is quite false, especially if the motherboard removes the tau limits. This is talked about in the 11700K review at approximately 20 minutes in. The GN review of the 11900K also didn't have much nice to say about it. Overall, Intel uses more power, even compared to AMD CPUs that have more cores.

This isn't cherry-picking; these are facts from the conclusions of reviews of Intel's CPUs.