News Intel Core i9-14900KS alleged benchmarks leaked — up to 6.20 GHz and 410W power draw

Status
Not open for further replies.
But really, 410 W sounds about right for something pre-production, set high enough to cover as wide a range of chips as possible. AMD is even worse with its turbo voltages.
If pre-production has anything to do with it, then the final product will be able to go even higher.
How many watts (which is volts times amps) a die can survive without exploding is a sign of how good the node is.
In simple terms: if a die explodes at low voltage and current settings, i.e. at low wattage, that would be a bad thing.
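For what it's worth, the watts-equals-volts-times-amps point is just basic electrical arithmetic; here's a trivial sketch with made-up voltage/current figures (none of these numbers come from Intel):

```python
def power_watts(volts: float, amps: float) -> float:
    """Electrical power dissipated by the die: P = V * I."""
    return volts * amps

# Hypothetical operating points: the same ~410 W can come from
# different voltage/current mixes, which stress the die differently.
print(power_watts(1.45, 283.0))  # ~410 W at a high voltage
print(power_watts(1.25, 328.0))  # ~410 W at lower voltage, higher current
```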
 
Intel is the best at WiFi/BT, LAN, USB/Thunderbolt, and their drivers.
Are you joking?
There were severe issues with several recent generations of Intel LAN (while Realtek ones worked without any issues).
Intel Wi-Fi AX200 refused to connect at Wi-Fi 6 in some countries for years. The "really awful" RZ608 worked flawlessly for me.

Nowadays you would be better off with Realtek/MediaTek than with Intel.
 
  • Like
Reactions: bit_user
Are you joking?
There were severe issues with several recent generations of Intel LAN (while Realtek ones worked without any issues).
Spot on.

Not only is the I225-V buggy, but the problems didn't end there. It turns out even the subsequent I226-V has bugs!

SMH. Oh, how the mighty have fallen... never thought I'd see the day when Realtek seemed a safer option than an Intel NIC.
 
  • Like
Reactions: Nyara and CelicaGT
Spot on.

Not only is the I225-V buggy, but the problems didn't end there. It turns out even the subsequent I226-V has bugs!

SMH. Oh, how the mighty have fallen... never thought I'd see the day when Realtek seemed a safer option than an Intel NIC.
I have the I225-V on my B550 board. It's a piece of *.
 
Last edited by a moderator:
  • Like
Reactions: Nyara and bit_user
Spot on.

Not only is the I225-V buggy, but the problems didn't end there. It turns out even the subsequent I226-V has bugs!

SMH. Oh, how the mighty have fallen... never thought I'd see the day when Realtek seemed a safer option than an Intel NIC.
The i226-V problem is a Windows driver bug of some sort; the hardware itself is fine.
 
Nope. The chip still has hardware bugs. The driver "fixes" only consist of mitigations that are achieved by disabling features.
The problems with the i226-V only exist on Windows. They supposedly resolved everything with the latest Win10/11 drivers, but nothing I have with i226-V runs Windows so I can't check easily. Might try it once I get around to setting up my new router box.
 
The problems with the i226-V only exist on Windows. They supposedly resolved everything with the latest Win10/11 drivers, but nothing I have with i226-V runs Windows so I can't check easily. Might try it once I get around to setting up my new router box.
If you were very invested in the matter, it'd be interesting to check the git log for the Linux driver. I'll bet we would find that it has the same sort of workarounds that went into the Windows driver.
 
Now, what cooler currently on the market can handle 400+ watts? No water cooler from anyone I can think of can even contemplate cooling something this hot. I run a 13900K with the latest EK cooler, and it doesn't matter; that thing just wants to live at thermal throttle. The only way I've found to keep it cool is to limit its power draw. I lock it in at 225 W and still use a Lasko blower fan to keep temps reasonable.

The only thing I can think of to handle 400+ W is a compressor-based phase-change cooler (aka a window air conditioner), but then you have all that condensation to overcome, not to mention the increased power draw of the cooler itself. I mean, just how is anybody going to cool this thing at 400+ watts?
 
  • Like
Reactions: bit_user
Now, what cooler currently on the market can handle 400+ watts? No water cooler from anyone I can think of can even contemplate cooling something this hot. I run a 13900K with the latest EK cooler, and it doesn't matter; that thing just wants to live at thermal throttle. The only way I've found to keep it cool is to limit its power draw. I lock it in at 225 W and still use a Lasko blower fan to keep temps reasonable.

The only thing I can think of to handle 400+ W is a compressor-based phase-change cooler (aka a window air conditioner), but then you have all that condensation to overcome, not to mention the increased power draw of the cooler itself. I mean, just how is anybody going to cool this thing at 400+ watts?
You do not; it just thermal throttles on sustained tasks or bugged spikes, and yes, that is not very stable. Raptor Lake Refresh already has the new voltage-management technology, but it is rather conservative; a small manual undervolt should keep it at around 350 W without performance loss and improve stability by reducing thermal throttling.

Additionally, water-cooling loops are rated rather conservatively, since each person installs their cooler with a varying degree of mounting pressure, so vendors assume the most careful builders in order to avoid unnecessary warranty claims. Intel CPUs aren't really that delicate, so you can press the cooler onto the CPU quite a bit harder, and with that plus a good paste, assuming your case is moving air properly, you can actually prevent thermal throttling.

Of course, a motherboard with strong auxiliary cooling helps too, but more importantly one ready for Raptor Lake Refresh, as older LGA1700 boards have a bending issue where the socket's retention mechanism bows the CPU, messing up contact with the cooler; you can also buy an aftermarket contact frame (voiding the warranty, yes) for older ones.
 
Last edited:
It's true that AMD clocked Zen 4 too far above its efficiency window. That can be fixed by restricting its boost clocks a little more.


Dude, you're cracking me up, here!
🤣

efficiency-singlethread.png

If you want to see Zen 4 demonstrate good efficiency, a multithreaded workload + 65/88 W power limit will force it into its peak efficiency window. Just look at the 7900 strut its stuff!
efficiency-multithread.png
This whole debate is caused by a fundamental misunderstanding of how to measure efficiency. I hope we all know voltage and clock speeds don't scale linearly; therefore, the CPU that runs at lower power will have a fundamental advantage. That's why the entire T lineup of Intel CPUs tops the efficiency charts. There is not a single AMD CPU in the top 10 of the efficiency chart; all the spots are taken by Intel with its 35 W parts.

And that's why you test efficiency at iso-wattage. The 14900KS is going to be the most efficient Intel CPU ever released, but people are crying about how inefficient it is because they remove the power limits and let it rip at 500 watts. Why you would do that if you actually care about efficiency is beyond me. I guess because you want to complain, maybe?

Iso-wattage testing is all that matters, and when you do that, Intel CPUs in general are much more efficient than their AMD counterparts. The only exception is the 7950X / 7950X3D, which is 5-7% more efficient than the 14900K. In every other segment, it's no bueno.
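To make the iso-wattage argument concrete, here's a toy calculation (the scores and the 125 W cap are invented for illustration, not measured data):

```python
def points_per_watt(score: float, watts: float) -> float:
    """Efficiency as these charts define it: benchmark score per watt."""
    return score / watts

# Two hypothetical CPUs capped at the same 125 W limit:
cpu_a = points_per_watt(score=30000, watts=125)  # 240.0 points/W
cpu_b = points_per_watt(score=27500, watts=125)  # 220.0 points/W

# At iso-wattage the comparison is fair: same energy in, compare work out.
print(cpu_a / cpu_b)  # relative efficiency advantage of cpu_a
```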
 
I'm going to buy it just because of how insanely efficient it will be. Downclock it to 5.5 GHz and it will require 100 watts less than my 13900K. That's insane.
All these charts above are not measuring "power" efficiency; they are only attempting to measure work done per watt. To measure actual power efficiency, one must account for the thermal byproduct of resistance in the circuit. If this chip could do 100% of its designed work while generating zero excess heat, it would be 100% power efficient. But since it literally gets hot enough to boil water, it is far from a power-efficient chip!

It's wasting a lot of energy in the form of heat.
 
All these charts above are not measuring "power" efficiency; they are only attempting to measure work done per watt. To measure actual power efficiency, one must account for the thermal byproduct of resistance in the circuit. If this chip could do 100% of its designed work while generating zero excess heat, it would be 100% power efficient. But since it literally gets hot enough to boil water, it is far from a power-efficient chip!

It's wasting a lot of energy in the form of heat.
And I'm saying that work done per watt should be measured either at iso-work or iso-wattage. Any other metric is useless. We don't compare any other devices like that. In fan reviews, does the reviewer max the fans out at 100% and then compare temperatures? I mean, they do, but then they follow up with the ACTUAL important test, which is at iso-dBA.
 
And I'm saying that work done per watt should be measured either at iso-work or iso-wattage. Any other metric is useless. We don't compare any other devices like that. In fan reviews, does the reviewer max the fans out at 100% and then compare temperatures? I mean, they do, but then they follow up with the ACTUAL important test, which is at iso-dBA.
If one is to measure power efficiency on anything, this is how it's done. All PSUs since the dawn of time have been measured this way, so why would a PC fan be any different? There is of course more to it than just thermal waste, but the thermals are of extreme importance in any real or meaningful calculation of power efficiency.
 
All these charts above are not measuring "power" efficiency; they are only attempting to measure work done per watt. To measure actual power efficiency, one must account for the thermal byproduct of resistance in the circuit. If this chip could do 100% of its designed work while generating zero excess heat, it would be 100% power efficient. But since it literally gets hot enough to boil water, it is far from a power-efficient chip!

It's wasting a lot of energy in the form of heat.
Where exactly do you see the difference?!
We know the total power input, and we get a number for work done, which is basically the useful output; divide the useful output by the total input and you get power efficiency.

Thermals matter for cooling efficiency, which is a different thing; relevant, but different.
And cooling efficiency is also much better on Intel; even when drawing a lot more power, it stays a lot cooler.
https://www.anandtech.com/show/17641/lighter-touch-cpu-power-scaling-13900k-7950x/3
sNUwgPN.jpg
 
Where exactly do you see the difference?!
We know the total power input, and we get a number for work done, which is basically the useful output; divide the useful output by the total input and you get power efficiency.

Thermals matter for cooling efficiency, which is a different thing; relevant, but different.
And cooling efficiency is also much better on Intel; even when drawing a lot more power, it stays a lot cooler.
https://www.anandtech.com/show/17641/lighter-touch-cpu-power-scaling-13900k-7950x/3
sNUwgPN.jpg
All these charts are only measuring how much power it takes to do X amount of work. They have nothing to do with power efficiency, as they do not account for thermal waste. They are not "power efficiency" results; they are only CPU performance-per-watt-consumed results. In other words, they are just more CPU benchmark scores that have nothing to do with actual power efficiency. That's all I'm saying.
 
Where exactly do you see the difference?!
We know the total power input, and we get a number for work done, which is basically the useful output; divide the useful output by the total input and you get power efficiency.

Thermals matter for cooling efficiency, which is a different thing; relevant, but different.
And cooling efficiency is also much better on Intel; even when drawing a lot more power, it stays a lot cooler.
https://www.anandtech.com/show/17641/lighter-touch-cpu-power-scaling-13900k-7950x/3
sNUwgPN.jpg
Wow, peak power is way off for AMD. What happened there? Is it because the SVID interface that controls power is on the motherboard instead of on the CPU, as it is on Intel? Those numbers are way off.
 
Wow, peak power is way off for AMD. What happened there? Is it because the SVID interface that controls power is on the motherboard instead of on the CPU, as it is on Intel? Those numbers are way off.
AMD has always had big peaks (both CPU and GPU), but their sustained average is a lot lower. In power efficiency (vs. Intel), AMD absolutely dominates at the moment, but it is true that you cannot cheap out on the power supply just because they draw less on average, since their peaks are high.
 
Last edited:
  • Like
Reactions: bit_user
All these charts are only measuring how much power it takes to do X amount of work. They have nothing to do with power efficiency, as they do not account for thermal waste. They are not "power efficiency" results; they are only CPU performance-per-watt-consumed results. In other words, they are just more CPU benchmark scores that have nothing to do with actual power efficiency. That's all I'm saying.
Again, where do you see the difference? You can measure the wasted energy by looking at how hot the chip gets, i.e. how much energy gets radiated away, or you can look at how much of the energy is being converted into actual work.
The benches do the second thing.
Wow, peak power is way off for AMD. What happened there? Is it because the SVID interface that controls power is on the motherboard instead of on the CPU, as it is on Intel? Those numbers are way off.
It's because AMD allows roughly 35% more socket power (PPT) than the power the CPU nominally reports to the rest of the system (TDP).
You have to measure the physical power straight from the CPU pins to get actual numbers.
https://gamersnexus.net/guides/3491-explaining-precision-boost-overdrive-benchmarks-auto-oc
Package Power Tracking (“PPT”): The PPT threshold is the allowed socket power consumption permitted across the voltage rails supplying the socket. Applications with high thread counts, and/or “heavy” threads, can encounter PPT limits that can be alleviated with a raised PPT limit.
  1. Default for Socket AM4 is at least 142W on motherboards rated for 105W TDP processors.
  2. Default for Socket AM4 is at least 88W on motherboards rated for 65W TDP processors.
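The two defaults quoted above both follow from AMD's roughly 1.35x PPT-to-TDP ratio for Socket AM4; a quick sanity check (this ratio is approximate, not an official formula):

```python
# PPT is roughly 1.35x the nominal TDP on Socket AM4.
PPT_RATIO = 1.35

def ppt_limit(tdp_watts: float) -> float:
    """Approximate default socket power limit for a given TDP."""
    return tdp_watts * PPT_RATIO

print(ppt_limit(105))  # ~141.75 W, matching the "at least 142 W" default
print(ppt_limit(65))   # ~87.75 W, matching the "at least 88 W" default
```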
 
AMD has always had big peaks (both CPU and GPU), but their sustained average is a lot lower. In power efficiency (vs. Intel), AMD absolutely dominates at the moment, but it is true that you cannot cheap out on the power supply just because they draw less on average, since their peaks are high.
Only for people who have no idea about computers, and those people generally won't care about it.
If somebody cares, they can set the power limit to 200 W; the average power draw over 47 applications is 141 W when limited to 200 W, against 128 W for the 7950X. That's still a 10% difference in favor of AMD, but it's also just 13 W, and nobody will argue that 13 W is going to make any kind of difference; nobody in the whole world would call 13 W "dominating".

https://www.techpowerup.com/review/...ke-tested-at-power-limits-down-to-35-w/8.html
D5TipA9.jpg
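The 10%/13 W figures in that post check out arithmetically (using the averages quoted from the TechPowerUp data):

```python
intel_avg_w = 141  # 13900K limited to 200 W, average over 47 applications
amd_avg_w = 128    # 7950X, same application set

delta_w = intel_avg_w - amd_avg_w
relative_pct = delta_w / amd_avg_w * 100

print(delta_w)              # 13 W absolute gap
print(round(relative_pct))  # 10 percent relative gap, in AMD's favor
```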
 
the power efficiency (vs Intel) is absolutely dominated by AMD at the moment
That is not true? Like, at all?

At iso-wattage, Intel has the lead in both performance and efficiency in most segments: R5 vs. i5, R7 vs. i7, etc. Only the 7950X / 7950X3D has a small 5-7% lead over the 14900K. In the other segments it's not even close.
 
Again, where do you see the difference? You can measure the wasted energy by looking at how hot the chip gets, i.e. how much energy gets radiated away, or you can look at how much of the energy is being converted into actual work.
The benches do the second thing.

A 400-watt CPU that wastes 200 watts as heat is only 50% efficient. You were talking about how efficient this chip is and using benchmarks to make your point, but it's an apples-to-oranges comparison. "Power efficiency" must take into account thermal waste. All these charts and graphs just keep showing how much work is done per watt consumed; they have nothing to do with efficiency and should not be titled as such.

Efficiency is not a few points in some arbitrary CPU benchmark, but rather a measurement of wasted energy as a percentage. That is the difference.
 
A 400-watt CPU that wastes 200 watts as heat is only 50% efficient. You were talking about how efficient this chip is and using benchmarks to make your point, but it's an apples-to-oranges comparison. "Power efficiency" must take into account thermal waste. All these charts and graphs just keep showing how much work is done per watt consumed; they have nothing to do with efficiency and should not be titled as such.

Efficiency is not a few points in some arbitrary CPU benchmark, but rather a measurement of wasted energy as a percentage. That is the difference.
A CPU turns all of its power into heat, no? What are you talking about?
 
  • Like
Reactions: bit_user