News Intel Core i9-14900KS alleged benchmarks leaked — up to 6.20 GHz and 410W power draw

That's not it. Read the review; he said he set the PPT numbers in the BIOS.
He said that he did restrict PPT, but the fact that max power still went 30% above TDP, i.e. up to the normal PPT level, shows that the setting did nothing.
Starting with the peak power figures, it's worth noting that AMD's figures can be wide of the mark even when restricting the Package Power Tracking (PPT) limit in the firmware. For example, restricting the socket power of the 7950X to 125 W yielded a measured power consumption that was still a whopping 33% higher.
 
A 400 watt CPU that wastes 200 watts in heat is only 50% efficient.
And these numbers are based on what?!
OK, you take the 400W number from the article, but where does the 200W number come from? Or is that just an example?

A CPU that draws 400W will dissipate that 400W. A CPU cannot store excess energy, and the work done by the CPU will not transform watts into magic fairy dust; it's still electrons moving around, those electrons still use energy to do all that moving, and that energy still ends up as heat.
 
This whole debate is caused by a fundamental misunderstanding of how to measure efficiency. I hope we all know voltage and clock speeds don't scale linearly; therefore, the CPU that runs at lower power will have a fundamental advantage.
...
And that's why you test efficiency at ISO wattage.
ComputerBase did that:

However, I took issue with how they presented their data. I think this gives a much clearer sense of what's going on.

[attached chart: efficiency vs. power limit, replotted from ComputerBase's data]


Zen 4 doesn't scale up or down very well, but it's very efficient in its sweet spot.
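To make the "sweet spot" idea concrete, here is a minimal toy model (every constant below is invented, not measured silicon data): dynamic power grows roughly as C·V²·f, voltage has to rise with clock speed, and performance is treated as simply proportional to frequency, so performance per watt falls off as the clocks go up.

```python
# Toy model of why perf/W has a sweet spot: P = C * V^2 * f, and V must
# rise with frequency. All constants are invented for illustration.

def power_watts(freq_ghz, v_base=0.9, v_per_ghz=0.08, c=9.0):
    """Crude dynamic-power estimate for a hypothetical chip."""
    voltage = v_base + v_per_ghz * freq_ghz
    return c * voltage**2 * freq_ghz

for f in (3.0, 4.0, 5.0, 5.5, 6.0):
    p = power_watts(f)
    # Performance is assumed proportional to frequency (a big simplification).
    print(f"{f:.1f} GHz: {p:6.1f} W, perf/W = {f / p:.4f}")
```

Doubling the clock from 3 to 6 GHz roughly triples power in this toy model, which is the general shape real efficiency curves show.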

ISO wattage testing is all that matters, and when you do that, Intel CPUs in general are much more efficient than their AMD counterparts.
That's only true when you push Zen 4 outside of its efficiency window, which is unfortunately something that happens with the X-series CPUs.

The only exception is the 7950X / 7950X3D, which is 5-7% more efficient than the 14900K.
The main reason for that is they had to dial back the boost clocks in order to accommodate the 3D V-Cache die. This had the effect of keeping Zen 4 closer to its efficiency window. Sure, the extra cache helps, but I think you can get similar efficiency by taking a non-X3D equivalent and just limiting its clock speeds in the same way.
 
  • Like
Reactions: Nyara and TheHerald
ComputerBase did that:

However, I took issue with how they presented their data. I think this gives a much clearer sense of what's going on.
[attached chart: efficiency vs. power limit, replotted from ComputerBase's data]

Zen 4 doesn't scale up or down very well, but it's very efficient in its sweet spot.


That's only true when you push Zen 4 outside of its efficiency window, which is unfortunately something that happens with the X-series CPUs.


The main reason for that is they had to dial back the boost clocks in order to accommodate the 3D V-Cache die.
That's a nice graph, haven't seen it before. Sadly it doesn't have 14th gen, but it's okay.

So as I was saying, in most segments Intel has the efficiency lead. We need to remember, after all, that the R5 is a competitor to the i5-13600, the R7 to the i7, etc. Sure, AMD dropped prices, which changed the segments a bit, but the 7800X3D, for example, is still competing price-wise with the 13700K / 14700K. And the lead Intel has in that segment is absolutely humongous.

So I don't get why or how people keep claiming otherwise. And that's not even taking into account very light loads like browsing the web, where Intel is practically infinitely more efficient than AMD. Side by side, a 7950X peaks at 65 watts and averages around 40 just browsing the web; a 14900K does that at below 10 watts.
 
  • Like
Reactions: bit_user
That's a nice graph, haven't seen it before. Sadly it doesn't have 14th gen, but it's okay.
Thanks! It's nice to hear some positive feedback, since I probably spent a couple hours on it.
: )

I can only work with the data available to me. If you find similar data on Gen 14, please let me know!

If I were them, I'd be automating these plots. It shouldn't be hard, given what they already did with their interactive bar charts. That way, we could view each of their metrics like this. The single metric I plotted is just the overall figure (average?).
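For what it's worth, here is a rough sketch of what automating such a plot could look like; the data structure and every number in it are placeholders, not ComputerBase's data or tooling:

```python
# Sketch of auto-generating an efficiency-vs-power-limit plot.
# All scores below are placeholder values, not real measurements.
import matplotlib.pyplot as plt

# {cpu_name: [(power_limit_watts, benchmark_score), ...]}
results = {
    "CPU A": [(65, 100), (125, 140), (180, 160), (250, 170)],
    "CPU B": [(65, 90), (125, 150), (180, 175), (250, 185)],
}

for cpu, points in results.items():
    watts = [w for w, _ in points]
    perf_per_watt = [score / w for w, score in points]
    plt.plot(watts, perf_per_watt, marker="o", label=cpu)

plt.xlabel("Power limit (W)")
plt.ylabel("Score per watt")
plt.title("Efficiency vs. power limit (placeholder data)")
plt.legend()
plt.show()
```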
 
  • Like
Reactions: TheHerald
Thanks! It's nice to hear some positive feedback, since I probably spent a couple hours on it.
: )

I can only work with the data available to me. If you find similar data on Gen 14, please let me know!

If I were them, I'd be automating these plots. It shouldn't be hard, given what they already did with their interactive bar charts. That way, we could view each of their metrics like this. The single metric I plotted is just the overall figure (average?).
It's not the best data, because as far as I'm aware their graphs don't measure power draw but power limits. These should obviously match for heavy workloads, but I think they are using some relatively light workloads in their test suite (Agisoft Photoscan?), and in those I assume power draw will be a lot lower than the power limit. Still, it's the best data we currently have.
 
  • Like
Reactions: bit_user
It's not the best data, because as far as I'm aware their graphs don't measure power draw but power limits.
Yes, but for multithreaded workloads, my experience tells me those CPUs will have no trouble saturating most of the limits. It's only when the lower core-count CPUs get into the higher-limit territory that they might under-utilize it.

I think they are using some relatively light workloads in their test suite (Agisoft Photoscan?)
That's not a light workload!! It's a photogrammetric 2D -> 3D model generator!
 
  • Like
Reactions: TheHerald
Yes, but for multithreaded workloads, my experience tells me those CPUs will have no trouble saturating most of the limits. It's only when the lower core-count CPUs get into the higher-limit territory that they might under-utilize it.


That's not a light workload!! It's a photogrammetric 2D -> 3D model generator!
Ah, cool, there are a lot of apps I don't know in their testing. I'll check that one out 😁
 
  • Like
Reactions: bit_user
Ah, cool, there are a lot of apps I don't know in their testing. I'll check that one out 😁
AnandTech used to have it in their test suite. I don't know why they dropped it - maybe license fees were an issue, or they simply wanted to streamline the test suite?

The last review it featured in was the i9-12900K one, but the Rocket Lake review had more details about it:

They do mention part of it is single-threaded, which I didn't realize. I wonder if that's still true of the version ComputerBase used.
 
AnandTech used to have it in their test suite. I don't know why they dropped it - maybe license fees were an issue, or they simply wanted to streamline the test suite?

The last review it featured in was the i9-12900K one, but the Rocket Lake review had more details about it:
[attached chart: AnandTech's Agisoft Photoscan benchmark results]

They do mention part of it is single-threaded, which I didn't realize. I wonder if that's still true of the version ComputerBase used.
It's sad that they stopped; tests like these are very educational for people who don't understand how efficiency works. This thread is a prime example: everyone is ditching the 14900KS when in fact it's going to be an efficiency monster.

Just for comparison's sake, the normal 14900K required 80 watts less than the 13900K at the same clock speeds. With both locked to 5.5 GHz, the 13900K was struggling and hitting the thermal limit at 330 watts, while the 14900K was just chilling at 260-270 watts. The fact that you can push it to a gazillion watts isn't a negative, it's a positive. It's optional; you don't have to. The performance drop going from 400W all the way down to 125W is just about 12%. At a very reasonable 200W power limit it's around 7%.
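Taking those percentages at face value (they aren't verified anywhere in this thread), a quick back-of-the-envelope shows what they imply for performance per watt:

```python
# Relative perf/W at different power limits, using the figures quoted
# above: ~12% perf loss at 125 W and ~7% at 200 W vs. a 400 W baseline.
baseline_eff = 1.00 / 400   # relative performance per watt at 400 W

for watts, rel_perf in [(400, 1.00), (200, 0.93), (125, 0.88)]:
    eff = rel_perf / watts
    print(f"{watts:3d} W: perf {rel_perf:.2f}, "
          f"perf/W = {eff / baseline_eff:.1f}x the 400 W figure")
```

If the quoted numbers hold, the 125 W limit delivers nearly 3x the efficiency of the 400 W one.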
 
... everyone is ditching the 14900KS when in fact it's going to be an efficiency monster.

Just for comparison's sake, the normal 14900K required 80 watts less than the 13900K at the same clock speeds. With both locked to 5.5 GHz, the 13900K was struggling and hitting the thermal limit at 330 watts, while the 14900K was just chilling at 260-270 watts.
I'd love to see some good sources on this, if you've got any to share.
 
I'd love to see some good sources on this, if you've got any to share.
There are some people who have videos comparing a 13900K and the 13900KS. I can't be bothered to do the comparison right now, since I have the 12900K installed in my system; the other CPUs are sitting on the shelf 😁
 
And these numbers are based on what?!
OK, you take the 400W number from the article, but where does the 200W number come from? Or is that just an example?
The numbers were just an example, to make it easy to see that these charts are not measuring power efficiency but rather work done per watt.
 
If a CPU was 100% "power efficient" then it would generate no heat!
I don't know what you are missing or how else to spell this out.
You are majorly confused. Efficiency is the measure of work done per unit of energy spent.

In power supplies, which you are referring to, the same applies. The efficiency of a PSU is measured by the work done (that is, how much power it managed to convert) divided by how much power it used to do that work (how much it pulled from the plug).
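For concreteness, a minimal sketch of the distinction being drawn here, with purely illustrative numbers:

```python
# Efficiency as useful output divided by input, in the PSU sense above.
def efficiency(useful_output_watts, input_watts):
    return useful_output_watts / input_watts

# A PSU delivering 500 W to the system while pulling 555 W from the wall:
print(f"PSU efficiency: {efficiency(500, 555):.1%}")  # ~90.1%

# A CPU has no separate 'useful watts' figure to plug in (essentially all
# of its input power ends up as heat), so CPU efficiency is reported as
# work done per watt instead, e.g. a benchmark score over package power:
print(f"CPU efficiency: {1500 / 250:.1f} points per watt")
```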
 
  • Like
Reactions: bit_user
If a CPU was 100% "power efficient" then it would generate no heat!
I don't know what you are missing or how else to spell this out.
You are looking at it like an engineer looking at a car engine.
In a CPU, the effective power and the waste power are not distinguishable; you cannot measure them separately.
In an engine, the output is the work performed plus the excess heat, and the two are very different and easy to measure separately; in a CPU they are both the same unit, and you can only measure them all together.
 
  • Like
Reactions: Eximo
They are both efficiency. Two sides of the same coin.

Power consumed = heat output. What the performance-per-watt charts are measuring is the work done for the power consumed. A more efficient chip would do the same work with fewer watts. In terms of physics, all CPUs are the same. They are big resistors with 100% efficiency.
 
In terms of physics, all CPUs are the same. They are big resistors with 100% efficiency.
I think you mean they're 100% efficient at turning electrical energy into heat energy? If yes, then we're agreed.

An interesting aside: I once heard someone claim that CPUs' ability to destroy information is what makes them so inefficient. So, they designed (I'm not sure if this was purely hypothetical or if it was ever experimentally demonstrated) a CPU that would instead essentially act as a sorting machine, with the benefit being that it was supposedly much more efficient. I'm not sure how well that could work, in practice, since it seems like it would need an infinite amount of memory to use at scale.
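For what it's worth, that claim sounds like Landauer's principle and reversible computing: erasing one bit of information costs at least kT·ln(2) of energy, and reversible machines try to dodge that floor by never destroying information. A quick calculation of the floor follows; the op rate is a made-up round number for scale:

```python
import math

k_boltzmann = 1.380649e-23  # Boltzmann constant, J/K
temp_kelvin = 300.0

# Landauer limit: minimum energy to erase one bit at temperature T.
landauer_j_per_bit = k_boltzmann * temp_kelvin * math.log(2)
print(f"Landauer limit at 300 K: {landauer_j_per_bit:.2e} J/bit")

# Illustrative comparison with a hypothetical 250 W CPU doing 1e12 ops/s:
joules_per_op = 250.0 / 1e12
print(f"Toy CPU: {joules_per_op:.2e} J/op, "
      f"~{joules_per_op / landauer_j_per_bit:.0e}x the limit")
```

Today's chips sit many orders of magnitude above that theoretical floor, which is why the claim isn't crazy, even if a practical reversible CPU is another matter.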
 
I think you mean they're 100% efficient at turning electrical energy into heat energy? If yes, then we're agreed.

An interesting aside: I once heard someone claim that CPUs' ability to destroy information is what makes them so inefficient. So, they designed (I'm not sure if this was purely hypothetical or if it was ever experimentally demonstrated) a CPU that would instead essentially act as a sorting machine, with the benefit being that it was supposedly much more efficient. I'm not sure how well that could work, in practice, since it seems like it would need an infinite amount of memory to use at scale.

Yes. Pretty much all electronics and thermodynamics at its most simplistic.

Alternative wave functions being collapsed becoming the wasted energy? I think you can look at it like that. If an electron tries to go through a transistor and the observed result is the success, all other possibilities would be the failures, which are themselves represented by the overall resistance to electron flow. I'm not entirely certain how quantum electrodynamics resolves into classical physics; that's just a little beyond my comprehension.
 
  • Like
Reactions: bit_user
You are looking at it like an engineer looking at a car engine.
In a CPU, the effective power and the waste power are not distinguishable; you cannot measure them separately.
In an engine, the output is the work performed plus the excess heat, and the two are very different and easy to measure separately; in a CPU they are both the same unit, and you can only measure them all together.
Yes, I think it's clear I'm looking at efficiency from a different point of view here. I see it more like an electrical motor or toaster than a car engine, but same difference I guess.

The CPU is an electrical switch, or a series of switches; it is not a resistor or a space heater, even though my 13900K would definitely argue otherwise! Its primary function is to switch, not resist. Electrical resistance is part of every circuit and creates electrical losses.

If 50% of the electrical input power is converted to heat within a CPU due to electrical resistance, then it is at best only 50% power efficient.

Its primary function is to switch. Any amount of energy spent or wasted on internal resistance will be a net summed loss. So, for example: of 400 watts consumed, 50% goes to actual switching and the other 50% is wasted on resistance or other electrical losses. That means out of 400 watts consumed, only 200 watts goes into actual switching, and the other 200 watts is wasted in the form of heat.

Evidence of this has already been shown in some of the charts posted in this thread. When the CPU does almost the same amount of work at 65 watts as it does at 250 (for example), where do you think all that extra power is going? It's not going into work, but rather into wasted heat energy. And that absolutely must be taken into account when talking about power efficiency.

You guys are talking about work done per watt, one CPU compared to another. I have acknowledged this. What I'm talking about is power wasted for work done per watt.

I hope it's perfectly clear now what I'm looking at; I really don't know how to spell this out any better.
 
Yes, I think it's clear I'm looking at efficiency from a different point of view here. I see it more like an electrical motor or toaster than a car engine, but same difference I guess.

The CPU is an electrical switch, or a series of switches; it is not a resistor or a space heater, even though my 13900K would definitely argue otherwise! Its primary function is to switch, not resist. Electrical resistance is part of every circuit and creates electrical losses.

If 50% of the electrical input power is converted to heat within a CPU due to electrical resistance, then it is at best only 50% power efficient.

Its primary function is to switch. Any amount of energy spent or wasted on internal resistance will be a net summed loss. So, for example: of 400 watts consumed, 50% goes to actual switching and the other 50% is wasted on resistance or other electrical losses. That means out of 400 watts consumed, only 200 watts goes into actual switching, and the other 200 watts is wasted in the form of heat.

Evidence of this has already been shown in some of the charts posted in this thread. When the CPU does almost the same amount of work at 65 watts as it does at 250 (for example), where do you think all that extra power is going? It's not going into work, but rather into wasted heat energy. And that absolutely must be taken into account when talking about power efficiency.

You guys are talking about work done per watt, one CPU compared to another. I have acknowledged this. What I'm talking about is power wasted for work done per watt.

I hope it's perfectly clear now what I'm looking at; I really don't know how to spell this out any better.
This makes sense, but it's not measurable.
The closest you can get is by drawing an efficiency curve, but that is still built by measuring work done per watt at different power levels.
That's basically this part:
When the CPU does almost the same amount of work at 65 watts as it does at 250 (for example),
You cannot physically measure how much energy was used for switching and how much was waste; you have to look at performance per watt.
[attached chart: CPU efficiency curve]
 
You cannot physically measure how much energy was used for switching and how much was waste; you have to look at performance per watt.
Well, sure ya can. I mean, I cannot, and maybe you cannot, but some manufacturer somewhere sure can measure thermal losses. It's not that hard; all one needs to do is watch for the temperature change, either per transistor or per package. Anything above ambient temperature is a thermal loss. It may not be linear, but it is certainly measurable.

So the question then becomes: at what point does more power not realistically equal more work?

To brute-force a CPU to run 5% faster while consuming 50% more power is not a beneficial trade-off; it is a major loss in overall power efficiency.
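That last trade-off is easy to put in performance-per-watt terms:

```python
# 5% more performance for 50% more power, relative to stock.
base_perf, base_power = 1.00, 1.00
oc_perf, oc_power = 1.05, 1.50

ratio = (oc_perf / oc_power) / (base_perf / base_power)
print(f"perf/W after the push: {ratio:.0%} of stock")  # 70%, i.e. 30% worse
```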
 