News: Intel Launches Arrow Lake Core Ultra 200S — big gains in productivity and power efficiency, but not in gaming

From some leaked MSI benchmarks, it seems to edge out the 9950X in R23, with power unlocked

https://cdn.videocardz.com/1/2024/10/CORE-ULTRA-200-ARROW-LAKE-1.jpg

https://cdn.videocardz.com/1/2024/10/CORE-ULTRA-200-ARROW-LAKE-4.jpg

While getting quite a bit cooler despite drinking some 303W of CPU package power.
I still remain skeptical that the core temps will be that much lower. I'm very eager to know where the temp sensor is for the package.

Also, why do the two bars differ for each CPU? Is one of them based on electrical measurement, while the other is CPU self-reporting?
 
Dude, @YSCCC said those are power-unlocked measurements. Did Intel's slide deck say anything about that?
The slide I posted on the first page has the 285K being faster than the 9950X (although it's CBR24) across the whole power range up to 250W. According to the MSI slide, it looks like the 285K will roughly match (or might even lose to) the 9950X at the same power, since it's barely ahead at 300W+. Unless it has very terrible scaling, or CBR24 gives vastly different results, either Intel or MSI is wrong.
 
I still remain skeptical that the core temps will be that much lower. I'm very eager to know where the temp sensor is for the package.

Also, why do the two bars differ for each CPU? Is one of them based on electrical measurement, while the other is CPU self-reporting?
For temperature, maybe it's the resistance or voltage being lower? Like, if it doesn't need north of 1.4V but only, say, 1.2V, it could drop some 10-20C on a normal cooler, I guess.

If my understanding is correct, one is the reported usage and the other is a reading from the 12V power connector. Either way, I'd take the lower one, which supposedly doesn't count the power lost in the VRM stage.
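
If that's right, a rough way to relate the two bars would be something like this (the ~90% VRM efficiency is just my assumption for illustration, not a number from any slide):

```python
# Rough sketch of how the two power bars could relate (the ~90% VRM
# efficiency is an assumption for illustration, not a figure from the slides).
eps_12v_power = 303.0        # W, measured at the 12V EPS connector (MSI slide figure)
vrm_efficiency = 0.90        # assumed VRM conversion efficiency

package_power = eps_12v_power * vrm_efficiency   # power actually delivered to the CPU
vrm_loss = eps_12v_power - package_power         # heat dumped into the VRM, not the CPU

print(f"Estimated package power: {package_power:.0f} W")   # ~273 W
print(f"Estimated VRM loss:      {vrm_loss:.0f} W")        # ~30 W
```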
 
The slide I posted on the first page has the 285K being faster than the 9950X (although it's CBR24) across the whole power range up to 250W. According to the MSI slide, it looks like the 285K will roughly match (or might even lose to) the 9950X at the same power, since it's barely ahead at 300W+. Unless it has very terrible scaling, or CBR24 gives vastly different results, either Intel or MSI is wrong.
IIRC, in der8auer's video (or someone else's, I can't recall definitively), the Intel slides use the old gen and the 9950X with much slower RAM than they use on ARL, and they didn't specify what cooling each setup used. If you put a box cooler on the AM4 socket, you'll get a ton less boost clock vs. a watercooled ARL, just to say. That's why independent reviews will reveal the truth.
 
For temperature, maybe it's the resistance or voltage being lower? Like, if it doesn't need north of 1.4V but only, say, 1.2V, it could drop some 10-20C on a normal cooler, I guess.

If my understanding is correct, one is the reported usage and the other is a reading from the 12V power connector. Either way, I'd take the lower one, which supposedly doesn't count the power lost in the VRM stage.
No, voltage doesn't increase temps. It's power.


The temperatures could make sense, since the E-cores are much larger and consume a lot more power than the previous E-cores, which also means the P-cores draw a lot less (especially since HT is also removed), so the heat is distributed a lot better, more equally across the whole die. In ADL and RPL it was quite common for the E-cores to be sitting at 70C while the P-cores went up to 90+; now it looks like a much more even spread where all cores will be running at ~80.
 
The temperatures could make sense, since the E-cores are much larger and consume a lot more power than the previous E-cores, which also means the P-cores draw a lot less (especially since HT is also removed),
The Skymont E-cores appear to be roughly 33.2% as big as the Lion Cove P-cores, in Lunar Lake (excl. L2). With Alder Lake, the E-cores were about 29.6% as big as Golden Cove (excl. L2).

If you include L2, the ratio goes up to 38.1%. With Alder Lake, the ratio is 28.8% E-core to P-core, including (amortized) L2. I didn't find die measurements for Raptor Lake, but I did some back-of-the-envelope math to estimate the ratio at 31.7% (based on an estimated E-core cluster size of 10.8 mm^2 and an estimated Raptor Cove size of 8.52 mm^2).
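
For anyone who wants to check that last back-of-the-envelope number, the arithmetic is just this (both die areas are the rough estimates quoted above, not official Intel figures):

```python
# Back-of-the-envelope Raptor Lake E-core vs P-core area ratio, using the
# rough estimates quoted above (not official die measurements).
ecore_cluster_mm2 = 10.8     # estimated 4-core E-core cluster, incl. shared L2
raptor_cove_mm2 = 8.52       # estimated single Raptor Cove P-core, incl. L2

ecore_mm2 = ecore_cluster_mm2 / 4      # amortize the cluster area across its 4 cores
ratio = ecore_mm2 / raptor_cove_mm2    # E-core size relative to a P-core

print(f"E-core / P-core area ratio: {ratio:.1%}")   # ~31.7%
```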

The point is that they're bigger but it's not a night-and-day difference and probably not enough to counteract the effect of the node shrink. I expect both core types will be smaller on the new node, in absolute terms.

so the heat is distributed a lot better, more equally across the whole die.
It's not the whole die, though. All of the cores are packed together in a chiplet situated at one edge of the package.
 
The Skymont E-cores appear to be roughly 33.2% as big as the Lion Cove P-cores, in Lunar Lake (excl. L2). With Alder Lake, the E-cores were about 29.6% as big as Golden Cove (excl. L2).

If you include L2, the ratio goes up to 38.1%. With Alder Lake, the ratio is 28.8% E-core to P-core, including (amortized) L2. I didn't find die measurements for Raptor Lake, but the sizes with L2 would definitely push the ratio higher.

The point is that they're bigger but it's not a night-and-day difference and probably not enough to counteract the effect of the node shrink. I expect both core types will be smaller on the new node, in absolute terms.


It's not the whole die, though. All of the cores are packed together in a chiplet situated at one edge of the package.
Well, my point is: if you are running ADL at 250W, 50W of that is used by the E-cores and 200W by the P-cores, so those P-cores will be hitting 90+ while the E-cores will be chilling at 70. On the 285K I can easily see the distribution being 125W for the E-cores and 125W for the P-cores, or something similar.

You have to take into account that Alder Lake E-cores were also running at a lot lower clock speeds.
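
Just to put rough per-core numbers on that split (the 50/200W and 125/125W figures are the conjecture above; the core counts assume a 12900K-style 8P+8E for ADL and 8P+16E for the 285K):

```python
# Hypothetical per-core power under the splits conjectured above
# (the wattage splits are speculation, not measurements).
configs = {
    "ADL (8P+8E, 200W/50W split)":    {"p_cores": 8, "e_cores": 8,  "p_watts": 200, "e_watts": 50},
    "285K (8P+16E, 125W/125W split)": {"p_cores": 8, "e_cores": 16, "p_watts": 125, "e_watts": 125},
}

for name, c in configs.items():
    p_per_core = c["p_watts"] / c["p_cores"]
    e_per_core = c["e_watts"] / c["e_cores"]
    print(f"{name}: ~{p_per_core:.1f} W per P-core, ~{e_per_core:.1f} W per E-core")

# ADL (8P+8E, 200W/50W split):    ~25.0 W per P-core, ~6.2 W per E-core
# 285K (8P+16E, 125W/125W split): ~15.6 W per P-core, ~7.8 W per E-core
```

If the split really evens out like that, the per-P-core hotspot power drops by roughly a third, which would line up with a more even temperature spread.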
 
IIRC, in der8auer's video (or someone else's, I can't recall definitively), the Intel slides use the old gen and the 9950X with much slower RAM than they use on ARL, and they didn't specify what cooling each setup used. If you put a box cooler on the AM4 socket, you'll get a ton less boost clock vs. a watercooled ARL, just to say. That's why independent reviews will reveal the truth.
Different speeds (5600 vs 6400) due to using the officially rated speed for each, but equal-latency memory:
https://edc.intel.com/content/www/us/en/products/performance/benchmarks/desktop_2/

Steve from HUB mentioned that due to all of the memory being low latency he didn't expect there to be more than maybe a 1% performance difference overall versus using DDR5-6000 CL30 as AMD recommends.

They don't list the cooler used in the testing, but there's a slide talking about gaming temperatures (ARL vs RPL) which refers to a 360mm AIO. I can't imagine they would have purposely held the AMD comparison back here, given how AMD got raked over the coals for their misleading/false marketing slides. The guys from HUB also mentioned that during the press talk they got, Intel said they expect reviewers to see similar results to theirs.

These Intel marketing slides really just strike me as the company attempting to be more transparent due to the market circumstances, as opposed to the super-obfuscated cheerleading they normally do.
 
Well, my point is: if you are running ADL at 250W, 50W of that is used by the E-cores and 200W by the P-cores, so those P-cores will be hitting 90+ while the E-cores will be chilling at 70. On the 285K I can easily see the distribution being 125W for the E-cores and 125W for the P-cores, or something similar.
I understand your conjecture. I'm not directly refuting it - just trying to inject more facts into the discussion. Part of the reason I even posted that was just to motivate myself to work through the numbers. I think it's insightful to have this data.

I don't really care what anyone says, since this is speculation piled on top of leaks. I will wait until I see some independent measurements.
 
Intel has always been transparent with their slides. Actually, reviewers found better numbers than Intel's slides. For example, Intel's slide for the 13900K had it on par with the 5800X3D in gaming; third-party reviews found the 13900K to be up to 20% faster (TPU, HUB).

In the current slides it also seems like Intel isn't showing their superior content-creation performance from their iGPU codec support. They have the 285K on par with the 7950X3D, which I'm pretty sure will not be anywhere near it in Adobe's suite.
 
I understand your conjecture. I'm not directly refuting it - just trying to inject more facts into the discussion.

I don't really care what anyone says, since this is speculation piled on top of leaks. I will wait until I see some independent measurements.
To be frank, none of this matters anyway. Temps were an issue on RPL only for reviewers like HUB that went into the BIOS, removed all power and amp limits, and then looped Blender and Cinebench. For normal people, temps are not an issue. Nobody is running Blender professionally at 350 watts.
 
Different speeds (5600 vs 6400) due to using the officially rated speed for each, but equal-latency memory:
https://edc.intel.com/content/www/us/en/products/performance/benchmarks/desktop_2/

Steve from HUB mentioned that due to all of the memory being low latency he didn't expect there to be more than maybe a 1% performance difference overall versus using DDR5-6000 CL30 as AMD recommends.

They don't list the cooler used in the testing, but there's a slide talking about gaming temperatures (ARL vs RPL) which refers to a 360mm AIO. I can't imagine they would have purposely held the AMD comparison back here, given how AMD got raked over the coals for their misleading/false marketing slides. The guys from HUB also mentioned that during the press talk they got, Intel said they expect reviewers to see similar results to theirs.

These Intel marketing slides really just strike me as the company attempting to be more transparent due to the market circumstances, as opposed to the super-obfuscated cheerleading they normally do.
These are all speculations before any public reviews are out, and then we will know how transparent or cherry-picked the slides are. But TBF, the MSI slide only showed R23, and for R23 alone I've tried it on my own 14900K system: with the same undervolt, and PL2 set to 270W instead of 253W so the all-core frequency in R23 stays at 5.4GHz, just changing the RAM from DDR5-6000 CL32-38-38-96 to CL30-38-38-96 goes from 39300 to 39900 points on a 5-run average, and a 5600 CL40 kit drops it to the mid-38k range. So RAM alone could make quite a bit of difference in R23.
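
Putting percentages on those runs (my own anecdotal numbers from above; the "mid 38k" figure is taken as roughly 38500, which is my rounding, not a measured average):

```python
# Percentage deltas for the R23 runs quoted above (anecdotal, 5-run averages;
# the 5600 CL40 score is approximated as 38500 from "mid 38k").
cl32_score = 39300   # DDR5-6000 CL32-38-38-96
cl30_score = 39900   # DDR5-6000 CL30-38-38-96
cl40_score = 38500   # DDR5-5600 CL40 kit (approximate)

print(f"CL32 -> CL30:      {cl30_score / cl32_score - 1:+.1%}")   # about +1.5%
print(f"CL30 -> 5600 CL40: {cl40_score / cl30_score - 1:+.1%}")   # about -3.5%
```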

One positive thing about internet reviews is that we can see how different setups perform in a variety of use cases, but the MSI one is quite interesting since they present with the unlimited-power OC settings, and I can't see why, in their new product press, they would want to paint Intel with a bad brush. I'd say let the reviews and time tell the story.
 
Just a reminder: the temperature a CPU/GPU reports is the result of how AMD/Intel/nVidia decide to present it, and not necessarily how it really is. It is nearly impossible to measure temperature directly, so they just make a best guess. That's what AMD did going from Zen 4 to Zen 5, mainly.

Regards.
 
Just a reminder: the temperature a CPU/GPU reports is the result of how AMD/Intel/nVidia decide to present it, and not necessarily how it really is. It is nearly impossible to measure temperature directly, so they just make a best guess. That's what AMD did going from Zen 4 to Zen 5, mainly.
As far as what the CPU self-reports? Sure. Many motherboards have included a temperature sensor underneath the CPU, so it's not as if there's no way to get any visibility. There will be some thermal gradient between it and the onboard package temperature sensor, but I'd expect the ratio between them wouldn't change much, from one generation to the next. If the internal temperature of a new CPU suddenly said it was a lot less hot than you'd expect from the motherboard's probe, then you'd have to be suspicious they started fudging the reported data.

Also, if the CPU started throttling at a lower temperature, that would be further evidence that the reported temperatures are fudged. Interestingly, AMD increased the throttling threshold between Zen 4 and Zen 5.

Finally, one could get a read on the IHS temperature by cutting a channel into the base of a heatsink or waterblock and embedding a temperature probe in it. Shouldn't be that hard.

FWIW, I think temperatures don't matter so much, in the absolute sense. What really counts is power consumption and throttling. Overall heat output is going to equal power consumption. Throttling is the thing that actually limits performance. Temperatures are just a metric we use to try and optimize cooling setups. To that end, where they really count is comparing temperatures from the same CPU, in which case all you care about is relative accuracy between the two different measurements.
 
"Intel says its new flagship Core Ultra 9 285K matches AMD’s flagship Ryen 9 9950X". I didn't know Intel had a time-machine. 9950X hasn't been released yet. There is no way Intel has this CPU to test against. Let's just wait for benchmarks, shall we?
 
As far as what the CPU self-reports? Sure. Many motherboards have included a temperature sensor underneath the CPU, so it's not as if there's no way to get any visibility. There will be some thermal gradient between it and the onboard package temperature sensor, but I'd expect the ratio between them wouldn't change much, from one generation to the next. If the internal temperature of a new CPU suddenly said it was a lot less hot than you'd expect from the motherboard's probe, then you'd have to be suspicious they started fudging the reported data.

Also, if the CPU started throttling at a lower temperature, that would be further evidence that the reported temperatures are fudged. Interestingly, AMD increased the throttling threshold between Zen 4 and Zen 5.

Finally, one could get a read on the IHS temperature by cutting a channel into the base of a heatsink or waterblock and embedding a temperature probe in it. Shouldn't be that hard.

FWIW, I think temperatures don't matter so much, in the absolute sense. What really counts is power consumption and throttling. Overall heat output is going to equal power consumption. Throttling is the thing that actually limits performance. Temperatures are just a metric we use to try and optimize cooling setups. To that end, where they really count is comparing temperatures from the same CPU, in which case all you care about is relative accuracy between the two different measurements.
Exactly, thermal throttling is what matters. Reported temps are meaningless
 
Exactly, thermal throttling is what matters. Reported temps are meaningless
Technically and figuratively, that's not correct.

They are very meaningful, because there are plenty of things which are decided based on what is reported. Case in point: the throttling point is decided based on how it is reported, unless Intel, AMD and nVidia have an alternative way of deciding that which is not exposed via motherboard controls.

Also, I just dislike seeing my CPU running at 95°C, no matter how "safe" any of them tells me it is. That's on me, but it is relevant in its own right. Maybe they all need to be better at communicating expectations going forward, given the 30+ year history of how (somewhat modern) CPUs have behaved.

Regards.
 
Technically and figuratively, that's not correct.

They are very meaningful, because there are plenty of things which are decided based on what is reported. Case in point: the throttling point is decided based on how it is reported, unless Intel, AMD and nVidia have an alternative way of deciding that which is not exposed via motherboard controls.

Also, I just dislike seeing my CPU running at 95°C, no matter how "safe" any of them tells me it is. That's on me, but it is relevant in its own right. Maybe they all need to be better at communicating expectations going forward, given the 30+ year history of how (somewhat modern) CPUs have behaved.

Regards.
We are not saying anything different. If the CPU thermal throttles based on reported temps, then I'm sure both AMD and Intel will make sure that the reported temps are correct, in which case the temps themselves don't matter; the thermal throttling is the issue. If the CPU doesn't thermal throttle based on reported temps, but on some other hidden sensor, then reported temps don't matter anyway.


I don't get the obsession people have over temps. Even the hilariously-bad-at-thermal-transfer AMD chips run relatively cool if you just limit them to normal power draws. A 7950X running at 150-180W will be as cool as a cucumber at like 70C, even under heavy workloads. It's not a sin to power limit a chip; just because companies decide to push 250+ watts to sell you an overpriced mobo and cooler doesn't mean you have to stick to that.
 
We are not saying anything different. If the CPU thermal throttles based on reported temps, then I'm sure both AMD and Intel will make sure that the reported temps are correct, in which case the temps themselves don't matter; the thermal throttling is the issue. If the CPU doesn't thermal throttle based on reported temps, but on some other hidden sensor, then reported temps don't matter anyway.


I don't get the obsession people have over temps. Even the hilariously-bad-at-thermal-transfer AMD chips run relatively cool if you just limit them to normal power draws. A 7950X running at 150-180W will be as cool as a cucumber at like 70C, even under heavy workloads. It's not a sin to power limit a chip; just because companies decide to push 250+ watts to sell you an overpriced mobo and cooler doesn't mean you have to stick to that.
Probably this hinges on a technicality, but CPU throttling is akin to "CPU bad, CPU struggling". A CPU that is throttling means it needs more cooling (extract the heat from it quicker, if you wanna go there). And, more to the point, accurate temperature readings let you optimize a CPU's voltage curve better, since temperatures do have an impact on conductivity at the higher end of the speeds they're meant to run at. See XOC with LN2.

As for the obsession. Yes, you can call it that. I call it "conditioning".

Regards.