News Intel Launches Arrow Lake Core Ultra 200S — big gains in productivity and power efficiency, but not in gaming

So, just for comparison's sake: how much do the 13900K/14900K(S) siblings use in it? Also, where are the testing conditions for that benchmark pass?
What difference does it make? I'm not comparing, I'm just stating facts. The 13900K was hitting 180 W in that game. The 14900KS I have no idea about, but it wouldn't surprise me if it went above 250 W, probably more.

It's also interesting that TPU has CP2077 using 85 W for the 7950X, not 150 W. So there's contradictory data, and I'll take TPU's numbers over that video's.

Now, to be fair to you, the 9950X actually does use more power: it averages around 100 W, and CP2077 consumes ~115 W.

https://www.techpowerup.com/review/amd-ryzen-9-9950x/23.html

Regards.
You are taking a graph made in Paint over actual in-game footage.
 
I didn't say that the 14900K doesn't go above 100 W in game at all, if you understand ENGLISH. I said that, IME, CPU-bottlenecked MSFS consumes ~100 W (fluctuating between the 90s and low 100s), which is why I question whether a 165 W reduction is even possible. Your reading is as flawed as always.
But they are not showing a 165w reduction in MSFS, so...?
 
But they are not showing a 165w reduction in MSFS, so...?
Well, that was a QUERY, and a legitimate one, about a game I actually played while logging power usage and temperatures. Running all P-cores at 5.7 GHz and all E-cores at 4.4 GHz, I saw it consume ~100 W. As anyone with an acceptable IQ knows, no CPU can draw close to 0 W and do the same job as a recent chip using 100 W, so there is very close to zero chance of a 165 W reduction, especially since the 14900K was capped at 253 W. So why isn't that valid? Even if you substitute the 100 W with your The Last of Us figure of 200 W, the question is still valid.
 
Are you asking this seriously? As I asked: where are the testing conditions? For all we know, that 7950X was running PBO or some heavy OC underneath. I tried to find the specifics, but could not find them.

Hell, you yourself quote TPU plenty. Double standard much?

Regards.
I'm quoting TPU because people, just as you demonstrated, trust it. TPU has lots of errors in lots of their data. Big errors, as big as 70%.

Yes, I'm asking seriously. Why would you trust a graph made in Paint over live footage? TPU is testing a completely different, lightweight area, as is apparent from the framerate. The guy in the video is testing Tom's Diner, literally the heaviest area of the game. So the power-draw differences make sense.
 
Well, that was a QUERY, and a legitimate one, about a game I actually played while logging power usage and temperatures. Running all P-cores at 5.7 GHz and all E-cores at 4.4 GHz, I saw it consume ~100 W. As anyone with an acceptable IQ knows, no CPU can draw close to 0 W and do the same job as a recent chip using 100 W, so there is very close to zero chance of a 165 W reduction, especially since the 14900K was capped at 253 W. So why isn't that valid? Even if you substitute the 100 W with your The Last of Us figure of 200 W, the question is still valid.
The 165 W reduction is supposedly in the new Warhammer game. They might have measured it during shader compilation, because yeah, in game it doesn't make sense to go from 200 W to 35 W.
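The arithmetic both sides are arguing over is simple enough to sketch (a minimal illustration using only the wattage figures quoted in this thread; nothing here is measured data):

```python
# Sanity-check the claimed 165 W reduction against different baseline draws.
# All wattage figures are the ones quoted in the thread, not measurements.
CLAIMED_REDUCTION_W = 165

for baseline_w in (100, 200, 253):
    resulting_w = baseline_w - CLAIMED_REDUCTION_W
    verdict = "possible" if resulting_w > 0 else "impossible (below 0 W)"
    print(f"baseline {baseline_w:3d} W -> {resulting_w:4d} W: {verdict}")
```

From a ~100 W baseline the claimed reduction is arithmetically impossible, while from the ~200 W shader-compilation case it lands at a plausible 35 W.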
 
How can a CPU offer 165 W lower system power in gaming? My 8700K/5900X/7700X chips have never used anywhere close to 100 W in gaming. Do modern Intel chips use hundreds of watts in gaming?
They can, depending on the game, and Space Marine 2 had some absolutely brutal CPU usage when it launched (I don't know if it still does).

From TPU's 14900K review:
[Attached chart: power-per-game.png]

https://www.techpowerup.com/review/intel-core-i9-14900k/22.html
 
Fanless is a nice idea, but expensive to really do right. I tried to build a fanless Alder Lake-N PC, but I quickly relented and incorporated a 140 mm fan, because I hated the idea of it bumping along at 95 C and thermal throttling, which is exactly what happened. That's with just a tiny 12 W / 25 W SoC, too. A vertical case and a big copper heatsink weren't enough for passive cooling.

For DIY, my advice is simply to go for whisper quiet. To get a good fanless PC, you either need to be a resourceful modder or pay the big bucks for a specialty case or prebuilt.
I've been running a nice whisper-quiet box for like omg ten years now, but when it goes turbo or when the graphics go turbo, secondary fans turn on, or something.

But Intel promised fanless just as soon as they released their 10nm, and I'm still waiting!

Thanks for the real-life report, but these new chips, especially the mobile, are supposed to be so cool, right?

Most of my use is really low power, when I hear the fans I know something is wrong, LOL.
 
I've been running a nice whisper-quiet box for like omg ten years now, but when it goes turbo or when the graphics go turbo, secondary fans turn on, or something.

But Intel promised fanless just as soon as they released their 10nm, and I'm still waiting!

Thanks for the real-life report, but these new chips, especially the mobile, are supposed to be so cool, right?

Most of my use is really low power, when I hear the fans I know something is wrong, LOL.
Tbh, unless you use it in a big case or in open air with the big Noctua fanless heatsink, the low-power mobile chips likely can't be truly fanless IMO. The heat (power) density of modern chips is so high that the days of the bare 8086 chip are long gone.
 
I've been running a nice whisper-quiet box for like omg ten years now, but when it goes turbo or when the graphics go turbo, secondary fans turn on, or something.

But Intel promised fanless just as soon as they released their 10nm, and I'm still waiting!

Thanks for the real-life report, but these new chips, especially the mobile, are supposed to be so cool, right?

Most of my use is really low power, when I hear the fans I know something is wrong, LOL.
If you get a case from Streacom/Akasa you can do passive cooling to a decent degree (45 W / 65 W / 95 W are all on the table pretty easily), but as @bit_user said, you have to pay a premium price, so you'd likely be looking at $150-400 just for the case. It's never going to be a good fiscal choice, but it certainly fits the passive niche. Most of the mini PCs are based on mobile platforms, but you still have to keep the maximum power consumption in mind, as mobile parts can easily get up to around 100 W for AMD and well over 100 W for Intel.
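Those passive power ceilings follow from a simple steady-state model: die temperature is ambient plus power times the heatsink's thermal resistance. A rough sketch, where the 0.7 C/W thermal resistance is an assumed value for a large passive case, not a Streacom/Akasa spec:

```python
# Rough steady-state model for passive cooling:
# die temperature = ambient + power * thermal resistance (C per W).
# The 0.7 C/W figure is an assumed value for a big passive case,
# not a manufacturer spec.
AMBIENT_C = 25.0
R_THETA_C_PER_W = 0.7   # assumed case + heatsink thermal resistance
T_THROTTLE_C = 95.0

def die_temp_c(power_w: float) -> float:
    """Steady-state die temperature at a given sustained power draw."""
    return AMBIENT_C + power_w * R_THETA_C_PER_W

# Highest power the assumed case can shed before hitting the throttle point.
max_passive_w = (T_THROTTLE_C - AMBIENT_C) / R_THETA_C_PER_W
print(f"max sustainable: {max_passive_w:.0f} W")
for p in (25, 45, 65, 95):
    print(f"{p:3d} W -> {die_temp_c(p):.1f} C")
```

Under these assumptions a 95 W part still sits comfortably under the throttle point, which is consistent with the 45/65/95 W range those cases advertise; a worse (higher) thermal resistance shrinks the budget quickly.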
 
Jesus H Christ! How do they make a core with 18 ALUs/execution ports that can't outperform Zen 5 with 8 ALUs/execution ports? Every single structure is bigger than Zen 5's, with more storage too. What a flop.
 
Jesus H Christ! How do they make a core with 18 ALUs/execution ports that can't outperform Zen 5 with 8 ALUs/execution ports? Every single structure is bigger than Zen 5's, with more storage too. What a flop.
Since at least Tremont (introduced 5 years ago), the E-cores have all had far more issue ports than the decoder can sustain. I think it just worked out that splitting them ended up being cheap, so they probably did it to avoid bottlenecking if the instruction mix was too biased towards a certain subset of instructions.

Regarding Skymont:

To execute instructions, Skymont gets a massive 26 execution ports. Intel explains that by saying that dedicated functionality on each port is better for energy efficiency. ... As for why 8 ALUs, Stephen said “It was cheap to add in terms of area and it helps with the peaks.” The peaks refers to when you have code that has a lot of instructions to crunch through.

Source: https://chipsandcheese.com/p/intel-details-skymont
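The decode-versus-issue-port point can be reduced to a toy model: steady-state throughput is capped by the narrowest pipeline stage, so extra issue ports only pay off in bursts. The stage widths here are illustrative, not Skymont's real numbers:

```python
# Toy pipeline model: sustained throughput is capped by the narrowest stage,
# so a core can carry many more issue ports than it can feed every cycle.
# Stage widths below are illustrative, not Skymont's actual numbers.
def sustained_ipc(decode_width: int, issue_ports: int, retire_width: int) -> int:
    """Steady-state instructions per cycle, bounded by the narrowest stage."""
    return min(decode_width, issue_ports, retire_width)

# Extra issue ports don't lift the steady-state ceiling (decode-bound here)...
print(sustained_ipc(decode_width=9, issue_ports=26, retire_width=16))
# ...but they let queued micro-ops drain in bursts when the instruction mix
# piles onto one port type -- "it helps with the peaks".
```

In this model, going from 16 to 26 issue ports changes nothing about the steady state; what it buys is headroom when the scheduler's queues fill with a lopsided instruction mix.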
 
Allegedly there will be, but it won't be a big uplift (think RPL to RPL-R, not ADL to RPL).
I doubt there's any difference this time; it was supposed to be Meteor Lake as 14th gen and Arrow Lake as 15th. Unless they have big issues with the next gen being uncompetitive again, it's likely the LGA 1851 platform will be ended.
 
Since at least Tremont (introduced 5 years ago), the E-cores have all had far more issue ports than the decoder can sustain. I think it just worked out that splitting them ended up being cheap, so they probably did it to avoid bottlenecking if the instruction mix was too biased towards a certain subset of instructions.
[Attached slide: Tremont - Stephen Robinson - Linley - Final, page 3]

Regarding Skymont:
To execute instructions, Skymont gets a massive 26 execution ports. Intel explains that by saying that dedicated functionality on each port is better for energy efficiency. ... As for why 8 ALUs, Stephen said “It was cheap to add in terms of area and it helps with the peaks.” The peaks refers to when you have code that has a lot of instructions to crunch through.
I’ve read all the chipsandcheese stuff. I wish he’d made it clearer just how many discrete ALUs Lion Cove has per core, instead of just the number of execution ports.