News Intel Launches Arrow Lake Core Ultra 200S — big gains in productivity and power efficiency, but not in gaming

Anyway, if the slides are true, Intel will now be more efficient across the whole segment at iso-power. They were already leading heavily in the i5 and i7 range, but now it seems they will have the lead on the high end as well. Chadmont and TSMC seem incredible.
 
What I'm hearing, as a mixed-use prosumer who happens to game some, is that my 7950X3D is in a great position to keep me happy until after socket LGA 1851 (assuming it's one-and-done, as rumored) and Zen 6, at the bare minimum. My wallet likes this, but the build geek in me doesn't. I just hope we're not in the middle of another Sandy Bridge lull where my CPU platform is good enough for a decade and change of use. I guess we'll see just how dead Moore's Law really is.
 
Me, I want a fanless, silent workstation. Might even go for one of those tiny boxes.
Fanless is a nice idea, but it's expensive to really do right. I tried to build a fanless Alder Lake-N PC, but I quickly relented and had to incorporate a 140 mm fan because I hated the idea of it bumping along at 95 °C and thermal throttling, which is exactly what happened. That's just with a tiny 12 W / 25 W SoC, too. A vertical case and a big copper heatsink weren't enough for passive cooling.

For DIY, my advice is simply to go for whisper quiet. To get a good fanless PC, you either need to be a resourceful modder or pay the big bucks for a specialty case or prebuilt.
 
This is one freaking bold graph from Intel.

[attached Intel slide image]

I find it hard to believe, but I also find it hard to believe that they'd make this up since it's very specific and easily falsified.
Like Mattzun said, this is going to be a big improvement for OEMs, particularly business OEMs. For gamers who build their own systems and then overclock, we will have to wait for reviews, particularly reviews of the tweaks that reduce system latency.
Meteor Lake-S was supposed to be the first generation to use this socket. That got cancelled and instead we got Raptor Lake Refresh.


What happened before was that the Broadwell desktop CPUs were pretty much cancelled. However, those were supposed to be the second generation on LGA 1150. So, they released a "Haswell Refresh" that also used that socket, but it was basically the same CPU.

In this case, it would be as if Haswell were the one to get cancelled and the Broadwell desktop CPUs ended up being the first generation on LGA 1150. But then Skylake was already planned to use LGA 1151, and they just stuck with that plan, leaving LGA 1150 as a single-generation socket.
Broadwell did come out a year after Devil's Canyon and just a quarter before Skylake, but it seemed to have fallen victim to reviewers who only reported its productivity, efficiency, and iGPU performance and neglected to report its dGPU gaming performance.
It was the archetype for the X3D, and it took the 8700K and the third Ryzen generation to beat it in gaming. Imagine if no reviews mentioned the X3D's dGPU gaming performance and focused on everything else. That is exactly what happened to Broadwell. Many early X3D gaming reviews even used many of the same games that favored Broadwell ( https://www.anandtech.com/show/1619...ective-review-in-2020-is-edram-still-worth-it ) to show how well the X3D was doing.

Broadwells were out there, but reviewers told everyone the iGPU and eDRAM weren't worth the extra $30 or so unless you weren't planning on getting a dGPU. I also have one running in my office, and while it is considerably faster than my daughter's 4770K in games, it isn't nearly as fast as modern CPUs.

Edit: For mobile and OEMs (LNL and ARL), this looks like a very good generational increase from Intel, among the best ever at reasonable power levels. It's not the greatest for the vastly smaller segment of DIYers who don't overclock, so I expect the reviews and discussions to focus nearly exclusively on that worst-case scenario, while the other 95% will forget that their old CPU took twice as long to do the same things.

If Arrow Lake came on the LGA 1700 socket, I would probably get one for my Z690 Prime P to replace my 13900KF. But it doesn't, so I will wind up saving $600 on no noticeable improvement to my 4K gaming. Thank you, Intel, for needing a new socket.
 
How can a CPU offer 165 W lower system power in gaming? My 8700K/5900X/7700X chips have never used anywhere close to 100 W in gaming. Do modern Intel chips use hundreds of watts in gaming?
 
Haswell was still two generations. You can argue it was a refresh, but that's generally my point: no matter what gets cancelled and what doesn't, it's highly unlikely we will only see one generation on the socket.


According to AnandTech's review, the 4790K was ~12-14% faster than the 4770K. That's a bigger jump than... uhm, oh well, Zen 4 to Zen 5 😱
Well at least it didn’t lose performance like Arrow Lake😱
 
How can a CPU offer 165 W lower system power in gaming? My 8700K/5900X/7700X chips have never used anywhere close to 100 W in gaming. Do modern Intel chips use hundreds of watts in gaming?
Good question... my 14900K during flight simulation (a CPU bottleneck) consumes ~100 W according to HWiNFO, and only runs up to the 253 W limit during AVX workloads like CB R23... My initial impression was that the GPU eats up most of the power consumption, but wait, if the GPU works just as hard in both setups, how can you get below the power draw of the comparison CPU?...
 
I think the Z890 platform is more interesting than Arrow Lake as a CPU; its level of connectivity is amazing, and AMD needs to catch up on that front. Also, the efficiency gains are good, but I don't think they'll be enough to match AMD in all aspects. Not having HT will definitely hurt, but it's an intended trade-off.

And don't worry. This will be a "two generation" (at least) platform: they'll just add a 3 to the numbering, so you'll get an ICU9-385K and maybe a 365K with two more enabled E-cores that completely justify the new numbering. No snake oil at all and totally different CPUs. Totally.

Regards.
 
How can a CPU offer 165 W lower system power in gaming? My 8700K/5900X/7700X chips have never used anywhere close to 100 W in gaming. Do modern Intel chips use hundreds of watts in gaming?
Reviewers use unrealistic scenarios like 720p gaming on a 4090 with uncapped framerates to report gaming power consumption. That's unrepresentative of normal gaming like what we do.
Sometimes, when I want to consume less power and lose no performance in my 60 fps 4K gaming (I also just have a 3080), I underclock via the Windows power plan options and consume 20-40 W on my 13900KF.
But Arrow Lake will be helpful for those who game on PCs they bought at Best Buy or Walmart, as those will likely perform better out of the box than anything else.
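For anyone who wants to try the same thing, here's a minimal sketch of scripting that power-plan cap instead of clicking through the settings UI. Assumptions: Windows, the stock powercfg utility, and 70% as a purely illustrative value rather than the exact figure I use.

```python
import subprocess

def cap_max_processor_state(percent: int) -> None:
    """Cap Windows' 'maximum processor state' for the active power plan."""
    # PROCTHROTTLEMAX is the documented powercfg alias for that setting.
    for flag in ("/setacvalueindex", "/setdcvalueindex"):  # plugged in / on battery
        subprocess.run(
            ["powercfg", flag, "SCHEME_CURRENT", "SUB_PROCESSOR",
             "PROCTHROTTLEMAX", str(percent)],
            check=True,
        )
    # Re-apply the current scheme so the new value takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

if __name__ == "__main__":
    cap_max_processor_state(70)  # illustrative cap, not a recommendation
```

Capping the maximum processor state keeps the CPU off its highest boost bins, which is where most of the gaming power goes; at a GPU-limited 4K/60 target you often don't lose any frames.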
 
Reviewers use unrealistic scenarios like 720p gaming on a 4090 with uncapped framerates to report gaming power consumption. That's unrepresentative of normal gaming like what we do.
Sometimes, when I want to consume less power and lose no performance in my 60 fps 4K gaming (I also just have a 3080), I underclock via the Windows power plan options and consume 20-40 W on my 13900KF.
But Arrow Lake will be helpful for those who game on PCs they bought at Best Buy or Walmart, as those will likely perform better out of the box than anything else.
I think that was not @Pierce2623's question. If I didn't misunderstand him, what he meant was that even with an unlimited RPL chip, in gaming (not an AVX workload) the CPU doesn't come anywhere near 165 W of power consumption in the first place, so when someone claims a 165 W reduction in power consumption... it seems basically impossible?
 
@PaulAlcorn I suppose there is an error in the sentence: "In the same slide, Intel points to its strong performance advantage over the 9950X3D in a range of heavily-threaded productivity workloads."
It should be 7950X3D or 9950X, I suppose.
 
Well at least it didn’t lose performance like Arrow Lake😱
Arrow Lake didn't lose performance. At 125 W it's as fast as the 14900K at 250 W, according to the slides.
How can a CPU offer 165 W lower system power in gaming? My 8700K/5900X/7700X chips have never used anywhere close to 100 W in gaming. Do modern Intel chips use hundreds of watts in gaming?
Modern big AMD and Intel chips consume over 100 watts in plenty of games, provided you have a fast enough GPU.
 
Good question... my 14900K during flight simulation (a CPU bottleneck) consumes ~100 W according to HWiNFO, and only runs up to the 253 W limit during AVX workloads like CB R23... My initial impression was that the GPU eats up most of the power consumption, but wait, if the GPU works just as hard in both setups, how can you get below the power draw of the comparison CPU?...
Try other games? The Last of Us casually hits 200 watts at stock.
 
False.

The 7800X3D uses under 60 W.

Regards.
The 7800X3D is not a big CPU; it's just an 8-core chip. The 7950X can casually hit 150 W in heavy games.

So what kind of argument is that? That's like me saying Intel CPUs can't hit 200 W in games because my i3-12100 sits at 30 watts. Nonsense.
 
I find it extremely curious that, after decades of separate development, the two companies have ended up with two almost identical CPUs in overall performance and with so little difference in power usage.
For sure, the use of the same node is an important factor; nevertheless, it is a curious coincidence.
 
The 7800X3D is not a big CPU; it's just an 8-core chip. The 7950X can casually hit 150 W in heavy games.

So what kind of argument is that? That's like me saying Intel CPUs can't hit 200 W in games because my i3-12100 sits at 30 watts. Nonsense.
https://www.techpowerup.com/review/amd-ryzen-9-7950x/24.html

https://tpucdn.com/review/amd-ryzen-9-7950x/images/power-games.png

And I am using the 7950X because, from memory, it uses slightly more power than the 9950X.

Regards.
 
I find it extremely curious that, after decades of separate development, the two companies have ended up with two almost identical CPUs in overall performance and with so little difference in power usage.
For sure, the use of the same node is an important factor; nevertheless, it is a curious coincidence.
Nah, not really. It's called a duopoly. They are both releasing the worst possible products that can still get some sales. They know roughly where each other's performance is going to land.
 
Try other games? The Last of Us casually hits 200 watts at stock.
So, try some basic maths: assume the RPL chip keeps hitting its 253 W limit, then 253 - 165 = 88 W. Do you think that's possible at all in the same demanding game? For system power, the one way it could be true that I can think of is if ARL actually bottlenecks the GPU, so it doesn't drink as much power... but well, that would be worse than it simply being a false figure.
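To spell the arithmetic out, here's a quick sketch that takes Intel's claimed 165 W figure at face value and, as the most generous possible assumption, a 14900K pinned at its 253 W limit while gaming:

```python
# Sanity check of the "165 W lower system power" gaming claim, assuming the
# GPU and rest-of-system draw are identical on both platforms.
rpl_cpu_w = 253        # 14900K pinned at its PL2 limit (best case for the claim)
claimed_delta_w = 165  # Intel's claimed whole-system power reduction
arl_cpu_w = rpl_cpu_w - claimed_delta_w
print(f"Arrow Lake CPU would have to draw at most {arl_cpu_w} W in the same game")
# With a more realistic ~100 W gaming draw on the 14900K side, the claim would
# imply negative CPU power, so something else in the comparison must differ.
```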
 
So, try some basic maths: assume the RPL chip keeps hitting its 253 W limit, then 253 - 165 = 88 W. Do you think that's possible at all in the same demanding game?
I've already said in a previous comment that I don't think the 165 W reduction is possible and that their graph is messed up. What I'm disagreeing with is your argument that the 14900K doesn't go above 100 W in games, which is absolutely not the case. It can very easily (I've done it) hit 200 W at stock.
 
Here you go, live gameplay: when he drops the resolution to 1080p, boom, it hits 150 W easily. Man, just stop, you are wrong, move on, it's fine.

https://www.youtube.com/watch?v=PgnoVz3ufj8
So, just for comparison's sake, how much do the 13900/14900 K(S) siblings use in it? Also, where are the testing conditions of that benchmark pass?

It's also interesting that TPU has CP2077 using 85 W for the 7950X and not 150 W. So there's contradictory data, and I'll take TPU's over that video.

Now, to be fair to you, the 9950X actually uses more power, and it is indeed around 100 W on average, with CP2077 consuming ~115 W.

https://www.techpowerup.com/review/amd-ryzen-9-9950x/23.html

Regards.
 
I've already said in a previous comment that I don't think the 165 W reduction is possible and that their graph is messed up. What I'm disagreeing with is your argument that the 14900K doesn't go above 100 W in games, which is absolutely not the case. It can very easily (I've done it) hit 200 W at stock.
I didn't say the 14900K doesn't go above 100 W in games at all, if you understand ENGLISH. I said that, IME, CPU-bottlenecked MSFS consumes ~100 W (meaning it fluctuates somewhere in the 90s to low 100s of watts), and that is why a 165 W reduction is the thing in question as being impossible. Your reading is as flawed as always.