News Detailed specs for a dozen Intel Arrow Lake desktop CPUs leak ahead of touted October 10 launch date


Neilbob

Distinguished
Mar 31, 2014
266
350
19,720
I really do hope that the power consumption of these is much improved, and is at least comparable to AMD; I'm here for that. I am very tired of the ridiculous performance-at-any-cost approach Intel has been taking the last few years (AMD too, to a lesser extent, in case anyone thinks I'm being biased). They started going all in with the 9900K, and it's been downhill ever since.

And in case some people emerge to blather about Raptor Lake: yes, I know that efficiency can be drastically improved by tweaking clock/voltage settings and yada yada, etc, but my interest in faffing about with such settings has evaporated in recent years - and the enormous majority of users wouldn't even know whether or how to do it. If some people want to jigger about with overclocking and voltages, then let them, but default, out-of-the-box characteristics are far more important, and I'd much rather take a 5% performance hit in order to save 30% or more power (for example; I don't know the exact numbers).

I don't want to see yet more outrageous power consumption bars in the reviews. Crossed fingers I'm not being naïve here.
 
Mar 12, 2024
17
28
40
I really do hope that the power consumption of these is much improved, and is at least comparable to AMD; I'm here for that. I am very tired of the ridiculous performance-at-any-cost approach Intel has been taking the last few years (AMD too, to a lesser extent, in case anyone thinks I'm being biased). They started going all in with the 9900K, and it's been downhill ever since.

And in case some people emerge to blather about Raptor Lake: yes, I know that efficiency can be drastically improved by tweaking clock/voltage settings and yada yada, etc, but my interest in faffing about with such settings has evaporated in recent years - and the enormous majority of users wouldn't even know whether or how to do it. If some people want to jigger about with overclocking and voltages, then let them, but default, out-of-the-box characteristics are far more important, and I'd much rather take a 5% performance hit in order to save 30% or more power (for example; I don't know the exact numbers).

I don't want to see yet more outrageous power consumption bars in the reviews. Crossed fingers I'm not being naïve here.
It's a new node, 20A, or ~2nm, which should be close to the same density as TSMC's leading node. I would expect efficiency to be considerably improved over Intel 7, the node used for 12th through 14th gen.
 

bit_user

Titan
Ambassador
The article said:
Can Intel rise up from the ashes with its Arrow Lake chips?
Ashes??? They might be on fire, but they're certainly not a heap of ashes!

I really do hope that the power consumption of these is much improved,
...
I don't want to see yet more outrageous power consumption bars in the reviews. Crossed fingers I'm not being naïve here.
If not, there's always Bartlett Lake, said to launch in early 2025 with up to 12 P-cores.

BTW, if we do see something like a climb-down from high TDPs, perhaps it'll be more due to things like practical limits on thermal density holding a lid on clock speeds, than either company deciding to go Eco.
 
Last edited:

bit_user

Titan
Ambassador
Integrated GPU means what? Graphics? NPU? Both? Neither? What?
Good question. The F models normally lack graphics, but whether or not they'll retain the NPU remains to be seen.

If I had to guess, I'd say probably no NPU. The reason is that you pretty much only buy an F model if you've got a dGPU, and those have even higher AI performance. At that point, the integrated NPU becomes unnecessary. So, it would make sense for Intel to lump together all dies with defects in either the iGPU or NPU as "F" models.

Even the -5 level chips seem like plenty to me. What are they thinking?
These core counts match those of Raptor Lake S. The thread counts are lower, due to the loss of hyperthreading. So, they certainly can't afford to step back on core counts, or else they'd really be hurting on MT performance.
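For what it's worth, here's a quick back-of-envelope sketch of that thread math (a minimal Python illustration assuming the leaked 8P+16E flagship configuration; thread_count is just a made-up helper for this post):

```python
def thread_count(p_cores: int, e_cores: int, smt: bool) -> int:
    # P-cores contribute 2 threads each when Hyper-Threading is enabled, 1 without;
    # E-cores have no SMT, so they always contribute 1 thread each.
    return p_cores * (2 if smt else 1) + e_cores

print(thread_count(8, 16, smt=True))   # Raptor Lake S flagship (8P+16E): 32 threads
print(thread_count(8, 16, smt=False))  # leaked Arrow Lake S flagship (8P+16E): 24 threads
```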
 
Intel CPUs without graphics are a no-go for me.

- You can drive a full HD display at 165 Hz with less than 1 W
- Using the iGPU, you can shave almost 10 W off the RTX 4060 Ti 16GB, taking it from 17 W to 7 W at idle
- AMD or NVIDIA can't drive a display at 165 Hz or more without blowing the power budget: 40 W or more for nothing
- It works with some software to record your gameplay, offloading some CPU usage
- It can display multiple video wallpapers without draining your house through the wall
- You can use Intel Quick Sync to speed up some jobs

PS - For AMD reasons, and only AMD reasons, this doesn't work well with AMD cards.
 

bit_user

Titan
Ambassador
Intel CPUs without graphics are a no-go for me.

- You can drive a full HD display at 165 Hz with less than 1 W
I'm sure that if you measured your PC's power at the wall in display power-saving mode vs. showing the desktop, the delta would be more than 1 W. The reason you shouldn't look at just package power or self-reported GPU power is that those figures don't capture all of the components that a discrete GPU measurement would, including RAM and the PHY.

That's about what I'm seeing, even on a lowly Alder Lake N97. In fact, I'm seeing about a 1.3 - 1.5 W delta and it has just single-channel DDR5-4800 and is currently hooked up to a 1080p monitor @ 60 Hz.
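If you want to see what the self-reported number actually is, here's a minimal sketch of how I'd read package power on Linux (assuming the intel_rapl powercap driver is loaded and the package domain sits at intel-rapl:0; the path varies by system). A wall meter will read higher than this, because it also sees DRAM, VRM losses, fans, storage and the display PHY:

```python
import time

# Assumed sysfs path for the package-0 RAPL domain on a typical Linux system.
RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def package_power(interval_s: float = 5.0) -> float:
    """Average self-reported package power over interval_s, in watts."""
    with open(RAPL_ENERGY) as f:
        e0 = int(f.read())
    time.sleep(interval_s)
    with open(RAPL_ENERGY) as f:
        e1 = int(f.read())
    # energy_uj is a cumulative counter in microjoules; wraparound is ignored here.
    return (e1 - e0) / 1e6 / interval_s

print(f"Package power: {package_power():.2f} W")
```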

- AMD or NVIDIA can't drive a display at 165 Hz or more without blowing the power budget: 40 W or more for nothing
Well...

Meh, out of all the titles they have listed for RT, I only have 2, Spider-Man Remastered and Miles Morales, and they both play fine for me at 3440x1440 UW with RT on.

"4080 Super used just 16.5W while idle, 20.8W while playing back a 4K AV1 video using VLC, and 31.0W when playing the same video directly on YouTube. The 7900 XTX used 19.1W while idle, 57.8W while playing our test video in VLC, and 96.8W when viewing the video on YouTube."

Idle power draw will depend on monitor setup.

I idle at 7 watts on my 7900 XTX with a single display, not the 19 watts you are seeing in this review.

YouTube video playback for me sits at 32 watts, not three times that like the 96.8 watts you are seeing....

VLC video playback for me on my system is 26 watts.

Jarred subsequently clarified that his 19.1 W figure from the RX 7900 XTX was for a 4K display running at 144 Hz. Makaveli said the 7 W figure was for 3440x1440 @ 144 Hz.

Both are far below your claim of 40 W, and both of those displays have a higher pixel scanout rate than 2560x1440 @ 165 Hz. That's also a big GPU with lots of GDDR6 and L3 cache; the numbers should scale down pretty well for smaller GPUs.
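For the record, the rough pixel scanout rates (ignoring blanking intervals) stack up like this; a trivial sketch, just to show both quoted setups push more pixels per second than 2560x1440 @ 165 Hz:

```python
def mpix_per_s(width: int, height: int, hz: int) -> float:
    # Raw pixels pushed per second, in megapixels (blanking intervals ignored).
    return width * height * hz / 1e6

print(mpix_per_s(2560, 1440, 165))  # ~608 Mpix/s -- the 1440p @ 165 Hz case
print(mpix_per_s(3440, 1440, 144))  # ~713 Mpix/s -- Makaveli's ultrawide @ 144 Hz
print(mpix_per_s(3840, 2160, 144))  # ~1194 Mpix/s -- Jarred's 4K @ 144 Hz
```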
 
Last edited:

rluker5

Distinguished
Jun 23, 2014
913
594
19,760
I think the large increase in base clocks at the same power, combined with a reduction in boost clocks, is evidence that the power consumption of the unlocked chips will be significantly lower at stock settings and max usage.

Not proof, just supporting evidence.
 
The thing I'll be looking for when these come out is the performance scaling as the power increases. While I would never let a CPU off of its leash, RPL is interesting in that, once you reach a certain point, performance scales linearly with increased power consumption (it's not what I would consider worthwhile, as it is awful efficiency-wise). If ARL can maintain higher clocks within power limits, similar to what AMD has done with Zen 4/5, then it ought to be a winner.
 
  • Like
Reactions: KyaraM

Thunder64

Distinguished
Mar 8, 2016
207
302
18,960
Intel CPUs without graphics are a no-go for me.

- You can drive a full HD display at 165 Hz with less than 1 W
- Using the iGPU, you can shave almost 10 W off the RTX 4060 Ti 16GB, taking it from 17 W to 7 W at idle
- AMD or NVIDIA can't drive a display at 165 Hz or more without blowing the power budget: 40 W or more for nothing
- It works with some software to record your gameplay, offloading some CPU usage
- It can display multiple video wallpapers without draining your house through the wall
- You can use Intel Quick Sync to speed up some jobs

PS - For AMD reasons, and only AMD reasons, this doesn't work well with AMD cards.

I never understood why someone would save $30 on an F CPU when you give up so much. An iGPU is serviceable if you are without a video card for some reason, or if you end up selling the system or giving it away. Also, Quick Sync. Maybe not as useful these days as in the past, but still a nice feature for only a little more money.
 

baboma

Respectable
Nov 3, 2022
284
338
2,070
>I really do hope that the power consumption of these is much improved

It helps to understand the circumstances. Intel was stuck using its 10nm ESF process node for ADL/RPL, because that's all its fabs could produce, whereas Ryzen 7000 could avail itself of TSMC's 5nm (Ryzen 5000 uses TSMC 7nm). The only way to offset the node disadvantage was to use more power.

In fact, it was shocking how superior ADL was to Ryz 5000 despite its inferior node, thanks to its hybrid P/E arch. This was above and beyond the higher power level.

Arrow Lake will be on Intel 20A (2nm). By that alone, ARL will be more efficient.

>And in case some people emerge to blather about Raptor Lake, yes, I know that efficiency can be drastically improved by tweaking clock/voltage settings and yada yada, etc, but my interest in faffing about with such settings has evaporated in recent years

Just go into settings and use the Intel Default Settings if it's not already the default. You can use either the Baseline (for efficiency) or Performance. For the top SKU, there's also the Extreme profile. All motherboards should have these by now.

But even before that, it's no harder to change PL1/PL2 to Intel's defaults than it is to enable XMP mode. There's no need to get into the weeds. One of the defining characteristics of DIY is the ability to configure your system's CMOS settings.
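If you'd rather not even reboot into the BIOS, the limits are visible from the OS too. A minimal sketch, assuming Linux with the intel_rapl powercap driver (sysfs paths vary by board, writing requires root, and the 125 W figure below is just a placeholder, not Intel's spec for any particular SKU):

```python
# Assumed layout for the package-0 RAPL domain: constraint_0 is the
# long-term limit (PL1) and constraint_1 is the short-term limit (PL2).
BASE = "/sys/class/powercap/intel-rapl:0"

def read_watts(path: str) -> float:
    with open(path) as f:
        return int(f.read()) / 1e6  # microwatts -> watts

pl1 = read_watts(f"{BASE}/constraint_0_power_limit_uw")
pl2 = read_watts(f"{BASE}/constraint_1_power_limit_uw")
print(f"PL1 = {pl1:.0f} W, PL2 = {pl2:.0f} W")

# Example only: cap PL1 at a placeholder 125 W (run as root)
# with open(f"{BASE}/constraint_0_power_limit_uw", "w") as f:
#     f.write(str(125 * 10**6))
```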

>I don't want to see yet more outrageous power consumption bars in the reviews.

Yes, that's one of several goofy benchmarking practices I don't agree with: using the unlimited power profile that most motherboards have as default for "benchmarking wins." The rationale is that HW sites want to reflect real-world use, since most people would just use whatever power profile is the default. But that "real-world" rationale doesn't hold water when sites then use a 4090 to mitigate GPU bottlenecks.

The RPL instability problem would be a blessing in disguise if it makes this idiotic practice of "unlimited power profile" benchmarking go away.


>Integrated GPU means what? Graphics? NPU? Both? Neither? What?

ARL will be getting Alchemist (Xe) cores. Desktop SKUs get 64 EUs (4 Xe cores), half of what the mobile parts get. The lowest SKU (225) gets 32 EUs, or 2 Xe cores. As with Ryzen 9K, no NPU for ARL for this generation.
 
Last edited:
  • Like
Reactions: KyaraM and vongole

bit_user

Titan
Ambassador
I never understood why someone would save $30 on an F CPU when you give up so much.
I wouldn't, but I can imagine two scenarios where someone might. The first would be a system builder who's trying to cut costs to offer more competitive prices or increase their margins. The second is a DIY builder who's already stretching, just to afford the F model.

Yeah, having the iGPU is a nice fallback, and useful for the other things you mentioned, but it's not strictly necessary if you know you're going to have a dGPU. For years, people with HEDT CPUs and (pre-7000) chiplet-based Ryzen systems have gotten by alright without an iGPU.
 

bit_user

Titan
Ambassador
Intel was stuck with using its 10nm ESF process node for ADL/RPL, because that's what its fabs could only produce, whereas Ryzen 7000 can avail of TSMC's 5nm (Ryzen 5000 uses TSMC 7nm). The only way to offset the node disadvantage is to use more power.
Intel can & did use more die area, which is something they could more easily afford to do as a result of their vertical integration (i.e. they own the fabs that make those CPUs).

In fact, it was shocking how superior ADL was to Ryz 5000 despite its inferior node,
OMG, node names...

Intel renamed their 10 nm ESF to Intel 7 precisely because they felt it should be comparable to TSMC N7.

Arrow Lake will be on Intel 20A (2nm). By that alone, ARL will be more efficient.
Well, it's not like microarchitecture or clockspeed have nothing to do with it!

no NPU for ARL for this generation.
What's your source on that? It contradicts the leaks I've seen, although I didn't put a lot of effort into searching.
 
Last edited:
@bit_user the RX 6700 XT draws 40 W with one screen at 165 Hz. The RTX 4060 Ti hit 35 W because the 16GB of memory woke up!

I have my computer plugged into one of those watt meters, so I see every little change on the fly :)
And 40 W is the entire machine working.
My computer idles at 50 W at the wall.
It's hard to keep the draw as low as possible.

Intel iGPUs work wonders for efficiency!
 

baboma

Respectable
Nov 3, 2022
284
338
2,070
>I never understood why someone would save $30 on an F CPU when you give up so much.

Many people go by what HW sites recommend. For Intel, KF parts are in vogue for gaming rigs.

It's similar to saying "get the 7800X3D for best gaming performance." While generally true, it comes with a sizeable caveat: it assumes you already have a top-end GPU.

DIY PC has varying levels of proficiency, with attendant trade-offs. If you're an enthusiast who spends your free time frequenting HW sites for the latest PC tidbits (read: everyone here), then you're more able to understand the nuances. You trade time for knowledge. More "casual" DIYers only look at the bottom line recommendations--KF is best, X3D is best, etc--for their buying guide. More superficial knowledge, but also less time invested.
 
  • Like
Reactions: KyaraM

bit_user

Titan
Ambassador
Intel iGPUs work wonders for efficiency!
I agree. I'm just saying it's definitely more than 1W for a full-sized desktop CPU, given my little N97 takes more than that for lower-res, lower refresh.

Also, a bigger, more powerful GPU will always idle higher. It's a bit like complaining about the gas mileage of a big rig truck when it's running without a trailer. You don't buy them for that - you use them to carry a load that you couldn't haul otherwise. So, someone not needing a big dGPU probably shouldn't use one. Simple as that.

Finally, don't get me wrong: I do like Intel iGPUs. I've used them as my only GPU in 4 different PCs I've owned since Sandy Bridge, with only one of them being a laptop.
 
  • Like
Reactions: KyaraM and Amdlova

baboma

Respectable
Nov 3, 2022
284
338
2,070
>>no NPU for ARL for this generation.

>What's your source on that?

Same as yours and the sites'--Videocardz, Wccftech, etc, which in turn get theirs from the usual suspects on X.

I inferred "no NPU" by its omission from the latest leaks. If it were of any significance, it would've been mentioned.

Per Googling, apparently earlier leaks said ARL will get an NPU with 13 TOPS. That would make sense: since ARL inherited MTL's Alchemist iGPU, it probably inherited MTL's NPU as well.

Either way, it's insignificant enough to be viewed as "vestigial." ARL's NPU in 2025 will be as useful as MTL's in 2024.
 
  • Like
Reactions: KyaraM and bit_user

TheHerald

Respectable
BANNED
Feb 15, 2024
1,630
502
2,060
I really do hope that the power consumption of these is much improved, and is at least comparable to AMD; I'm here for that. I am very tired of the ridiculous performance-at-any-cost approach Intel has been taking the last few years (AMD too, to a lesser extent, in case anyone thinks I'm being biased). They started going all in with the 9900K, and it's been downhill ever since.

And in case some people emerge to blather about Raptor Lake: yes, I know that efficiency can be drastically improved by tweaking clock/voltage settings and yada yada, etc, but my interest in faffing about with such settings has evaporated in recent years - and the enormous majority of users wouldn't even know whether or how to do it. If some people want to jigger about with overclocking and voltages, then let them, but default, out-of-the-box characteristics are far more important, and I'd much rather take a 5% performance hit in order to save 30% or more power (for example; I don't know the exact numbers).

I don't want to see yet more outrageous power consumption bars in the reviews. Crossed fingers I'm not being naïve here.
The non-K chips are at 65 W and 35 W respectively. They've been around for I don't even know how many generations now. People are just complaining for the sake of complaining, honestly.
 
  • Like
Reactions: KyaraM and Amdlova

DasLooney

Distinguished
Dec 31, 2007
3
0
18,510
Why is no news or tech site talking about how the upcoming Intel chips will have the same fatal flaws as the past two generations of unstable chips?!
 

Sippincider

Reputable
Apr 21, 2020
154
115
4,760
Why do they need a dozen desktop CPUs of the same generation? Binning?

Why not focus on being competitive with a small handful, esp. when you’ve struggled like Intel has?
 
  • Like
Reactions: JRStern

JRStern

Distinguished
Mar 20, 2017
177
67
18,660
Why do they need a dozen desktop CPUs of the same generation? Binning?

Why not focus on being competitive with a small handful, esp. when you’ve struggled like Intel has?

It's a marketing thang.
The varieties are simple to generate, and maybe one will turn out to be super-popular.
Meanwhile, the variety lets every manufacturer focus on a different one, so they can avoid head-to-head comparisons, and everyone can have one or two strong points.
As a customer I find it more irritating than otherwise, but perhaps it's a good thing; indeed, I may prefer an option that others might not, that even Intel might not.
 
  • Like
Reactions: KyaraM