News Qualcomm says its Snapdragon Elite benchmarks show Intel didn't tell the whole story in its Lunar Lake marketing


TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Battery life, you should be aware, has a complex relationship to energy efficiency. Things can easily be fudged. One way to give a glowing impression of energy efficiency is to use a large battery, but a long battery life in that context obviously won't necessarily indicate frugal use of power. Actually, that isn't the trick being used by thin-and-light laptop manufacturers equipping their laptops with Core Ultra chips. The trick they use is to radically reduce performance when the laptop isn't connected to the power cable. So, you do get long battery life with Lunar Lake, but you don't get high performance during battery discharge. Nice trick, and a very effective deception that makes a computer that isn't especially energy efficient seem as if it is.
YouTube video playback, for example, can't be cheated: all chips perform the same task. Still, Lunar Lake is winning. E.g.

View: https://youtu.be/CxAMD6i5dVc?t=414
 
Sep 6, 2024
6
7
15
If you think LL is more efficient, continue to do so and hope for a bright future.
You're repeating something that hasn't been current or true since Meteor Lake. Check various sites for performance comparisons of Lunar Lake on battery vs. on mains power, and you'll be surprised. Sometimes it's faster on battery.
 
  • Like
Reactions: shady28

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
If you think LL is more efficient, continue to do so and hope for a bright future.
It's not about what I think; what I think is irrelevant. Tests reveal that it's high up the charts in battery life, especially on the everyday tasks that thin-and-light laptops are mainly used for.

For example, Notebookcheck tested under PCMark 10; sadly, they didn't compare against an ARM chip but against the HX 370 from AMD. The AMD chip used 55% more power. Insanity.
 
  • Like
Reactions: rluker5

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
Lunar Lake is not a higher-spec chip. It's literally meant for everyday usage and battery life. Arrow Lake will be the high-spec chip, and it will be a lot faster than the Elite.
I don't think you're quite getting my point. It is simply an absurd and unreasonably self-serving proposition that a low power Snapdragon Elite SoC (2nd generation Oryon core based mobile chip) should perform at the level of a Core Ultra 7 laptop chip. Shockingly, when the Snapdragon Elite does perform at that level (at least for short bursts), advocates of the virtues of Intel processors recalibrate their requirements and demand that the low power mobile chip perform at the level of a much more power hungry Intel Arrow Lake desktop processor. Do you know how ridiculous that sounds?
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
I don't think you're quite getting my point. It is simply an absurd and unreasonably self-serving proposition that a low power Snapdragon Elite SoC (2nd generation Oryon core based mobile chip) should perform at the level of a Core Ultra 7 laptop chip. Shockingly, when the Snapdragon Elite does perform at that level (at least for short bursts), advocates of the virtues of Intel processors recalibrate their requirements and demand that the low power mobile chip perform at the level of a much more power hungry Intel Arrow Lake desktop processor. Do you know how ridiculous that sounds?
According to Notebookcheck, the X1E-78 consumes over twice the power of the 258V under load; what low power are you talking about?

https://www.notebookcheck.net/Asus-...ay-laptop-with-Intel-Lunar-Lake.892978.0.html

Scroll down to the battery runtime under load graph and you'll see it.
 
Oct 22, 2024
2
3
15
Historically, QC has tweaked the benchmarks and essentially offers an overclocked perf mode just for these benchmarks. QC engineers have always told OEMs never to run this mode as default. If a customer runs it in this mode for normal use, I guarantee the chip will overheat and throttle down 50% after a few minutes to save itself, and battery life takes a dump.
 
  • Like
Reactions: cyrusfox

dimar

Distinguished
Mar 30, 2009
1,103
94
19,360
I don't share that disposition. What matters to me is speed and energy efficiency and a suitable array of software that meets my needs and runs well. That some old software applications only run in emulation or don't run at all doesn't concern me in the slightest. So, as long as there are large numbers of people like me, people like you with your particular disposition will not represent a significant obstacle to the success of the Oryon/ARM architecture (on smartphones or laptops).

Also, remember, we are talking about the 8 Elite chip here. The only choice is between the Android apps that it (or other licensed ARM hardware) runs and the iOS apps that Apple's custom ARM silicon runs. Worries about x86 backward compatibility never arise for smartphone owners.
Until Qualcomm starts providing 10 years of Android driver support, I will not support their mobile products either.
 
  • Like
Reactions: CelicaGT
And what are those benefits? For the consumer, that is.
Please don't say battery life or price, because they aren't (at least not now).
If you're gonna quote, quote the whole sentence, not a snippet, because the sarcasm there (though admittedly subtle) flew right over your head. The finisher, "An example might be Windows on ARM with all the benefits that may or may not provide as an incentive for consumer or corporate buy in.", is a little tongue-in-cheek about how customers may not get a choice in some brackets. We're seeing this all the time in other industries, and even in this one. I DON'T WANT IT EITHER. But I am pointing out to you guys that an ARM transition in the PC space has a probability much greater than zero.
 

user7007

Commendable
Mar 9, 2022
45
33
1,560
The problem is the market never asked for such a thing. No one (in market-significant quantities) asked for Windows on non-x86. No one wants Windows as an OS (and especially not Windows 11) – people have to use it for compatibility with their software. So the whole Windows-on-ARM push makes sense only for MS and Qualcomm, not the customers.


Apple is a very different case. Apple completely controls the software ecosystem, so the actual switch was from "old-Apple" to "new-Apple". They could've switched to anything other than ARM and nothing would have changed.
I like Windows, tbh. I use Linux extensively for servers and all sorts of things, but Windows desktop works pretty well for me. It's not flawless, of course, but it's decent. I'd use Windows on ARM once things are sorted out. I'm not interested in beta testing it for MS and Qualcomm, though. Somebody else can do that nonsense.
 
Aug 18, 2024
31
11
35
Both Intel and AMD had better be working on ARM offerings; this is coming whether we like it or not. The marketing engine is strong this time around, and Qualcomm isn't to be trifled with.
ISA has nothing to do with performance from a purely architectural perspective; both are old, mature instruction sets. The only thing that changes the story is compatibility, where x86 currently has a win on its hands. Qualcomm has a competent architecture. Great. Now they need to work on compatibility.
 
Aug 18, 2024
31
11
35
Battery life, you should be aware, has a complex relationship to energy efficiency. Things can easily be fudged. One way to give a glowing impression of energy efficiency is to use a large battery, but a long battery life in that context obviously won't necessarily indicate frugal use of power. Actually, that isn't the trick being used by thin-and-light laptop manufacturers equipping their laptops with Core Ultra chips. The trick they use is to radically reduce performance when the laptop isn't connected to the power cable. So, you do get long battery life with Lunar Lake, but you don't get high performance during battery discharge. Nice trick, and a very effective deception that makes a computer that isn't especially energy efficient seem as if it is.
There's a fix for this. It's called "efficiency testing", or "normalizing for battery size", or even just literally "measuring power usage".
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
According to Notebookcheck, the X1E-78 consumes over twice the power of the 258V under load; what low power are you talking about?

https://www.notebookcheck.net/Asus-...ay-laptop-with-Intel-Lunar-Lake.892978.0.html

Scroll down to the battery runtime under load graph and you'll see it.
You need to carefully read what I say. Qualcomm has just announced and released the Snapdragon 8 Elite (a mobile SoC using Oryon Gen 2 cores on a 3nm TSMC process node). Qualcomm has made big claims about the performance and the energy efficiency of the SoC and early testing confirms that it is both faster (in bursts) and more energy efficient than every current Intel CPU including Lunar Lake processors. I never referred to the older X Elite chip (Gen 1 Oryon cores built on a 4nm TSMC process node) that you seem to think is the state of the art at Qualcomm.

I will review Notebookcheck's moot claims about the earlier generation Oryon chip, but the specific claim you stress does not square with other data out there, which indicates rather similar power-performance characteristics for Oryon Gen 1 and Lunar Lake.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
there's a fix for this. it's called "efficiency testing" or "normalizing for battery size" or even just literally "measuring power usage".
Yes, but that doesn't change anything I said. If you run a battery discharge test and battery A discharges faster than battery B (after normalisation), that is only a meaningful result if there is little discrepancy in processing load/intensity as the battery discharges. The meaningful criterion is units of workload executed per mW or W of power consumed. I trust you grasp that. With Lunar Lake laptops dropping the intensity of processing (by a lot) while disconnected from power, and Snapdragon laptops continuing to operate as normal without curtailing performance while disconnected from power, there is a big discrepancy in processing intensity between Lunar Lake and Snapdragon laptops as their batteries drain.
 

setx

Distinguished
Dec 10, 2014
263
233
19,060
If you're gonna quote, quote the whole sentence, not a snippet, because the sarcasm there (though admittedly subtle) flew right over your head.
You can't get the context from a short quote and require a wall of text?
I understood you well, while you've completely missed my intent: the word 'benefits' in that sentence should be 'drawbacks', because there aren't any benefits at all.

But I am pointing out to you guys that an ARM transition in the PC space has a probability of much greater than zero.
ARM has been trying that for many years already, to the point that people are tired of reading the same thing again and again.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
Until Qualcomm starts providing 10 years of Android driver support, I will not support their mobile products either.
So, I reiterate everything I have already said. Do not imagine that your personal disposition on this (which you repeatedly draw attention to in your comments) matters much to those who clearly think differently to you. Also, it should be obvious that Qualcomm, for a very long time, has been writing all of the board level and kernel driver software necessary to permit the full use of the features of its SoCs in Android smartphones. That has happened in a way that evidently doesn't feel right to you, but it has happened.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
YouTube video playback, for example, can't be cheated: all chips perform the same task. Still, Lunar Lake is winning. E.g.

View: https://youtu.be/CxAMD6i5dVc?t=414
Take two fully charged computers with exactly the same battery capacity. Run both machines with some high load workload. The workload must produce partial results - rendering a predetermined list of scenes or objects at given resolution and colour settings, say - that are written out to memory. Will the winner of that test always be the machine that lasts the longest on battery? No, not if you understand what you are testing. More on that shortly.

Energy efficiency is greatest when a common unit of work consumes the least amount of power (mW, W, or charge from batteries of equivalent capacities), but if work is being done at a quicker rate by one of the two computers under test, then a relatively shortened battery life will not necessarily imply lesser energy efficiency. The winning computer in the energy efficiency test scenario that I just set out will be the computer that renders more scenes/objects before its battery dies.

Perceptive reviewers have noted that processing intensity is reduced (by a lot) for Lunar Lake laptops when unconnected to power, viz. they get throttled, and not at all for the X Elite. The video isn't lying but you are making dubious inferences from what you are seeing. We need a sophisticated test to assess energy efficiency in this situation.

Of course, it is possible to throttle your way to efficiency (to remain always within the more optimally efficient part of the power-performance curve for a processor). Doing that, though, creates another problem - battery life is extended by doing as little as possible and wasting a lot of time.
 
Aug 18, 2024
31
11
35
Take two fully charged computers with exactly the same battery capacity. Run both machines with some high load workload. The workload must produce partial results - rendering a predetermined list of scenes or objects at given resolution and colour settings, say - that are written out to memory. Will the winner of that test always be the machine that lasts the longest on battery? No, not if you understand what you are testing. More on that shortly.

Energy efficiency is greatest when a common unit of work consumes the least amount of power (mW, W, or charge from batteries of equivalent capacities), but if work is being done at a quicker rate by one of the two computers under test, then a relatively shortened battery life will not necessarily imply lesser energy efficiency. The winning computer in the energy efficiency test scenario that I just set out will be the computer that renders more scenes/objects before its battery dies.

Perceptive reviewers have noted that processing intensity is reduced (by a lot) for Lunar Lake laptops when unconnected to power, viz. they get throttled, and not at all for the X Elite. The video isn't lying but you are making dubious inferences from what you are seeing. We need a sophisticated test to assess energy efficiency in this situation.

Of course, it is possible to throttle your way to efficiency (to remain always within the more optimally efficient part of the power-performance curve for a processor). Doing that, though, creates another problem - battery life is extended by doing as little as possible and wasting a lot of time.
But what exactly is your point? That energy efficiency involves more than just work done per unit of power used? Are you saying that total time to complete the work also matters? Because that's certainly the case, although it's more of a scientific metric than one that "end users" actually care about.
 
Aug 18, 2024
31
11
35
>>Notably, Qualcomm isn't using the top configuration of the Intel Core Ultra Series 2 platform, the Core Ultra 9 288V [in its benchmark comparisons]<<

And, why would Qualcomm do that? Actually, wouldn't it be crazy to do that? Qualcomm is arguably putting itself at a big disadvantage to compare its mobile chip with a high spec Lunar Lake laptop chip. Must Qualcomm's low power mobile chip beat Intel's fastest Lunar Lake laptop chip on every performance and efficiency measure before it is considered worthy of notice?
No, it's not a disadvantage. Lunar Lake competes in a different, albeit overlapping, swimlane than SDXE. LNL is designed for 9-15 W devices, whereas SDXE (similar to STX) is designed for 28-65-ish watt power envelopes. You've got it flipped: SDXE is not a low-power chip. The fact that they're being directly compared is only because ARL-H and -HX haven't been released on the Intel side. Purwa is maybe comparable to LNL in some scenarios because it's an 8-core design like LNL, but it really is designed for less expensive laptops, not necessarily low-power ones.
 

Nick_C

Distinguished
Apr 20, 2007
110
25
18,720
Both Intel and AMD had better be working on ARM offerings; this is coming whether we like it or not. The marketing engine is strong this time around, and Qualcomm isn't to be trifled with.
If they are, then they may be the only two of the three to do so in the long term, as Arm has just issued Qualcomm a 60-day notice of license cancellation.
 

systemBuilder_49

Distinguished
Dec 9, 2010
101
35
18,620
Qualcomm would like to remind you that they ARE STILL better than Lunar Lake.
AMD would like to remind you that they ARE STILL better than Lunar Lake.
Apple would like to remind you that they ARE STILL better than Lunar Lake.

With all the hype train behind Intel, sowing FUD and excitement about a not-very-exciting Intel SlowBook processor series (4 cores / 8 threads in 2024? WTF, are you kidding me?), the Lunar Lake chips have taken Intel from #4 in processor efficiency all the way up to #4 in processor efficiency. In other words, they move the needle NONE.