News Qualcomm says its Snapdragon Elite benchmarks show Intel didn't tell the whole story in its Lunar Lake marketing


mikeztm

Distinguished
Feb 15, 2012
13
6
18,515
No, they are not? Lunar Lake leads almost every chart I've seen on battery life.
X Elite is about 20% more energy efficient than LNL, and Apple M3 is about double the efficiency. The reason Intel LNL leads the chart is because they tested light/video playback scenario instead of heavy load, and that is not a CPU efficiency benchmark but a SoC idle/codec/screen power benchmark.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
You're aware that this has literally nothing to do with the CPU right?
Energy efficiency was the matter at issue here not CPU performance. Maybe you missed that. Throttling extends battery life. Would you agree with that? Well guess what, when disconnected from a power cable Lunar Lake laptops throttle performance (by a lot) and as a consequence post impressive battery life results. That is a trivial win that says very little about energy efficiency.
 
Energy efficiency was the matter at issue here not CPU performance. Maybe you missed that. Throttling extends battery life. Would you agree with that? Well guess what, when disconnected from a power cable Lunar Lake laptops throttle performance (by a lot) and as a consequence post impressive battery life results. That is a trivial win that says very little about energy efficiency.
Guess who couldn't be bothered to actually watch the video. Here, since you clearly didn't (hint: everything drops on battery, some much worse than others, and it's not Intel):
[attached image: iGGpmvu.jpeg]
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
But what exactly is your point? That energy efficiency contains more than just work done/power used? Are you saying that total time to complete the work also matters? Because that's certainly the case, although it's more of a scientific metric than one that "end users" actually care about.
Okay, you should read what I wrote again. Rather than going over that ground, just let me ask this question (which has an obvious answer): does a battery life test that offers, as its result, the time it took for the battery to deplete (while executing some workload) tell you how much work was completed in the test? That's right, it doesn't. So what I am saying, which is rather obvious, is that it is utterly trivial to say that a computer is doing some work or did some work during a test like this, when what matters (because energy efficiency matters) is HOW MUCH work gets done relative to how much energy gets consumed, viz. amount of work executed/Wh.

Now, for two laptops of equal battery capacity (whether they have very similar or very different CPUs) running some workload that drains the battery, having the longer battery life is not the right criterion for determining which laptop or CPU is more efficient. The right criterion is the AMOUNT OF WORK completed before battery depletion. And yet, while these battery life tests tell us nothing useful about actual energy efficiency, commentators offer them up as if that were their purpose.
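To put that criterion in concrete terms, here is a minimal Python sketch (all figures invented purely for illustration) of two laptops with the same battery capacity looping the same workload: the one that lasts longer on the clock is not the one that does more work per watt-hour.

```python
# Hypothetical numbers, purely for illustration: two laptops with the same
# battery capacity loop the same workload until the battery is depleted.
def efficiency(work_units: int, battery_wh: float) -> float:
    """Energy efficiency as units of work completed per watt-hour."""
    return work_units / battery_wh

BATTERY_WH = 60.0  # identical capacity for both machines

# Laptop A throttles on battery: it runs longer but completes fewer loops.
a_hours, a_loops = 14.0, 280
# Laptop B runs at full speed: it dies sooner but completes more loops.
b_hours, b_loops = 11.0, 330

print(f"A: {a_hours} h to empty, {efficiency(a_loops, BATTERY_WH):.2f} loops/Wh")
print(f"B: {b_hours} h to empty, {efficiency(b_loops, BATTERY_WH):.2f} loops/Wh")
# A wins the battery life test; B is nonetheless the more efficient machine.
```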

As it happens, battery life extension is one of the important benefits of CPU efficiency gains. The overidentification of energy efficiency with battery life, however, can leave consumers open to certain manipulative business practices. Lunar Lake laptops throttle when disconnected from the power cable. Why? Is the objective to keep the CPU in the most efficient part of the power-performance curve, or does Intel just want to win the battery life test? Almost certainly, the latter is the lead consideration. It is hardly as if Intel doesn't understand these things. It could specify a more capacious battery, run the laptop at undiminished speed while disconnected from power, and still win the battery life test if it wanted to. But instead Intel and its OEMs pretend that Lunar Lake laptops will deliver the trifecta: thin and light, long battery life and high performance, all provided by the one compact unit. Only, these laptops get throttled when disconnected from power, hobbling performance but helping Lunar Lake over the line as the battery life winner (and, via false association, being accorded the status of energy efficiency leader as well).
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
No, it's not a disadvantage. Lunar Lake competes in a different, albeit overlapping, swimlane than SDXE. LNL is designed for 9-15W devices, whereas SDXE (similar to STX) is designed for 28-65ish watt power envelopes. You've got it flipped, SDXE is not a low-power chip. The fact that they're being directly compared is only because ARL-H and -HX haven't been released on the Intel side. Purwa is maybe comparable to LNL in some scenarios because it's an 8 core design like LNL, but it really is designed for less expensive laptops, not necessarily low power ones.
You should probably read all of my comments here if you really want to know where I'm coming from. But broadly, I think you are wrong about most of what you just said. The Oryon core based Snapdragon (now in its second generation) isn't about observing market segmentation lines that Intel has laid down over the years for its own benefit; it is about upending those lines and thereby taking business from incumbents whose products service the ill-formed market they created, and who for that reason are open to challenge.

If ARM's experience is anything to go by (ARM is probably a good touchstone because, globally, more of its licensed cores get incorporated into products each year than any other type, and because it licenses cores catering to the full spectrum of computer, equipment and appliance manufacturers), the only significant demarcation line that rests on more than vendor-defined and vendor-maintained market segments is the line between consumer computing and commercial computing (for lack of better terms). In each case the demands of the sphere weigh heavily on core design (and much else). For this reason Qualcomm won't be making a server chip any time soon - it doesn't have the necessary license from ARM - but Intel should expect that Qualcomm will take it on everywhere else, across laptops and all the way up to the upper reaches of performance desktops.

Many things remain unknown, of course. Will Qualcomm embrace gaming? It will need partners if it does. Will Qualcomm embrace standards from the PC world and start making mainboards? I doubt it but they might create some useful royalty free standards of their own.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
X Elite is about 20% more energy efficient than LNL, and Apple M3 is about double the efficiency. The reason Intel LNL leads the chart is because they tested light/video playback scenario instead of heavy load, and that is not a CPU efficiency benchmark but a SoC idle/codec/screen power benchmark.
And a heavy load scenario isn't an efficiency benchmark either; the CPU that is locked to the lowest power will win.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
Does a battery life test that offers, as its result, the time it took for the battery to deplete (while executing some workload) tell you how much work was completed in the test? That's right, it doesn't
You are wrong, it does. Video playback, the one that had the highest runtime, also did the most work.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
Guess who couldn't be bothered to actually watch the video. Here, since you clearly didn't (hint: everything drops on battery, some much worse than others, and it's not Intel):
[attached image: iGGpmvu.jpeg]
Objectively, what the Hardware Canucks video shows is a top-three finish for the X Elite laptop in all battery tests except one. The broad picture is that the X Elite laptop returns results rather similar to the best Lunar Lake laptop. Apple's M3 laptop, it should be said, sets the standard in battery life, not Lunar Lake or the X Elite.

The problematic test for the X Elite is the video playback test on VLC. The same problem isn't apparent when streaming video on YouTube (at 4K, just like the VLC test). Don't you think that's somewhat strange? Part of the explanation is that VLC runs in emulation on the X Elite. If you're going to use VLC...yawn...on an X Elite laptop you should be aware of that.

None of these tests constitute genuine energy efficiency tests. Admittedly, for a computer of a given architecture that performs and behaves much like other computers built to the same spec, it is possible, using a controlled set-up, to get a comparative sense of energy efficiency by running almost any program continuously (assuming that works) until the computer stops. In such a test the computer that stays alive the longest wins. But the time on the stopwatch can be deceptive. The laptop that stays alive the longest won't always be the one that completes the greatest amount of work. The greatest amount of (useful) work completed for a given apportionment of charge is what matters.

Energy efficiency can be determined precisely as long as we look at it in terms of units of work completed/Wh. But we don't see efficiency tests like that, sadly. At the very least, a genuine energy efficiency test would involve counting significant results/units of work, which could be math calculations or fully rendered scenes or many other things, in a manner somewhat akin to a benchmark, with the test ending at the point that a predetermined charge limit is expended. There is nothing wrong with battery tests that give results in minutes and seconds, but we shouldn't burden those results with meanings and associations that they don't support.
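A counting-based test of that kind could be sketched in a few lines of Python. Note that `run_benchmark` and `discharged_wh` here are hypothetical stand-ins for a real workload and real battery instrumentation, not actual APIs:

```python
# Sketch of the efficiency test described above: loop a fixed workload
# until a predetermined charge budget is spent, then report work per Wh.
# Both callables are hypothetical stand-ins, not real APIs.
def efficiency_test(run_benchmark, discharged_wh, charge_budget_wh):
    """Count completed benchmark runs per watt-hour of charge expended."""
    runs = 0
    while discharged_wh() < charge_budget_wh:
        run_benchmark()  # one complete pass of the workload
        runs += 1
    return runs / charge_budget_wh  # units of work per Wh

# Simulated usage: pretend each run drains 2.5 Wh; stop at a 50 Wh budget.
drained = {"wh": 0.0}
result = efficiency_test(
    run_benchmark=lambda: drained.__setitem__("wh", drained["wh"] + 2.5),
    discharged_wh=lambda: drained["wh"],
    charge_budget_wh=50.0,
)
print(f"{result:.2f} runs/Wh")  # 20 runs over 50 Wh -> 0.40 runs/Wh
```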
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
You are wrong, it does. Video playback, the one that had the highest runtime also did the most work
I would agree that video playback is a special case but I hold to my claim viewed more broadly (than the way you evidently would like to view it).

My response to thestryker moments ago covers this point.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
I would agree that video playback is a special case but I hold to my claim viewed more broadly (than the way you evidently would like to view it).

My response to thestryker moments ago covers this point.
Don't you agree that PCMark is the most useful thing to run on thin and light laptops, since thin and lights are usually used for everyday workloads (Excel, browsing, etc.) instead of running Cinebench on a loop? Because in PCMark 10 Lunar Lake does exceptionally well; sadly we don't have numbers from the ARM competitors, but the AMD chip consumes 47% more power.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
Anyone shocked by the claim that Apple and Qualcomm processors lead in performance and energy efficiency in consumer computing should just ponder Pat Gelsinger's recent, unexpected public show of unity with AMD. Pat is quaking in his boots, and what has him scared is ARM architecture processors. AMD and Nvidia too, of course, but ARM most of all.

While the Intel faithful still want to believe that all is well, Intel's processors oxidise their way to an early death. Meanwhile, consumers genuinely concerned about energy efficiency have left the Intel tent and turned their attention to Apple's and Qualcomm's most up-to-date processors. Others, somewhat concerned about energy efficiency but more particularly concerned about great gaming performance at a reasonable price, buy AMD.
 
Objectively, what the Hardware Canucks video shows is a top-three finish for the X Elite laptop in all battery tests except one. The broad picture is that the X Elite laptop returns results rather similar to the best Lunar Lake laptop. Apple's M3 laptop, it should be said, sets the standard in battery life, not Lunar Lake or the X Elite.

The problematic test for the X Elite is the video playback test on VLC. The same problem isn't apparent when streaming video on YouTube (at 4K, just like the VLC test). Don't you think that's somewhat strange? Part of the explanation is that VLC runs in emulation on the X Elite. If you're going to use VLC...yawn...on an X Elite laptop you should be aware of that.

None of these tests constitute genuine energy efficiency tests. Admittedly, for a computer of a given architecture that performs and behaves much like other computers built to the same spec, it is possible, using a controlled set-up, to get a comparative sense of energy efficiency by running almost any program continuously (assuming that works) until the computer stops. In such a test the computer that stays alive the longest wins. But the time on the stopwatch can be deceptive. The laptop that stays alive the longest won't always be the one that completes the greatest amount of work. The greatest amount of (useful) work completed for a given apportionment of charge is what matters.

Energy efficiency can be determined precisely as long as we look at it in terms of units of work completed/Wh. But we don't see efficiency tests like that, sadly. At the very least, a genuine energy efficiency test would involve counting significant results/units of work, which could be math calculations or fully rendered scenes or many other things, in a manner somewhat akin to a benchmark, with the test ending at the point that a predetermined charge limit is expended. There is nothing wrong with battery tests that give results in minutes and seconds, but we shouldn't burden those results with meanings and associations that they don't support.
You wrote a giant wall of text completely avoiding the point being made: you keep claiming that on battery LNL tanks performance but the X Elite doesn't, and this shows that you're wrong.
 

TheHerald

Respectable
BANNED
Feb 15, 2024
1,633
501
2,060
but more particularly concerned about great gaming performance at a reasonable price, buy AMD.
Yeah, no. I keep reading this and it drives me bonkers. Obviously you are talking about the X3D. It offers 10% more gaming performance at 50% (or even 150%) extra price compared to a normal AMD or Intel chip (7600, 7700, 13600KF etc.). People who make these kinds of claims are flat out lying; the X3D is 2.5 times the price of the 7600 for 10% extra performance in any realistic gaming scenario.

That's less reasonable than a 4090, mind you.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
Don't you agree that PCmark is the most useful thing to run on thin and light laptops, since usually thin and lights are used for everyday workloads (excel, browsing, etc.) instead of running cinebench on a loop? Cause in PCmark10 lunar lake does exceptional, sadly we don't have numbers from the arm competitors but the amd chip consumes 47% more power.
Yes, kind of. The looping isn't really a problem, as I see it, but PCMark 10 would probably constitute a more telling workload for most computer users because it better approximates their usage. And I take your point: if there is a way to very accurately measure CPU power draw (and thus energy consumption) while running a test workload, you don't really have to count results, such as runs, because you already have the information that would be returned by the kind of testing I have spoken about.

I think it would be wise to conduct higher-load tests than PCMark alone, though. Why? Because it is important to know how a computer behaves in challenging situations when a lot is going on. Looping, in such a high-load testing scenario, adds to the challenge, generating data on performance drops and the efficiency compromises (over extended periods) that ideally we should be aware of.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
You wrote a giant wall of text completely avoiding the point being made: you keep claiming that on battery LNL tanks performance but X Elite doesn't and this shows that you're wrong.
That's not true. You just don't like hearing that your understanding of energy efficiency is skewed and false. Evidently, you also don't like how easy it is to point out the holes in the 'inefficient X Elite' story. In a discussion primarily centred on energy efficiency you haven't even ventured to define what that is.

I suspect you particularly don't like hearing that Apple's and Qualcomm's latest cores (you keep referring to the Gen 1 Oryon based X Elite blissfully unaware that Gen 2 devices are already in the hands of testers, that benchmark results have already been posted and that consumer devices using the Gen 2 SoCs are releasing this month) are far superior to Intel's Lunar puddle in both performance and energy efficiency. The thing is they are!
 
Aug 18, 2024
31
11
35
(you keep referring to the Gen 1 Oryon based X Elite blissfully unaware that Gen 2 devices are already in the hands of testers, that benchmark results have already been posted and that consumer devices using the Gen 2 SoCs are releasing this month) are far superior to Intel's Lunar puddle in both performance and energy efficiency. The thing is they are!
Oryon 1.5 is only in phones (yes, it's a 1.5: minor uarch tweaks and a new node, not a 2.0)
 
Aug 18, 2024
31
11
35
Energy efficiency was the matter at issue here not CPU performance. Maybe you missed that. Throttling extends battery life. Would you agree with that? Well guess what, when disconnected from a power cable Lunar Lake laptops throttle performance (by a lot) and as a consequence post impressive battery life results. That is a trivial win that says very little about energy efficiency.
to solve this you... test energy efficiency. Man, I don't know what you're trying to argue here. Is SDXE more efficient in Cinebench 2024 nT? Yes! It is! It's a 12 core design vs. an 8 core one, LNL is disadvantaged. If you want to toot your horn about the ARM revolution, please do so once ARL-H comes out. I will gladly eat my words if it fails to at least match SDXE in Cinebench 2024 nT efficiency.
 

ChrisGX

Reputable
Jan 28, 2020
20
5
4,515
to solve this you... test energy efficiency. Man, I don't know what you're trying to argue here. Is SDXE more efficient in Cinebench 2024 nT? Yes! It is! It's a 12 core design vs. an 8 core one, LNL is disadvantaged. If you want to toot your horn about the ARM revolution, please do so once ARL-H comes out. I will gladly eat my words if it fails to at least match SDXE in Cinebench 2024 nT efficiency.
Test energy efficiency...hmm. Do you imagine that, without further specification, those three words strung together have a meaning that amounts to anything? Be precise: set out the test, the testing conditions, and any controls or normalisations needed to stop the results being rubbish.

Forget computing for a moment, and imagine a test of the efficiency of a number of petrol powered vehicles. We should definitely set a strict requirement that the test be conducted using an exactly equal amount of fuel. Similarly, the test should be run on the same road/test track, and climatic conditions should be roughly the same. The cars, whatever other differences they may have between them, should be the same weight when carrying the driver and fuel. There is no need for any requirements on aerodynamics, because if poor aerodynamics compromises fuel economy, so be it. Tyres, though, should be as similar as possible without compromising designed vehicle gearing.

For the sake of this test there is no need to look under the bonnet. The displacement of the engine simply doesn't matter. Whether the engine is a piston engine or a rotary engine likewise doesn't matter. Every driver is familiar with the notion of conserving fuel, but they are permitted to exceed marked speed limits if they choose to. Contrariwise, they may elect to drive in a leisurely fashion if they prefer, but not too leisurely, because the rule of the road that says you shouldn't unnecessarily hold up other traffic does apply here.

So, subject to all of these conditions, what would taking top spot in the fuel economy test look like? The winner would be the car that goes the furthest down the road, thus exhibiting the highest km/l ratio. That car might be the one that keeps driving the longest before running out of fuel, or it might not be. Top place goes to the vehicle that goes further. Energy efficiency in the described scenario comes down to the distance travelled on a given provision of fuel. That is what matters, not the time it took for the fuel to run out.
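The fuel economy scenario can be put in numbers (figures invented purely for illustration): the winner is the car that covers the most kilometres on the common fuel allotment, not the one whose tank lasts longest on the clock.

```python
# Illustrative figures only: every car starts with the same 40 l of fuel.
FUEL_L = 40.0

cars = {
    "leisurely": {"hours_to_empty": 9.0, "km_travelled": 560.0},
    "brisk": {"hours_to_empty": 6.5, "km_travelled": 640.0},
}

for name, c in cars.items():
    km_per_l = c["km_travelled"] / FUEL_L  # the metric that decides the test
    print(f"{name}: {c['hours_to_empty']} h to empty, {km_per_l:.1f} km/l")
# 'leisurely' keeps driving the longest; 'brisk' posts the better km/l
# ratio and therefore wins the efficiency test.
```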

Similarly, the energy efficiency of a computing chip is not determined by the time it takes that chip to deplete a battery but rather by how much work is completed while the battery charge drains to empty. The measure of energy efficiency isn't the battery run down time but rather the ratio of units of computing work executed/Wh.

Units of computing work executed might be something like repeated runs of a demanding benchmark processed on a loop. Think of a benchmark like SPEC CPU 2017 or Linpack. These benchmarks will run at different execution rates on different chips. A high performance chip will execute these workloads more quickly than a low power chip, but high performance chips and low power chips aren't all born equal. Some will exhibit better energy efficiency than others. Apple silicon, for instance, is both performant and energy efficient. At 10W, Apple's M3 offers extraordinary levels of performance at great energy efficiency. Apple tablets and laptops don't throttle much when disconnected from a power cable. Lunar Lake processors, though, do throttle once disconnected from power - they have to, because operating at full speed they would run their batteries down at a rapid rate. The battery run down times of a Lunar Lake laptop and an M3 based laptop (with a smaller battery) are in the same ballpark, but the M3 laptop gets a lot more work done as it runs down, making it a much more energy efficient device.

If you want to continue this I would ask that you try to offer something of substance - a definition of energy efficiency, perhaps, or a specific criticism of a point that I have made - rather than offering bland and circular incantations that lead nowhere.