> No they are not? Lunar Lake leads almost every chart I've seen on battery life.

If you think LL is more efficient, continue to do so and hope for a bright future.
> Battery life, you should be aware, has a complex relationship to energy efficiency. Things can easily be fudged. One way to give a glowing impression of energy efficiency is to use a large battery, but a long battery life in that context obviously won't necessarily indicate frugal use of power. Actually, that isn't the trick being used by the thin-and-light laptop manufacturers equipping their laptops with Core Ultra chips. The trick they use is to radically reduce performance when the laptop isn't connected to the power cable. So you do get long battery life with Lunar Lake, but you don't get high performance during battery discharge. A nice trick, and a very effective deception that makes a computer that isn't especially energy efficient seem as if it is.

YouTube video playback, for example, can't be cheated; all chips perform the same task. Still, Lunar Lake is winning. E.g.: https://youtu.be/CxAMD6i5dVc?t=414
> If you think LL is more efficient, continue to do so and hope for a bright future.

You're repeating something that hasn't been current or true since Meteor Lake. Check various sites for performance comparisons of Lunar Lake on battery vs. mains power and you'll be surprised. Sometimes it's faster on battery.
> If you think LL is more efficient, continue to do so and hope for a bright future.

It's not about what I think; what I think is irrelevant. Tests show that it's high up the charts in battery life, especially on the everyday tasks that thin-and-light laptops are mainly used for.
> Lunar Lake is not a higher-spec chip. It's literally meant for everyday usage and battery life. Arrow Lake will be the high-spec chip, and it will be a lot faster than the Elite.

I don't think you're quite getting my point. It is simply an absurd and unreasonably self-serving proposition that a low-power Snapdragon Elite SoC (a mobile chip based on 2nd-generation Oryon cores) should perform at the level of a Core Ultra 7 laptop chip. Shockingly, when the Snapdragon Elite does perform at that level (at least in short bursts), advocates of the virtues of Intel processors recalibrate their requirements and demand that the low-power mobile chip perform at the level of a much more power-hungry Intel Arrow Lake desktop processor. Do you know how ridiculous that sounds?
> I don't think you're quite getting my point. It is simply an absurd and unreasonably self-serving proposition that a low-power Snapdragon Elite SoC should perform at the level of a Core Ultra 7 laptop chip. [...]

According to Notebookcheck, the X1E-78 consumes over twice the power of the 258V under load. What low power are you talking about?
> I don't share that disposition. What matters to me is speed, energy efficiency, and a suitable array of software that meets my needs and runs well. That some old software applications only run in emulation, or don't run at all, doesn't concern me in the slightest. So, as long as there are large numbers of people like me, people like you, with your particular disposition, will not represent a significant obstacle to the success of the Oryon/ARM architecture (on smartphones or laptops).
>
> Also, remember, we are talking about the 8 Elite chip here. The only choice is between the Android apps that it (or other licensed ARM hardware) runs and the iOS apps that Apple's custom ARM silicon runs. Worrying about x86 backward compatibility never arises for smartphone owners.

Until Qualcomm starts providing Android drivers for 10 years, I will not support their mobile products either.
> And what are those benefits, for the consumer? Please don't say battery life or price, because they aren't (at least not now).

If you're gonna quote, quote the whole sentence, not a snippet, because the sarcasm there (though admittedly subtle) flew right over your head. The finisher, "An example might be Windows on ARM with all the benefits that may or may not provide as an incentive for consumer or corporate buy-in," is a little tongue-in-cheek about how customers may not get a choice in some brackets. We're seeing this all the time in other industries, and even in this one. I DON'T WANT IT EITHER. But I am pointing out to you guys that an ARM transition in the PC space has a probability much greater than zero.
> Until Qualcomm starts providing Android drivers for 10 years, I will not support their mobile products either.

Agreed.
> The problem is the market never asked for such a thing. No one (in market-significant quantities) asked for Windows on non-x86. No one wants Windows as an OS (and especially Windows 11) – people have to use it for compatibility with their software. So the whole Windows-on-ARM push makes sense only for MS and Qualcomm, not for customers.
>
> Apple is a very different case. Apple completely controls the software ecosystem, so the actual switch was from "old Apple" to "new Apple". They could have switched to anything other than ARM and nothing would have changed.

I like Windows, tbh. I use Linux extensively for servers and all sorts of things, but Windows desktop works pretty well for me. It's not flawless, of course, but it's decent. I'd use Windows on ARM once things are sorted out. I'm not interested in beta testing it for MS and Qualcomm, though. Somebody else can do that nonsense.
> Both Intel and AMD had better be working on ARM offerings; this is coming whether we like it or not. The marketing engine is strong this time around, and Qualcomm isn't to be trifled with.

ISA has nothing to do with performance from a purely architectural perspective; both are decades-old instruction sets. The only thing that changes the story is compatibility, where x86 currently has a win on its hands. Qualcomm has a competent architecture. Great. Now they need to work on compatibility.
> Battery life, you should be aware, has a complex relationship to energy efficiency. Things can easily be fudged. [...]

There's a fix for this. It's called "efficiency testing", or "normalizing for battery size", or even just literally "measuring power usage".
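To make that concrete, here's a minimal sketch of normalizing for battery size. All capacities and runtimes are invented placeholder numbers, and the laptop names are hypothetical, not measurements of any real machine:

```python
# Toy example of normalizing battery-life results for battery size.
# All figures are invented for illustration.
laptops = {
    # name: (battery capacity in Wh, measured runtime in hours)
    "Laptop A": (70.0, 14.0),
    "Laptop B": (55.0, 12.0),
}

for name, (battery_wh, runtime_h) in laptops.items():
    avg_power_w = battery_wh / runtime_h      # average power draw in watts
    runtime_per_wh = runtime_h / battery_wh   # normalized runtime: hours per Wh
    print(f"{name}: {avg_power_w:.2f} W average draw, {runtime_per_wh:.3f} h/Wh")
```

On these invented numbers, "Laptop B" is the more frugal machine (4.58 W vs. 5.00 W) even though "Laptop A" posts the longer runtime – which is exactly the distortion that normalizing removes.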
> According to Notebookcheck, the X1E-78 consumes over twice the power of the 258V under load. What low power are you talking about?

You need to read what I say carefully. Qualcomm has just announced and released the Snapdragon 8 Elite (a mobile SoC using Oryon Gen 2 cores on a 3nm TSMC process node). Qualcomm has made big claims about the performance and energy efficiency of the SoC, and early testing confirms that it is both faster (in bursts) and more energy efficient than every current Intel CPU, including Lunar Lake processors. I never referred to the older X Elite chip (Gen 1 Oryon cores built on a 4nm TSMC node), which you seem to think is the state of the art at Qualcomm.
https://www.notebookcheck.net/Asus-...ay-laptop-with-Intel-Lunar-Lake.892978.0.html
Scroll down to the battery runtime under load graph and you'll see it.
> There's a fix for this. It's called "efficiency testing", or "normalizing for battery size", or even just literally "measuring power usage".

Yes, but that doesn't change anything I said. If you run a battery discharge test and battery A discharges faster than battery B (after normalisation), that is only a meaningful result if there is little discrepancy in processing load/intensity as the batteries discharge. The meaningful criterion is units of workload executed per joule (or watt-hour) of energy consumed. I trust you grasp that. With Lunar Lake laptops dropping their processing intensity (by a lot) while disconnected from power, and Snapdragon laptops continuing to operate as normal without curtailing performance while disconnected from power, there is a big discrepancy in processing intensity between Lunar Lake and Snapdragon laptops as their batteries drain.
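A rough sketch of that failure mode, with invented work rates and power draws (nothing here is a real measurement): give two machines identical 60 Wh batteries and let one quietly drop to a fraction of its workload on battery. The throttled machine tops the runtime chart while doing less work per watt-hour:

```python
# Toy discharge test: identical 60 Wh batteries, different behaviour on battery.
# All numbers are invented for illustration.
BATTERY_WH = 60.0

def discharge_result(work_rate_per_h, power_draw_w):
    """Runtime until empty, and total work completed in that time."""
    runtime_h = BATTERY_WH / power_draw_w
    total_work = work_rate_per_h * runtime_h
    return runtime_h, total_work

# Machine that throttles on battery: 40% of the work rate at 50% of the power.
throttled = discharge_result(work_rate_per_h=40, power_draw_w=5.0)
# Machine that keeps running at full intensity on battery.
unthrottled = discharge_result(work_rate_per_h=100, power_draw_w=10.0)

for name, (runtime_h, work) in [("throttled", throttled), ("unthrottled", unthrottled)]:
    print(f"{name}: {runtime_h:.1f} h runtime, {work / BATTERY_WH:.2f} work units per Wh")
```

Here the throttled machine lasts 12 hours to the other's 6, yet delivers 8 work units per Wh against 10 – longer battery life, worse energy efficiency.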
> If you're gonna quote, quote the whole sentence, not a snippet, because the sarcasm there (though admittedly subtle) flew right over your head.

You can't get the context from a short quote, and you require a wall of text?
> But I am pointing out to you guys that an ARM transition in the PC space has a probability much greater than zero.

ARM has been doing that for many years already, to the point where people are tired of reading the same thing again and again.
> Until Qualcomm starts providing Android drivers for 10 years, I will not support their mobile products either.

So, I reiterate everything I have already said. Do not imagine that your personal disposition on this (which you repeatedly draw attention to in your comments) matters much to those who clearly think differently from you. Also, it should be obvious that Qualcomm has, for a very long time, been writing all of the board-level and kernel driver software necessary to permit full use of the features of its SoCs in Android smartphones. That has happened in a way that evidently doesn't feel right to you, but it has happened.
> YouTube video playback, for example, can't be cheated; all chips perform the same task. Still, Lunar Lake is winning. E.g.:
> https://youtu.be/CxAMD6i5dVc?t=414

Take two fully charged computers with exactly the same battery capacity. Run both machines with some high-load workload. The workload must be able to produce partial results – rendering a predetermined list of scenes or objects at given resolution and colour settings, say – with those results being written out to memory. Will the winner of that test always be the machine that lasts longest on battery? No, not if you understand what you are testing. More on that shortly.
> Perceptive reviewers have noted that processing intensity is reduced (by a lot) for Lunar Lake laptops when unconnected to power, viz. they get throttled, and not at all for the X Elite.

You're aware that this has literally nothing to do with the CPU, right?
> Take two fully charged computers with exactly the same battery capacity. Run both machines with some high-load workload. [...] Will the winner of that test always be the machine that lasts longest on battery? No, not if you understand what you are testing.

But what exactly is your point? That energy efficiency involves more than just work done per unit of energy used? Are you saying that total time to complete the work also matters? Because that's certainly the case, although it's more of a scientific metric than one that "end users" actually care about.
Energy efficiency is greatest when a common unit of work consumes the least amount of energy (joules, watt-hours, or charge from batteries of equivalent capacity), but if work is being done at a quicker rate by one of the two computers under test, then a relatively shortened battery life will not necessarily imply lower energy efficiency. The winning computer in the energy-efficiency test scenario that I just set out will be the computer that renders more scenes/objects before its battery dies.
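As a sketch of that scoring rule (all figures invented, machine names hypothetical), the deciding number is scenes rendered before the battery dies, not hours survived:

```python
# Toy version of the render-until-empty test described above.
# Both machines start with the same 60 Wh battery; figures are invented.
BATTERY_WH = 60.0

machines = {
    # name: (scenes rendered per hour, average power draw in watts)
    "Machine A": (120, 12.0),  # faster and hungrier
    "Machine B": (70, 8.0),    # slower and more frugal per hour
}

for name, (scenes_per_h, power_w) in machines.items():
    runtime_h = BATTERY_WH / power_w
    scenes_total = scenes_per_h * runtime_h  # score: scenes before battery dies
    print(f"{name}: battery lasts {runtime_h:.1f} h, "
          f"renders {scenes_total:.0f} scenes ({scenes_total / BATTERY_WH:.1f}/Wh)")
```

Machine A drains its battery in 5 hours to Machine B's 7.5, yet renders 600 scenes to B's 525 – so, on these made-up numbers, the machine with the shorter battery life wins the efficiency test.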
Perceptive reviewers have noted that processing intensity is reduced (by a lot) for Lunar Lake laptops when unconnected to power, viz. they get throttled, and not at all for the X Elite. The video isn't lying, but you are making dubious inferences from what you are seeing. We need a sophisticated test to assess energy efficiency in this situation.
Of course, it is possible to throttle your way to efficiency (to remain always within the more optimally efficient part of the processor's power-performance curve). Doing that, though, creates another problem: battery life is extended by doing as little as possible and wasting a lot of time.
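A crude model of that trade-off, assuming (hypothetically) that dynamic power scales roughly with the cube of clock speed while throughput scales linearly – the baseline figures below are invented:

```python
# Crude DVFS trade-off model: assume power ~ f^3 (a common first-order
# approximation) while work rate ~ f. Baseline numbers are invented.
BASE_FREQ_GHZ = 4.0
BASE_POWER_W = 20.0
BASE_TASKS_PER_H = 100.0

for scale in (1.0, 0.75, 0.5):
    power_w = BASE_POWER_W * scale ** 3      # power falls off steeply
    tasks_per_h = BASE_TASKS_PER_H * scale   # throughput falls only linearly
    wh_per_task = power_w / tasks_per_h      # energy cost per task
    hours_per_task = 1.0 / tasks_per_h       # time cost per task
    print(f"{BASE_FREQ_GHZ * scale:.1f} GHz: {wh_per_task:.3f} Wh/task, "
          f"{hours_per_task * 60:.1f} min/task")
```

In this toy model, halving the clock cuts the energy cost per task to a quarter, but each task takes twice as long – efficiency bought with wasted time, which is exactly the trade-off described above.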
> "Notably, Qualcomm isn't using the top configuration of the Intel Core Ultra Series 2 platform, the Core Ultra 9 288V [in its benchmark comparisons]"
>
> And why would Qualcomm do that? Actually, wouldn't it be crazy to do that? Qualcomm is arguably putting itself at a big disadvantage by comparing its mobile chip against a high-spec Lunar Lake laptop chip. Must Qualcomm's low-power mobile chip beat Intel's fastest Lunar Lake laptop chip on every performance and efficiency measure before it is considered worthy of notice?

No, it's not a disadvantage. Lunar Lake competes in a different, albeit overlapping, swim lane than SDXE. LNL is designed for 9-15 W devices, whereas SDXE (similar to STX) is designed for 28-65-ish watt power envelopes. You've got it flipped: SDXE is not a low-power chip. The fact that they're being directly compared is only because ARL-H and -HX haven't been released on the Intel side. Purwa is maybe comparable to LNL in some scenarios, because it's an 8-core design like LNL, but it's really designed for less expensive laptops, not necessarily low-power ones.
> Both Intel and AMD had better be working on ARM offerings; this is coming whether we like it or not. The marketing engine is strong this time around, and Qualcomm isn't to be trifled with.

If they are, they may be the only two of the three to do so in the long term, as Arm has just issued Qualcomm a 60-day notice of license cancellation.
> If they are, they may be the only two of the three to do so in the long term, as Arm has just issued Qualcomm a 60-day notice of license cancellation.

OMG ?!?
> OMG ?!?

Legal posturing over royalties.