
Intel: Haswell Provides 50% More Battery Life in Notebooks

I strongly suspect the difference will be more noticeable in cell phones; I doubt notebooks will be affected that much. And my notebook will still need a discrete GPU anyway, because I'm not using Intel's. I'm also a little suspicious of this claim until we see real-world testing. More likely than not, Intel means 50 percent less energy "sometimes" rather than all the time (a gross savings of 50 percent would actually require more than 50 percent savings some of the time).
 
I don't care what the idle power-saving properties of a processor are. I care about overall in-use power savings. Most processors are already as power-efficient as they need to be in idle states, but users care about usage states. Claims that the CPU will have THAT big an impact are...awaiting confirmation...

My smartphone lasts days and days without using the screen. But start using it for things like web browsing and media-watching, and you're lucky to get a day's use thanks to the screen.
 


Right. There won't be a 50% increase in battery life because, as the article mentioned, the CPU is only one component. Still, it's a pretty impressive achievement, even if the CPU-power savings only translate into a 10-15% increase in battery life overall.
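That "CPU is only one component" point is easy to put numbers on. A minimal sketch (the power shares below are my own assumed figures, not Intel's):

```python
# Rough model: if the CPU is only one slice of the total power budget,
# halving CPU power does NOT halve system power, so battery life gains
# are much smaller than 50%.

def battery_life_gain(cpu_share, cpu_saving):
    """Fractional increase in battery life.

    cpu_share:  fraction of total system power drawn by the CPU (0..1)
    cpu_saving: fractional reduction in CPU power (0..1)
    """
    new_power = (1 - cpu_share) + cpu_share * (1 - cpu_saving)
    return 1 / new_power - 1

# Assuming the CPU is ~25% of system power and Haswell halves it,
# the system as a whole only lasts ~14% longer:
gain = battery_life_gain(cpu_share=0.25, cpu_saving=0.50)
print(f"{gain:.1%}")  # ~14.3%
```

Which lines up with the 10-15% overall figure above.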
 

CCFL has been out of fashion for years already. The majority of laptops across all price segments switched to LED backlights more than a year ago.

But backlighting itself is inherently inefficient: you generate white light, then scrap about 2/3 of it by passing it through the RGB filters of each individual subpixel, and then block 50-100% of the remainder with the polarizing liquid-crystal matrix. So while a WLED backlight may be twice as efficient as CCFL, about 90% of the light produced is still wasted on average. That's not counting losses within the backlight diffuser itself, bleed around the edges, coupling losses between the LEDs and the diffuser, etc.
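A quick back-of-envelope light budget using the loss figures from the post above (RGB filters pass ~1/3; the LC/polarizer stack blocks 50-100% of what's left depending on pixel state):

```python
# Fraction of backlight output that actually exits an LCD panel,
# given per-stage transmission fractions from the post above.

def light_reaching_eye(filter_pass=1/3, lc_pass=0.5):
    """Multiply the transmission of the RGB filter stage and the
    liquid-crystal/polarizer stage."""
    return filter_pass * lc_pass

best_case = light_reaching_eye(lc_pass=0.5)  # all pixels fully open
print(f"best case: {best_case:.1%} of backlight used")  # ~16.7%
# Averaged over typical content, the LC stack passes even less,
# which is where the "~90% wasted" figure comes from.
```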

If you want to increase efficiency, you have to ditch backlighting altogether and use emissive technologies like OLED, then work on improving those technologies' efficiency, brightness, and durability.
 
Let's see now:
50% more battery life in standby
70% faster integrated GPU (maybe 100%)
5% faster CPU performance
5% better battery life while in use.
At least from a performance perspective, it doesn't look like AMD will have too much of a problem catching up, CPU or APU-wise.
 

There is a fundamental problem with that: battery chemistries have intrinsic limits so unless you can break the laws of physics, you cannot go beyond that. Also keep in mind that higher power densities usually call for more potent reactants which are more likely to spontaneously ignite or explode so this rules out a whole class of high-energy chemistries.

In other words, do not expect cost-efficient intrinsically safe batteries to get much better than they already are.

Also, even if you had an infinite-power battery, you would still want a highly power-efficient APU in your tablet - I doubt anybody would want a scorching-hot tablet with 200W being heatsinked into its chassis... even the Tegra 3's 3-4W in my Nexus 7 is enough to make the SoC area uncomfortably hot for my taste when playing games for more than a few minutes at a time.

While more battery power may be nice, I would be far more interested in reducing power draw and associated temperature gradients across the frame.
 

What kind of kool-aid are you drinking?

 
They're an attempt to throw the PC sector a lifeline as it struggles against the more power-efficient, more mobile and lightweight tablet and smartphone segments.
Yes, more power-efficient as long as you're not doing much; more mobile and lightweight, but they'll give you a sore neck if you do even the simplest productivity tasks for too long.
 