Intel Demonstrates Haswell Processors at IDF

  • Thread starter: Guest
Status: Not open for further replies.
Great that CPUs will use less power, but what about performance gains? The wild rollercoaster rush of performance I saw in my childhood has gone. I feel sad and downcast.
 
"Haswell processors will deliver about two times the performance of Ivy Bridge processors at the same power, while consuming less than half the power of Ivy Bridge processors"

So about the same overall performance can be expected...
 
[citation][nom]menigmand[/nom]Great that CPU's will use less power, but what about performance gains? The wild rollercoaster rush of performance I saw in my childhood has gone. I feel sad and downcast.[/citation]
I'm not seeing much info or claims about large performance gains like we saw with Sandy. Aside from a big leap in efficiency and a good boost to the integrated GPU, is this all Haswell will end up being about? Maybe we can lose the TIM and go back to flux-less solder for the heat spreader as well?!?!
 
[citation][nom]wowzzz[/nom]speed : 2x , tdp : 1/2 x[/citation]
For graphics it is a 2x increase at the same TDP, or the same performance at half the TDP of Ivy Bridge, which is highly impressive. But for CPU performance we are not expecting a huge increase; my shot-in-the-dark guess would be somewhere in the 5-10% range, with some aspects not changing at all, while other aspects get much larger increases.

But think about it. PC sales are down because, for the bulk of users out there, we have enough CPU power for that platform. Most people I know are more than happy with the performance of their Core 2 Duo CPUs, which are 5+ years old now (provided that they are paired with an SSD and a dedicated GPU). The demand in the market is for better battery life and performance in portable devices, smaller form factors for dedicated-use machines (smart tables to vending machines), and lower TDP for fanless operation in traditional devices.

Now I want the next gen of supercomputing just as much as anyone, but as a company they are facing a serious threat from ARM, which Haswell and Broadwell will probably remove if things go as well as Intel is hoping. Once the race to the bottom hits a major barrier, we will see a new move on the ultra, uncompromised performance end of things again. Development has always gone this way: efficient single-core development, to mass parallelism, then making that more efficient, and then even more parallelism. Right now we are on an efficiency run for the next 2-3 chips (with modest performance increases as well), but in 2-3 years we will see a performance push again.
 
There are still performance gains, usually in the 5%-10% range, but more importantly, the whole end experience is improved. They demonstrated Skyrim running on the current IVB HD 4000 graphics versus Haswell's GT3 going at 1080p on high settings. That is a pretty big jump for the bottom end.
 
[citation][nom]wowzzz[/nom]speed : 2x , tdp : 1/2 x[/citation]

You got that wrong. Please carefully read the quote again.

In a product demonstration, executive vice president David Perlmutter showed that Haswell processors will deliver about two times the performance of Ivy Bridge processors at the same power, while consuming less than half the power of Ivy Bridge processors at the same performance level.

So what it actually says is that it will have twice the performance at the same current draw as the Ivy Bridge, OR half the current draw at the same performance as the Ivy Bridge.

Oh, and TDP = Thermal Design Power; it has absolutely nothing to do with power consumption. It describes how much power the cooling system is required to dissipate at maximum current draw.
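Read as two separate operating points, the quote implies the same efficiency gain either way. A rough back-of-the-envelope sketch (illustrative normalized numbers, not Intel's figures):

```python
# Illustrative sketch: normalize Ivy Bridge to performance 1.0 at power 1.0
# and compare the two Haswell operating points quoted in the demo.
# These are made-up normalized numbers, not Intel's published figures.

def perf_per_watt(perf, watts):
    """Simple performance-per-watt figure of merit."""
    return perf / watts

ivb = perf_per_watt(1.0, 1.0)              # baseline
hsw_same_power = perf_per_watt(2.0, 1.0)   # "2x performance at the same power"
hsw_same_perf = perf_per_watt(1.0, 0.5)    # "half the power at the same performance"

# Both claims describe roughly a 2x jump in performance per watt,
# just measured at different points on the power/performance curve.
print(hsw_same_power / ivb, hsw_same_perf / ivb)  # 2.0 2.0
```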

 
I'm supposed to believe a 77W i5 Haswell will be 2x faster than a 3570K while using the same 22nm process? I really don't believe this until I see it.
 
[citation][nom]Radic4l[/nom]"Haswell processors will deliver about two times the performance of Ivy Bridge processors at the same power, while consuming less than half the power of Ivy Bridge processors"So about the same overall performance can be expected...[/citation]

I think that Intel talks about performance here a little like AMD does: I'd guess that when they talk about 2x the performance, they are referring to GPU-bound apps...
 
[citation][nom]menigmand[/nom]Great that CPU's will use less power, but what about performance gains? The wild rollercoaster rush of performance I saw in my childhood has gone. I feel sad and downcast.[/citation]Because the biggest CPU market is now mobile. We have gotten into a situation where we have "enough" performance for the casual user. Battery life is more important on mobile. Intel focuses where the money goes. Mobile = bigger market share = money.
 
Nice doublespeak spin by Intel... you really have to concentrate in order to decode what they said:

*ZERO appreciable performance improvement
*but hey, it uses less electricity

in other words... another 30 money-milking SKUs that have no value to desktop users

Monopolies suck
 
Great chip, until the computer OEMs get their hands on it! I hope Intel steps up and requires the OEMs to use the Intel-provided graphics drivers, and does not allow the OEMs to customize the HD Graphics drivers and then provide no updates! Faster graphics will be of no use if a game requires the latest driver updates and the user cannot get them from the OEMs!
 
I build new i5-2500/3500 systems for people... they are super fast, impressive. They easily smoke my OLD Q6600 running at its default clock rate. (I have other bills, a social life.) But for my needs, it still gets the job done. It boots with an X25 SSD in about 35~40 secs (vs 12~16 for an i5-3570). But with Windows 8 on my Core 2 Duo notebook (2.4GHz/4GB) and a 5400RPM HD, it boots in about 15 seconds! Wow!

Too bad the rest of Windows 8 sucks.

I plan to finally go i5-3570K before Christmas... :) Then my next upgrade will be in 4 years or so.

Win7 runs very well on 4 or even 6 year old tech. More speed is always welcome - most people won't notice it... gaming is more for consoles. So less heat and wattage = smaller form factors (tablets, I guess) is where the market is going.
 
"menigmand 09/12/2012 3:04 PM

Great that CPU's will use less power, but what about performance gains? The wild rollercoaster rush of performance I saw in my childhood has gone. I feel sad and downcast."

No kidding! I remember every new processor arch being leaps and bounds faster than the last. Just a few examples: 286 to 386 to 486 to Pentium; 68000 to 68020/30 to 040 to PowerPC 601/603/604. A 33MHz 030 might put out about 8 MIPS, but a 33MHz 040 put out almost 33 MIPS!

Nowadays, you can expect a 5-10% gain (if we're lucky) at the same clock speed when a new CPU is released. The reason for this? Companies don't change CPU designs anymore, they just "tweak" things. E.g., a Core 2 is really a heavily modified Pentium Pro from ~1995. Heavily modified, but still very similar. It took Intel until what... 2009 or so to come out with the Core i series? And even then, Core i seems to be nothing more than a Core 2 with an onboard memory controller and even more "tweaks". Hence the nearly identical int performance with HT disabled, except for memory transfers and FP.

In short, there is no competition anymore. Intel and AMD dominate, and they both see no reason to change anything, just update 20-year-old designs. I will give AMD props with Bulldozer for even trying though. In short, expect things to get even worse in the future for home computing.

Man, I miss the 80's and 90's :-(
 
[citation][nom]thecolorblue[/nom]nice doublespeak spin by intel... really have to concentrateon in order to decode what they said:*ZERO appreciable performance improvement*but hey, it uses less electricityin other words... another 30 money-milking SKUs that have no value to desktop usersMonopolies suck[/citation]

Assuming that Haswell does use half the power that Sandy does at the same CPU performance, that means that unless it is crap at stability with increased clock frequencies, K editions will not only have huge headroom, but Intel is very likely to release at least some models with CPU performance improvements over some of the Sandy and Ivy Bridge models.

There is no monopoly going on here anyway. AMD is still plenty of a presence against LGA 1155, and with AMD making great improvements on a fast, regular schedule now (assuming that they keep to it, of course), the pressure is getting to Intel. Even without resorting to the core-configuration tricks that can let the FX-81xx/82xx compete with the Sandy and Ivy Bridge i5s and i7s (even the K editions) in overclocking-versus-overclocking comparisons, AMD competes very well with stock Intel CPUs in most consumer markets.

For gaming, even the few games that are very CPU-bound and not well threaded can be handled just fine on a Phenom II or FX CPU with the CPU frequency and the CPU/NB frequency raised (the CPU/NB frequency is often overlooked because Intel doesn't have a comparable setting; it controls AMD's L3 cache frequency and can have a substantial impact on AMD's per-core and highly threaded performance). So unless games and other software come out that the current K editions struggle with, Intel's competition problem isn't even that AMD is beating the snot out of them. It's that AMD is good enough most of the time, and most games and even regular software simply don't need more CPU performance than we have available. If Intel wants people to get back into fast CPU upgrade cycles, they not only have to provide CPU performance improvements often enough, but they have to give people an incentive to upgrade.

Intel could try to work with game and software developers to get things done that take advantage of CPU performance more than most current software does: making things better threaded, widening support for extensions such as AVX, creating similarly high-performance instruction set extensions for other purposes, and more.

I can literally take a five-year-old or three/four-year-old Core 2 Quad or Phenom II x4 and do pretty much anything that most consumers do and do it well, even if I need some overclocking to do it. No game is unplayable with such CPUs and other affordable hardware. Some compression, rendering, general productivity, and much more can all be done in reasonable amounts of time. It'll take more than small or even moderate performance improvements to stop this trend.
 
[citation][nom]saturnus[/nom]Oh, and TDP = Thermal Design Power, it has absolutely nothing to do with power consumption. It describes how power the cooling system is required to diisipate at maximum current draw.[/citation]
TDP and electrical power ARE related, albeit not with a 1:1 relationship. TDP is set around the short-term average peak power the chip is expected to dissipate under normal operating conditions, and a chip cannot dissipate more heat than the electrical power fed into it. So the TDP does give a reasonably good idea of where the ceiling on electrical power might be.
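As a rough sketch of that ceiling (purely illustrative, using the stock 77W TDP of a desktop Ivy Bridge i5 as the example figure):

```python
# Rough sketch: treat a chip's TDP as an approximate ceiling on its
# sustained electrical draw (an illustrative model, not a measurement;
# real average draw is usually well below TDP).

def max_energy_kwh(tdp_watts, hours):
    """Upper bound on energy drawn over a period, assuming the chip
    never sustains more than its TDP."""
    return tdp_watts * hours / 1000.0

# A 77W-TDP Ivy Bridge i5 running flat out for 24 hours draws at most:
print(max_energy_kwh(77, 24))  # 1.848 (kWh)
```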
 
[citation][nom]saturnus[/nom]You got that wrong. Please creafully read the quote again.So what actually says is that it will have twice the performance with the same current draw as the Ivy Bridge, OR it will have half the current draw with the performance as the Ivy Bridge.Oh, and TDP = Thermal Design Power, it has absolutely nothing to do with power consumption. It describes how power the cooling system is required to diisipate at maximum current draw.[/citation]
When you have a lower TDP, it means you are converting less electrical power into heat, which means you are wasting less electricity, which leads to lower power consumption. :) Like the new lamps: they use less current because they produce less heat. You can actually touch them, but the old ones will burn you because they waste so much power as heat.
 
The performance gain here is in 'performance per watt', not 'performance per clock'. This is not about making a more powerful desktop gaming rig; rather, it brings desktop performance to notebooks and other small form-factor devices.
 
[citation][nom]shahrooz[/nom]when you have lower TDP it means you are converting less electric current into heat which means you are wasting less electric current which leads to less power consumption like the new lamps you they use less current because they produce less heat you can actually touch them but the old ones will burn you cause they waste so much of current into heat[/citation]

GTX 580 TDP = 244W
Radeon 6970 TDP = 250W
Real power consumption comparison: GTX 580 uses far more power than the 6970.

GTX 670 TDP = 170W
Radeon 7950 TDP = 200W
The Radeon 7950 is generally reported as using somewhat less power than the 670.

i5-2400 TDP = 95W
FX-4100 and FX-6100 TDP = 95W
The FX CPUs generally use somewhat more power.

TDP is a very, very rough estimate of the maximum peak power consumption, and how it is estimated can vary between companies.
 
Sorry, but I love Intel, and as far as I am concerned they have never let us down; they have always delivered solid performance and great overclocks. So when I see a CPU that will have lower power consumption and less heat with the same performance as the current generation, that means bigger and better overclocks. Anyone else think this would be the case?

Since heat is the overclocker's worst enemy, this can only be a good thing.
 
[citation][nom]Justposting12[/nom]In short, there is no competition anymore. Intel and AMD dominate, and they both see no reason to change anything, just update 20 year old designs. I will give AMD props with Bulldozer for even trying though. In short, expect things to get even worse in the future for home computing.Man, I miss the 80's and 90's :-([/citation] The 80s were the days when computing was very new, when having color dots was a big deal.
It was an exciting time; I was a kid back then. At age 14, I had a 1MHz computer with 3.5KB of RAM (not MB, not GB) and a tape drive hooked to a 19" TV (I paid $280 for a FLOPPY DRIVE in 1985). My own 7-year-old son has a quad-core 3GHz AMD with 4GB, an ATI 4670 video card and a 21" LCD monitor... He does things on his computer that I would not have imagined when I was a teenager. Needless to say, a $400 iPad is dirt cheap in my book.

I don't miss spending money on a $1000 2MB RAM expansion box. (I still have it; it's heavier than my ThinkPad.) I don't miss the instability. In the 90s, it was its own kind of bad with Windows 3.x~Win95 and crappy, unreliable PC hardware from the 386 to the P1... which started getting stable with the ATX form factor. It wasn't until XP with SP1 and 2003+ hardware that our systems started becoming rock solid.

There are still good memories with some old hardware... I still have my C=128, Amiga 1000 & 3000 and even a Mac Classic I was given. My Windows PC hardware is not kept... no nostalgia there... other than some Win98 and ME discs. Windows PCs are a commodity product; they are not built out of love or geek factor anymore (always to make money, by all means). Hence, even though Steve Jobs was an asshole, he did have a love for the tech. Microsoft has no heart or personality... Windows 7 is the closest thing they ever made that has "something". Apple is losing its personality with lawsuits... trying to win market share with BS legal actions, not technology. It's like Ford suing Chevy because the Camaro has 4 wheels like the Mustang.

I think Android has some personality because it allows rather easy ways to customize it to your own needs... in ways that iOS and WP do not.

The Desktop computer for home-use is going to fade away... the way of the Radio, land-lines, VHS, etc.

The power of a typical cellphone today exceeds that of a 2003 desktop computer. The only thing some of us will keep is storage... and NAS devices allow that. So some of us will use cloud, others their own servers to provide their own content to their phones, tablets and TVs. How many of us have a wireless keyboard and mouse being used on their HDTVs?

What drove the home PC market toward faster hardware (GPU and CPU) was games. With the lack of top-end games for PCs, as most are nothing more than ports from consoles and look no better... the need to upgrade PCs goes down. I'll never buy a $500 gaming card; it's pretty much pointless except for the few who need 3~6-monitor support.

The future is TINY, lightweight devices that work anywhere in our homes. Using a tablet is about as functional as my desktop... other than typing out stuff like this.
 
[citation][nom]bustapr[/nom]Im supposed to believe a 77w i5 haswell will be 2x faster than a 3570k while using the same 22nm process? i really dont believe this until I see it.[/citation]

No, it will be about as fast as a 3570, but it will use less power... So if you have Sandy or Ivy, you don't need Haswell, but if you are going to buy a laptop or build an HTPC, this may be your best choice!
Intel says that the general improvement will be a low double-digit number (from AnandTech), so it is not insignificant, but nothing groundbreaking either. Quite near 10% faster than Ivy Bridge.
http://www.anandtech.com/show/6277/haswell-up-to-128mb-onpackage-cache-ulv-gpu-performance-estimates
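Even a ~10% per-generation gain compounds over a few upgrade cycles. A quick sketch (the 10% figure is the ballpark estimate quoted above, not an official number):

```python
# Quick sketch: compound a rough ~10% per-generation CPU improvement.
# The 10% figure is the ballpark estimate discussed above, not official.

def compounded_gain(per_gen_gain, generations):
    """Total speedup after n generations of the same relative gain."""
    return (1.0 + per_gen_gain) ** generations

# Skipping three ~10% generations still nets about a 33% total speedup:
print(round(compounded_gain(0.10, 3), 3))  # 1.331
```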

 