Introducing Intel's 14nm Node and the Broadwell Processor

Page 3
Status
Not open for further replies.

sc14s

Reputable
Feb 21, 2014
29
0
4,530
Nope... still not good enough!
At 4.6GHz my 2700K is more than a capable CPU.
Bring on Skylake, then we'll talk.
I'm not sure what you expect from them. Physics is the issue more than anything else.
 

f-14

Distinguished
It's good to see Intel working so hard on their thermal department. Gaming is great, but you can't help feeling guilty about Mother Earth every time you fire up your PC. Meanwhile AMD innovates with a 220W processor XD

My electricity comes from a nuclear plant down the road from me. Mother Earth will be just fine. Earth was here before human beings, and Earth will go on after human beings.

“We’re so self-important. Everybody’s going to save something now. “Save the trees, save the bees, save the whales, save those snails.” And the greatest arrogance of all: save the planet. Save the planet, we don’t even know how to take care of ourselves yet. I’m tired of this sh*t. I’m tired of f***ing Earth Day. I’m tired of these self-righteous environmentalists, these white, bourgeois liberals who think the only thing wrong with this country is that there aren’t enough bicycle paths. People trying to make the world safe for Volvos. Besides, environmentalists don’t give a sh*t about the planet. Not in the abstract they don’t. You know what they’re interested in? A clean place to live. Their own habitat. They’re worried that some day in the future they might be personally inconvenienced. Narrow, unenlightened self-interest doesn’t impress me.

The planet has been through a lot worse than us. Been through earthquakes, volcanoes, plate tectonics, continental drift, solar flares, sun spots, magnetic storms, the magnetic reversal of the poles … hundreds of thousands of years of bombardment by comets and asteroids and meteors, worldwide floods, tidal waves, worldwide fires, erosion, cosmic rays, recurring ice ages … And we think some plastic bags and some aluminum cans are going to make a difference? The planet isn’t going anywhere. WE are!

We’re going away. Pack your sh*t, folks. We’re going away. And we won’t leave much of a trace, either. Maybe a little Styrofoam … The planet’ll be here and we’ll be long gone. Just another failed mutation. Just another closed-end biological mistake. An evolutionary cul-de-sac. The planet’ll shake us off like a bad case of fleas.

The planet will be here for a long, long, LONG time after we’re gone, and it will heal itself, it will cleanse itself, ’cause that’s what it does. It’s a self-correcting system. The air and the water will recover, the earth will be renewed. And if it’s true that plastic is not degradable, well, the planet will simply incorporate plastic into a new paradigm: the earth plus plastic. The earth doesn’t share our prejudice toward plastic. Plastic came out of the earth. The earth probably sees plastic as just another one of its children. Could be the only reason the earth allowed us to be spawned from it in the first place. It wanted plastic for itself. Didn’t know how to make it. Needed us. Could be the answer to our age-old egocentric philosophical question, “Why are we here?”

Plastic… a**hole.”
– George Carlin
 

Menigmand

Honorable
Jul 27, 2012
128
0
10,680


You forget that when people talk about 'saving the planet', they don't mean saving just the planet; they implicitly mean "saving the planet ... as a home for human beings".

Who cares if the earth can live on as a radioactive wasteland, home to a million happy new species of cockroaches? We need it to stay good for us humans.

And this stuff is not hippie pipe dreams, it's basic science.

Sorry for going off topic.
 

ceeblueyonder

Distinguished
Nov 6, 2011
69
0
18,630
(quoting f-14's reply above, George Carlin routine and all)

I was rooting for this quote until I realized it's not funny or even well informed. The earth might heal itself, but actual life will be dead, and it would probably take another 4 billion years before complex life forms could emerge again. And then there's the fact that the earth doesn't live forever, which that quote seems to imply. If the sun goes, the earth goes. Anyway... I digress.

I just thought the quote was going to talk more about the Intel chips and this sort of goody-two-shoes bravado, and compare it to the politics of environmentalism, which is laden with politics, money, and greed.

A 220W FX-9590 is just a proof of concept. The people who have one are not "hurting" the environment any more than someone with an Intel i3 chip.

Like, compare Intel to a smug environmentalist guy. Like Intel is some kind of computer Tesla: a CPU that is environmentally friendly and can go really fast.

I thought the quote was going to address the person's feeling of goodness and talk about how that is exactly what Intel wants you to feel.
 

daekar

Distinguished
Apr 7, 2009
83
0
18,630
All of this looks very impressive. What I would like to know is, does this mean my next phone is going to be x86 with a full copy of Windows on it? That's what I've been waiting for since I got my first smartphone.
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780


And lack of competition. Materials are already lined up for more durable circuitry at the smaller manufacturing processes, but currently there's no need to take on the extra overhead.
 

blppt

Distinguished
Jun 6, 2008
579
104
19,160
"Like, compare Intel to a smug environmentalist guy. Like Intel is some kind of computer Tesla: a CPU that is environmentally friendly and can go really fast."

The major difference here is that there are absolutely no drawbacks to the equivalent Intel chip, except the slightly higher initial price, some of which can be made up over time in savings on electric bills.

I built an all-AMD box based around the 9590 because it was the fastest AMD desktop chip available, and with the consoles currently using an 8-core setup, I thought perhaps in the future that having 8 physical integer cores (only 4 FPUs) would be more advantageous than having 4 physical cores plus Hyper-Threading. Also, I hadn't built an AMD system in a while.

Honestly, this specific CPU has been more trouble than it's worth. Knowing that it's a power monster that has wrecked cheap motherboards, I invested in a pricey ASUS CHVF-Z and the very highly rated EVGA 1300G2 power supply. Mine shipped with the OEM AMD water cooler (seems to be an Antec Kuhler H2O clone), which is not adequate to cool this CPU without the fans screaming like a vacuum cleaner. I replaced the fans with the near-silent Corsair SP120 Quiet series, which solved that problem, but my Lord, this thing still puts out heat from the side vents of my case like a blast furnace. Under a heavy load, even with water cooling, temps will be in the 70s.

Even with the considerations above, Turbo rarely (if ever) kicks in; my guess would be that the temps are too high and fall outside the acceptable Turbo thermal specs. I have also only been able to get the CPU stable at 5GHz (all cores, a very mild overclock) by pushing the Vcore to a constant 1.57, which makes things WAY too hot and causes the system to shut down from heat after a few full-core-load benchmarks.

Even *if* it didn't shut down, the best this setup will do at 5GHz is similar in benchmarks to an old Z68/2600K I have as well. Geekbench 2.1.7 (64-bit), for one example, gives me a score of 13100 with the 2600K, and the near-meltdown 9590 at 5GHz gives me 13030.

Take all that into consideration, along with the 2600K's 95W TDP and the 9590 now exceeding its 220W TDP (due to me bumping the stock Vcore to get it "stable" at 5GHz), and it makes absolutely no sense for somebody to buy the 9590, unless you feel the need to root for the underdog.
 

InvalidError

Titan
Moderator

Discovering new materials is one thing. Turning them into a cost-effective alternative is an entirely different ballgame. Tons of new materials have been discovered over time but many never made it into commercial products because no cost-effective method of manufacturing or using them was ever found.

This is a bit like OLED: the polymers themselves are cheap and easy to manufacture but no manufacturer has managed to find a reliable and cost-effective method of turning them into panels yet so OLEDs are still stuck with a huge premium on them.
 

I'd turn this around. There are a very few games that are beginning to tax even the most capable CPUs. Whether it might be due to sloppy coding is irrelevant; they are taxing.
There are NO mainstream "Office" type programs or anything else that can't run well on an older C2D or Athlon II (C2Q or Phenom II if you're also loaded down with layers of networking and security software). Since the money is in the volume part of the business, Intel has no financial reason to innovate further. They have already established performance bragging rights over AMD, so even that reason is pretty much gone. I wouldn't expect them to simply coast (a leap-frog, however unlikely, is still a possibility they won't ignore), but especially with the state of consoles, I would not expect significant improvement at the high end. I'd expect Intel to target Kabini though, as running office applications AND getting decent graphics for only 25W is a potentially huge market. A lot might depend on pricing though, where AMD still has advantages. Intel has had an i3 at 35W for a while now, but it's over $132, vs. $65 for Kabini.
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780


I'd agree, the 9590 was a wreck. Although I can say my 8320 is running just fine at stock (or even slightly overclocked to 8350 specs). These chips don't gain much from rising clock speeds; they've found that higher clocks actually hurt the FX line in productivity. You just bit into the marketing.
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780


Can't wait for (the rumored specs of) OLED!! But I agree the cost-effectiveness may be a ways out. What spurs it, though? If competition were to rise and get them scrambling for the next best thing, they'd fund their way into better production methods. Right now they're just kicking up their feet with a nice glass of <insert fav drink> and saying "Eh, wheneva!"
 

InvalidError

Titan
Moderator

Intel is not exactly doing nothing since they are spending billions on process research and production equipment upgrades each year.

The bigger problem is that there are almost no mainstream uses for any more processing power than the mainstream already has, so Intel focuses its efforts on reducing power while AMD focuses its efforts on catching up or breaking into markets Intel is under-serving.

If a killer, heavily threaded mainstream application came around that made even enthusiast CPUs crawl on their knees, AMD and Intel could easily increase computing power by trading IGP die area for extra CPU cores.

But in the current market dominated by single-threaded programs, adding cores in mainstream CPUs would be a wasted effort.
 

patrickjp93

Reputable
Jun 1, 2014
26
0
4,530
While better efficiency is nice and all, I fear Intel won't do enough for gamers to warrant a CPU upgrade. When overclocked, Haswell doesn't do much above Sandy Bridge, and while Intel may not have the strongest competition from AMD on the high end anymore, if people won't upgrade their CPUs it will hurt them in the longer run.

You made the mistake of assuming Intel cares about the gaming market at all (it doesn't). It's too small compared to their corporate and server chip markets, and it's a small subset of the overall consumer market.

Intel wants to stay ahead of ARM for servers and supercomputing, hence all the upgrades to floating-point performance and the focus on power efficiency. Intel couldn't care less whether gamers upgrade or not; that's not where its big profits come from. Pentium, Celeron, and i3 are the big money makers for corporate users, and i7-E/Xeon are the big money makers for server/supercomputer markets. i5 and i7 are simply toys for a small subset of their overall market targets.
 

patrickjp93

Reputable
Jun 1, 2014
26
0
4,530

The i7-2600k wins most benchmarks by a 10-20% margin and quite a few by a more substantial 30-50% lead. The only benches AMD wins by a significant margin (~15%) are 7zip and 2nd-pass h264.
http://www.anandtech.com/bench/product/697?vs=287

To make the FX a more even match for the stock i7, it needs at least an extra 600MHz.

i did acknowledge that "if the fx-8350 is behind, it isn't behind by much." 10-20% is not much to me. it's not what you described in your earlier post as Intel having an architectural advancement, because if you want to talk architecture, AMD patented x64, and x64 is better than the x86 that Intel uses. correct me if i'm wrong.

Facepalm! x86 is the overall ISA of both Intel and AMD; it's the basic instruction set both carry (16-bit and 32-bit). AMD beat Intel to market with its version of x64, though it has a number of design flaws that will need to be removed at some later date. Both companies create instruction-set extensions on top of the basic x86 package, but AMD CPUs and Intel CPUs are both x86 architecture, named for the 8086 processor from which it evolved.
 

patrickjp93

Reputable
Jun 1, 2014
26
0
4,530
"*reads about people complaining not having a reason to upgrade and spend more money* what am I reading "

As a gamer, I'm tired of games stagnating. The long console cycle deserves a lot of the blame, but stagnation of hardware is just as guilty. Nothing has really changed in the gaming world in the last 7 or so years. Graphics haven't really improved, nor has AI, or anything else. Sure, there have been some tiny improvements, but it's all pretty boring.

If processors were still doubling in performance every 18 months or so, the games of today would make the current stuff look like old-school Nintendo games.

I've been waiting for another spurt of innovation since 2009. Yet my 6-year-old system still plays every new game just fine. So why upgrade? I wish Intel/AMD would give me a reason to.

I wish someone would blow me away again with a hardware advancement.

That has to do with the programmers. Graphics have improved a lot; fire, water, grass, hair, and shadows are not cheap to get right (in the mathematical sense). AI has improved, but it's a PSPACE problem: you need infinite cores/memory to get arbitrarily good AI. It's not as if there's a polynomial-time-scaling AI algorithm that can be used for advanced decision making. If you want to learn more, look up the neural network data structure.
 

patrickjp93

Reputable
Jun 1, 2014
26
0
4,530

If you look at the pattern for the last couple of chips, improvements are around 5-7% regardless of Tick or Tock so I would expect up to 7% IPC improvement from Skylake since that is what we got from IB to Haswell.

That would be a fallacy. Skylake is Intel's Kaveri, going heterogeneous. Software, of course, must catch up and be programmed to take advantage, but the reality is there is a lot of room for improving performance via the GPU cores on Haswell, Broadwell, and Skylake iGPUs. Programmers just haven't caught up. As someone in college who gets to play with this sort of thing, I'm shocked at how people complain about a 5% (overall) IPC improvement, when the reality is that for scientific computing, going from a 5-cycle to a 3-cycle multiply can yield a 25% or greater improvement. If you put the GPU cores into the equation, which will also be based on Intel's industry-leading FPU (yes, it's better than anything Nvidia or AMD can dish out), then it's an 80% improvement if the programmers are up to it.
 

patrickjp93

Reputable
Jun 1, 2014
26
0
4,530

Intel is not exactly doing nothing since they are spending billions on process research and production equipment upgrades each year.

The bigger problem is that there are almost no mainstream uses for any more processing power than the mainstream already has, so Intel focuses its efforts on reducing power while AMD focuses its efforts on catching up or breaking into markets Intel is under-serving.

If a killer, heavily threaded mainstream application came around that made even enthusiast CPUs crawl on their knees, AMD and Intel could easily increase computing power by trading IGP die area for extra CPU cores.

But in the current market dominated by single-threaded programs, adding cores in mainstream CPUs would be a wasted effort.

More importantly, if programs came along that leveraged iGPU cores to their limits. These things are built for floating-point math, and anything that reduces to matrix algebra can be accelerated on iGPUs easily. Other things are more difficult, but it can be done.

Until programs leverage everything out of AVX2, SSE, and the iGPU, the only thing Intel has any motivation to significantly improve is the iGPU for gaming and scientific computing (so that HSA doesn't steal the market Intel built its SIMD instruction sets (MMX, AVX, SSE) for).
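To make the "reduces to matrix algebra" point concrete, here is a minimal sketch in plain Python (toy data and function names of my own, no real iGPU offload) of the same linear-model computation written two ways: a per-item scalar loop, and one batched matrix-vector product, the latter being the shape of work SIMD units and iGPUs stream through efficiently:

```python
def predict_per_item(xs, w):
    # Scalar loop: one multiply-add at a time per input vector.
    out = []
    for x in xs:
        acc = 0.0
        for xi, wi in zip(x, w):
            acc += xi * wi
        out.append(acc)
    return out

def predict_matrix(xs, w):
    # Same computation expressed as a single matrix-vector product X @ w:
    # one big batch of independent multiply-adds a GPU/SIMD unit can stream.
    return [sum(xi * wi for xi, wi in zip(x, w)) for x in xs]

xs = [[1.0, 2.0], [3.0, 4.0]]
w = [0.5, 0.25]
print(predict_per_item(xs, w))  # [1.0, 2.5]
print(predict_matrix(xs, w))    # [1.0, 2.5] -- identical result, batched form
```

The results are identical; the point is that the second form exposes all the multiply-adds at once instead of hiding them inside sequential scalar steps.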
 

InvalidError

Titan
Moderator

Most people do not do scientific computing in their everyday life and shrinking the integer multiply from 5 cycles to 3 cycles makes no difference performance-wise since the operation is pipelined and gives you one result per cycle regardless of how many clock ticks it takes. Unless your scientific application makes a conditional branch or random memory access based on that multiplication all the time, multiply latency should not matter much.

Most scientific algorithms I know of tend to apply multiply-add operations to large arrays and matrices before applying any logic. The multiply latency hit would be something that matters once per thousands if not millions of multiply operations in properly optimized code so the shorter latency would improve performance of those algorithms by less than 0.1%.
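The latency-versus-throughput distinction above can be sketched with a toy pipeline model (the 3-cycle multiply is the hypothetical figure from the discussion, not any particular CPU):

```python
def cycles_independent(n_ops, latency):
    # Independent multiplies can be issued every cycle; the pipeline
    # overlaps them, so total time is the issue time plus one drain latency.
    return n_ops + (latency - 1)

def cycles_dependent(n_ops, latency):
    # Each multiply consumes the previous result, so every operation
    # must wait out the full latency before the next can start.
    return n_ops * latency

LATENCY = 3  # hypothetical 3-cycle pipelined multiply
print(cycles_independent(1000, LATENCY))  # 1002 -> effectively 1 result/cycle
print(cycles_dependent(1000, LATENCY))    # 3000 -> latency fully exposed
```

Only the dependent chain (the "branch or memory access based on that multiplication" case) ever sees the 5-vs-3-cycle difference; the independent stream is throughput-bound either way.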
 

Wamphryi

Distinguished
It's good to see Intel working so hard on their thermal department. Gaming is great, but you can't help feeling guilty about Mother Earth every time you fire up your PC. Meanwhile AMD innovates with a 220W processor XD

It's the manufacturing and disposal requirements of computers that really impact the environment. Home builders make the best use of the components they buy, and when you're gaming, you're not driving, which was the basis of much entertainment pre-PC. Your conscience should not be so burdened.
 

Duckhunt

Honorable
Sep 22, 2012
339
0
10,810
(quoting the exchange above between the frustrated gamer and patrickjp93's reply about programmers and AI)

Again, the big problem is that the software is way behind the hardware, but when the hardware boys won't train the software boys, what do you expect? We need accessories to take off. I read this recently on VR:

Sixense showed off an incredible VR shooting-gallery demo, showing off the abilities of its STEM System. This is what we need on the PC. It would help PC sales. I'd buy it if it were less than $1K.

 

InvalidError

Titan
Moderator

All the training in the world does you no good when tons of common everyday algorithms simply cannot be threaded in anything resembling an efficient way, due to inter-dependencies and context sensitivity. For example: you cannot parse parts of a C file out of order, since the order in which #include, #define, and other directives appear affects the outcome. Those types of algorithms are intrinsically non-threadable. (Yes, you could thread one for fun anyway, but it will never be as fast as the single-threaded version.)
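As a sketch of why that order sensitivity kills threading, here is a toy preprocessor in Python (a made-up "define" directive, not the real C preprocessor): every line is interpreted under the state left by all earlier lines, so the scan is inherently sequential:

```python
def preprocess(lines):
    # Each 'define NAME VALUE' changes the substitution state that every
    # later line is interpreted under, so lines cannot be processed
    # out of order (or in parallel) without changing the result.
    defs = {}
    out = []
    for line in lines:
        parts = line.split()
        if parts[0] == "define":
            defs[parts[1]] = parts[2]
        else:
            out.append(" ".join(defs.get(w, w) for w in parts))
    return out

src = ["define X 1", "print X", "define X 2", "print X"]
print(preprocess(src))  # ['print 1', 'print 2'] -- same name, two meanings
```

Splitting `src` in half and handing each half to a thread would give the second thread the wrong (empty) definition state, which is exactly the inter-dependency problem described above.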
 

wingofword

Reputable
Aug 19, 2014
2
0
4,510
I can't believe Tom's Hardware doesn't know how to do percentages, but here we go:

"In fact, the Broadwell-Y die has about 63% less area than the Haswell-Y die." (page 1)

"The Broadwell-Y chip is 82mm2, scaled down about 63% compared to Haswell-Y's 130mm2 die size." (page 2)

No. It's scaled down 37%. It has 37% less area. So, the new chip is 63% of the original size.


Here are some corrections that I think need to be made. Just doing my part as a community member. :)

The Broadwell-Y chip is 82mm2, scaled down about 63% compared to Haswell-Y's 130mm2 die size.
I think it should say "...scaled down to about 63% of Haswell-Y's 130mm2 die size." The reason being that the original statement seems to imply that Broadwell-Y shrunk by 63% which is a significantly larger shrink as opposed to the about 37% it really shrunk by.
(old - new) / old * 100% = shrink %, e.g. (130 - 82) / 130 * 100% ≈ 37%
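The correction's arithmetic, as a quick Python sketch (function names are mine, just for illustration):

```python
def shrink_percent(old_mm2, new_mm2):
    # Fraction of the old die area that was removed, as a percentage.
    return (old_mm2 - new_mm2) / old_mm2 * 100

def remaining_percent(old_mm2, new_mm2):
    # Fraction of the old area the new die still occupies.
    return new_mm2 / old_mm2 * 100

print(round(shrink_percent(130, 82), 1))     # 36.9 -> "about 37% less area"
print(round(remaining_percent(130, 82), 1))  # 63.1 -> "63% of the original size"
```

The two figures always sum to 100%, which is exactly the confusion in the article: 63% is what remains, 37% is what was cut.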

...Haswell-Y's integrated graphics has a maximum of 20 AUs...
I could be mistaken or behind the times, but shouldn't that be "Execution Units" or "EUs"?

I wanted to check the article's number with a 2-dimensional calculation, assuming the dies are squares:

(sqrt(130) - sqrt(82))^2 + 2 x ((sqrt(130) - sqrt(82)) x sqrt(82))
= (11.4018 - 9.0554)^2 + 2 x (2.3464 x 9.0554)
= 48.0008mm2 lost

(That long expression is just the identity (a - b)^2 + 2(a - b)b = a^2 - b^2, i.e. 130 - 82 = 48.)

48.0008 / 130 = 0.3692, so about 37% of the surface area is lost going from 130mm2 to 82mm2, and the remaining die is 82 / 130 = 63% of the original. So the correction above stands: the article's 63% describes what's left, not what was removed.

I think my math is right (it's been a while since I did geometry).
Please correct me if I am wrong.
 