Intel Broadwell CPUs to Arrive Later This Year

Status
Not open for further replies.
Ahhh, 14nm. I know it's just Moore's Law and all that, but after fifteen or so years of building computers, the shrink still blows my mind a bit. The earliest I remember is working with a 350nm Pentium II. I'll be excited to see what the next 15 years have to offer once we've shrunk beyond the limits of usability.
 
I'm still rocking a Sandy Bridge i7, I haven't seen any reason to upgrade yet. I'm hoping this will be enough of a reason too. But with PCI-E 3.0 and DDR4 coming maybe... :)
 
If you can wait, wait for Skylake (the successor to Broadwell), which will include DDR4, PCI Express 4, Thunderbolt 3, and octa-core processors.
 
Has there been much stated about how much of an improvement the IGP on the chip is over the HD 4600? If it's significant, it could be very nice for HTPC/Steam boxes.
 
I think Z87 was the last good platform we'll see for a little while. Z97 is a very marginal update; the new DDR4 memory interface isn't fully matured yet, and we really need a solution to the storage revolution that's occurring. Storage options are very cumbersome with Z97, and SSDs are going to saturate what the interfaces are capable of rather quickly. There are just too many devices requiring too much bandwidth all of a sudden. Z97 is a platform featuring many new technologies in their infancy, whereas Z87 was a fully matured platform. So if you're looking to build a new mid-range or high-end computer, you may want to hold off another year or two.
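To put rough numbers on the interface-saturation point, here's a back-of-envelope sketch. The 6Gb/s SATA rate and 8b/10b encoding overhead are the standard published figures; the PCIe 2.0 x2 link width for Z97's M.2/SATA Express slot is the commonly cited spec, not anything stated in this thread:

```python
# Rough arithmetic on why SATA 6Gb/s is a ceiling for fast SSDs.
# 8b/10b line encoding spends 10 signalling bits per data byte,
# so 6 Gb/s of raw signalling carries at most 600 MB/s of data
# (~550 MB/s in practice after protocol overhead).
sata3_raw_gbps = 6.0
sata3_max_mb_s = sata3_raw_gbps * 1000 / 10  # 10 bits per byte
print(sata3_max_mb_s)  # -> 600.0 MB/s theoretical

# A PCIe 2.0 x2 M.2/SATA Express link (as on Z97 boards) offers
# roughly 500 MB/s per lane after the same 8b/10b overhead.
pcie2_x2_mb_s = 2 * 500
print(pcie2_x2_mb_s)  # -> 1000 MB/s
```

Even that ~1GB/s ceiling is only about double SATA, which is why fast SSDs were expected to outgrow the Z97-era interfaces quickly.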
 
I think Z87 was the last good platform we'll see for a little while. Z97 is a very marginal update; the new DDR4 memory interface isn't fully matured yet, and we really need a solution to the storage revolution that's occurring. Storage options are very cumbersome with Z97, and SSDs are going to saturate what the interfaces are capable of rather quickly. There are just too many devices requiring too much bandwidth all of a sudden. Z97 is a platform featuring many new technologies in their infancy, whereas Z87 was a fully matured platform. So if you're looking to build a new mid-range or high-end computer, you may want to hold off another year or two.
Yes, agreed. I upgraded to Z87/Haswell around last Black Friday; I'm not going to need to upgrade for a while.
 
Yes, agreed. I upgraded to Z87/Haswell around last Black Friday; I'm not going to need to upgrade for a while.

Same here. I picked up a 4770k for $200 from my local shop. Incredible deal. And yeah, I'm not going to be needing a whole new build for at least three or four years. Maybe an upgrade here and there, a new graphics card if I decide to go 4K. But other than that, I'm really set for a long time.
 

IIRC, Broadwell is supposed to bring GT3/GT3e (HD5xxx) availability across most of the lineup, which should make its IGP about twice as fast as HD4xxx parts.

For HTPC, even a 6+ years old Core2Duo can handle multiple HD/h264 streams in full-software decode so Broadwell would be a "little" overkill for that.

For a steambox or other lightweight/low-power gaming/3D applications, GT3/3e becoming the baseline IGP would help a fair bit but this won't be happening across the board until Skylake unless Intel changes their plans.
 


Yeah, I am looking at building a mini PC without a dedicated card using the Antec ISK 110. I've been trying to decide whether I should wait for the AMD A8-7600 or Intel Broadwell, or get a current Haswell with HD 4600 if Broadwell isn't much of an improvement in power/graphics. It will be for HTPC, emulation, and some Steam gaming, as it will be used strictly on my TV.
 
As long as they keep successfully using die shrinks to either drive costs and power consumption down or increase speeds without increasing cost, it's a win. I'm not going to jump on the naysayer bandwagon because a new chip is "only" 15% faster at the same price. Meanwhile, if and when the graphics guys ever manage to transition to a new node they're going to be charging you /more/ money for less performance because they know they've got to stretch the node for three to seven more years.
 
Hmm... I am going back to school this year and was going to get a new laptop because I thought Broadwell was not going to hit until next year. But I can probably live with my 5-year-old netbook a few months into the school year for the sake of a better machine. Broadwell may not offer much for the desktop, but it looks like it is going to be a big deal for horsepower and lower TDPs in laptops... plus Intel graphics make pretty big strides forward with each generation. I would be lying if I said I was not going to load up a game or two on my school laptop, but I don't exactly want to pay for a laptop with a dedicated GPU either.
 

The other foundries (UMC, TSMC, GF, etc.) are almost three years behind Intel process-wise and for AMD/Nvidia/etc.'s sakes, they probably cannot afford falling much further behind than that - matching Intel on performance/watt is going to become extremely difficult if foundries slip a whole two process nodes (4-5 years) behind Intel and Samsung.

Since Samsung and GF decided to start doing "copy-smart" to help ramp up 14nm last month, there is a chance GF might move up to only being a year behind Intel instead of slipping further behind.
 
Ahhh, 14nm. I know it's just Moore's Law and all that, but after fifteen or so years of building computers, the shrink still blows my mind a bit. The earliest I remember is working with a 350nm Pentium II. I'll be excited to see what the next 15 years have to offer once we've shrunk beyond the limits of usability.
Same here; I remember helping my dad build the ol' Pentium II system... and then attempting to do video editing on it for school projects, which was painful. My first personal build was a Coppermine Pentium III, which had the 180nm die shrink, and I remember being amazed at how small it was in comparison.
I think they have another 2-3 die shrinks before they hit the wall, and then we are going to see major changes in the materials used to squeeze out another 2-3 die shrinks before they have to start implementing new instruction sets and architectures to get further efficiencies. It is going to be pretty cool to see, but once we start making major changes to architectures and instructions, we are going to have to say goodbye to the legacy applications that have built up over the last 20 years, and that will be a little sad to see.
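The "2-3 more shrinks" estimate can be sketched with the classic scaling rule of thumb: each full node is roughly a 0.7x linear shrink (2x transistor density). The 5nm floor and the assumption that scaling stays ideal are guesses for illustration, not a roadmap:

```python
# Count remaining full-node shrinks from 14nm, assuming the classic
# ~0.7x linear scaling per generation and a 5nm practical floor.
# Both assumptions are illustrative, not an official roadmap.
node = 14.0
nodes = [node]
while node * 0.7 >= 5.0:
    node = round(node * 0.7, 1)
    nodes.append(node)
print(nodes)  # -> [14.0, 9.8, 6.9], i.e. roughly 14 -> 10 -> 7nm
```

Which lands in the same ballpark as the "another 2-3 shrinks before the wall" guess above.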
 

Intel has already tried going with a "more efficient" instruction set on Itanium, with tons of predication and other neat stuff that was supposed to enhance performance and scalability, yet it failed to scale beyond x86's performance.

ARM, Power, SPARC and other ISAs are also failing to outclass x86 on raw performance and power efficiency in many situations. As kludgy as x86 might be, Intel has managed to bring it on par with the best of anything else available today, with things like the µop cache all but eliminating complex-instruction decode penalties (who would have thought, only a few years ago, that ~2GHz dual-core x86 CPUs could be squeezed into 2-3W power budgets?). So it seems unlikely the industry will give it up any time soon: too much hassle for little to no gain.

Intel's biggest challenge/shortcoming for SoCs is the IGP. Bump that up a notch or two and Intel would have serious contenders across the board.
 
In 15 years we'll most likely be using quantum computers. Intel has already stated that they won't be able to surpass a 5nm process. That's the absolute minimum according to them.
 
If you can wait, wait for Skylake (the successor to Broadwell), which will include DDR4, PCI Express 4, Thunderbolt 3, and octa-core processors.

Yeah, you're talking about the enthusiast chips. Probably not until 2016. Some of us who haven't upgraded our mobo/CPU/RAM since 2007 can't wait that long. I'm going for Haswell-E. We barely need PCIe 3.0 now anyway, and how long has it been around? Waiting another 2-3 years while software is written to demand that much GPU won't kill anyone. I don't think we've even come close to hitting a ceiling with PCIe 3.0, have we?
 


How can you say TSMC is 3 years behind when they will ship A8's shortly for the iPhone 6 at 20nm? If my 20nm is out before your 14nm, at worst I'm ~2 years behind, and they have 14nm on tap for volume in Q1 2016 (though I'd say Q2). If Intel is coming this Oct/Nov with devices (they said they'd miss back-to-school) and TSMC is looking at somewhere in 1H 2016 for 16nm, again that's under 2 years.
http://www.eetimes.com/document.asp?doc_id=1319679&page_number=8
20nm SoCs in 3Q14 for phones and tablets.

http://www.digitimes.com/news/a20140311PD203.html
Either they are completely lying, or 20% of their revenue will be 20nm in Q4 this year. They are already ramping and ahead of schedule (yields fixed).

Samsung's A8 is coming a little later, so I'm not seeing your point. Don't get me wrong: I think Samsung wins in the end if financials don't change for Intel/TSMC, since neither can keep up with the $30B Samsung makes, but TSMC appears to be in front on 20nm. You can't win as Intel or TSMC when Samsung is spending $22B while Intel spends $11B and TSMC spent $9.7B (upping it this year to $11-12B IIRC, so a tie ballgame for TSMC). Unless Intel figures out how to stop Samsung from selling so many phones/tablets, they are most likely screwed in 5 years. If Samsung continues, in 5 years they will have spent $100B on fabs to Intel's $50B. Intel's fabs are dead if they don't change the game here in some way that matters.

http://www.dailytech.com/TSMC+Were+Far+Superior+to+Intel+and+Samsung+as+a+Partner+Fab/article34148.htm
"TSMC is starting its first 20 nm mass production this quarter, which will put it ahead of Intel -- if only briefly."
"So arguably TSMC is about a year behind Intel in process, at present, and Samsung is a year behind TSMC. Globalfoundries, a fourth major player, is thought to be a little behind Samsung."

So you say 3 years; DailyTech, EE Times, Digitimes, etc. think they are NOWHERE near 3 years behind, and that Samsung is behind TSMC. Everybody seems to agree but you. What is it you know that they do not? Is Intel selling 40 million phones? Millions of top tablets behind our backs? NO. There is no process lead here. They are equal, or you'd be winning something. I hate to agree with J. Mick, but this time he's not crazy 😉

You have too much faith in (love for?) Intel. Certainly, as an AMERICAN fanboy (not an Intel fanboy; more an old-time AMD fanboy here, though management has been killing them for a decade), I'd rather see an American company squash Samsung (and TSMC), but the spending facts don't lie, so there's no point in ignoring the reality here. IBM might look like they stepped away from the gang, but the R&D needed for most of their part is done for 20/14nm, and they are still collaborating below that, probably until IBM dumps it all. They haven't been fabbing tons anyway; IBM does the R&D, then passes it to the other two to flesh out (they don't fab much themselves). Of course, as an American I'd also rather see Samsung kick the crap out of TSMC than see TSMC reach the top. I don't see how TSMC wins, as they totally depend on the fab business, whereas Samsung has other devices to sell (phones, tablets, glass, memory, SSDs, TVs, etc.). This will be a war like Google/Amazon driving down phone prices (or killing an OS/DirectX in the Google vs. MS war), because they have ways to bleed you to death on the hardware while OTHER stuff pays the bills (books, movies, etc. on Amazon; ads, etc. for Google).

The most important point to me about Intel? For all the love of their process tech from people like you (and me, years ago), what has it gotten them? IF they are so far ahead, why are they getting squat in phones and tablets? They are having to PAY to get into devices. They are promising to make up the cost difference between an ARM chip and Bay Trail etc. if you go their way (funny, I thought that was anti-competitive). They are essentially making nothing to get into a device (selling at ARM pricing instead of INTEL pricing). How good is a process if you are NOT the leader because of it? It's 28nm vs. 22nm and you gained nothing even with FinFET (and when the others get it, what then? Even that little help is gone). It's going to be 20nm vs. 14nm soon, and again you'll gain nothing from it. I predict we will see the same things on the new processes from both sides. Intel will again have to buy their way into stuff unless 20nm magically fails for Samsung/TSMC shortly (TSMC is already ramping for the iPhone 6 with far better yields now, so no magic will stop this).

I'm not alone thinking the above:
http://www.eetimes.com/document.asp?doc_id=1322263
JP Morgan - QUIT MOBILE
"We continue to believe Intel will lose money and not gain material EPS from tablets or smartphones"
Proof they are right so far:
"The mobile and communications group saw a $3.1 billion operating loss in 2013, with 1Q 2014 losses hitting $929 million and revenues at $156 million."

So $3.1B last year, and based on 1Q14 it looks like they're ramping to a $4B loss this year, right?
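The "$4B this year" figure is just the quoted Q1 number annualized; here is the arithmetic, using only the figures cited above:

```python
# Back-of-envelope annualization of Intel's mobile operating losses,
# from the figures quoted above (2013 full year and 1Q 2014).
loss_2013 = 3.1e9    # reported 2013 operating loss
loss_1q14 = 929e6    # reported 1Q 2014 operating loss
run_rate_2014 = loss_1q14 * 4
print(f"${run_rate_2014 / 1e9:.1f}B")  # -> $3.7B, i.e. roughly a $4B pace
```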

Intel's dumb comments in response:
"We feel that we have a plan"
"We’re actually feeling pretty good"
ROFL. Sounds confident. I'd prefer "we will dominate because of X, and this is how and why they will suck compared to us," something like that. They are a gorilla trying to thump its chest without arms (pun intended)... LOL.
and worse:

“Keep in mind we are also manufacturing these chips now at 22 nm, and we are in the process of starting up our 14 nm process.”

Umm...OK, and everyone else is doing this (to use Intel's own words):
“Keep in mind we are also manufacturing these chips now at 28 nm, and we are in the process of starting up our 20 nm process.” and will beat Intel to our new process...

See how that works? You're getting nowhere. Time to buy NV so you can get into the ARM game for real. Producing their chips on Intel's process WILL make a difference: better mobile design + your process = WIN. The definition of insanity is doing the same thing over and over and expecting different results, right? :)

Buy NV for a REAL game changer. Based on the $4B they will lose this year, after 5-6 years of that you could have bought NV today and driven TSMC's and Samsung's fabs into a painful existence. IF they keep this up until 7nm etc., they gain nothing. Not to mention they could fill their fabs with 550mm² GPUs instead of delaying upgrades to 14nm fabs. The game changer here is buying NV and producing their stuff at Intel fabs. Intel is good enough to look like they're in the game, but not good enough to take ARM out without BEING ARM (Tegra K1 etc.). Pay Jen-Hsun $3B to either walk away or run the SoC/GPU departments (CTO or some decent title) and buy them for another $22B. At these losses it's basically FREE within 6 years, and the damage you could do for the next 10 years would destroy the fab competition, as all the others would REALLY be 2+ years behind forever, instead of the parity we have now. Only then it would be ARM trying to win via the definition of insanity 😉

Intel would have the lowest-power GPUs with the best performance for ages, the best perf-per-watt SoCs, the best CPUs, and TWO modem solutions (software and hardware versions), and doing so gives them a reason to upgrade fabs to 14nm instead of delaying them because they can't keep them full. Instead Intel seems to think you can just throw more money at it, like the government... LOL.

I'd say buy NV, or AMD if AMD had a few SoCs out already, but they just don't, so you have to go with the #1 GPU maker (GPUs being Intel's weak link forever) and a proven SoC history, now with a desktop GPU architecture inside. Anything less than this is a failure that screws your company and shareholders out of $3-4B a year. Once 64-bit models on ARM's side hit the desktops, I'd bet money someone on ARM's side will decide to "vertically integrate" more and put out a DISCRETE GPU to cut AMD/NV out of their 500W ARM PCs. That is a no-brainer. They won't want to support their mortal enemies' bottom lines with GPU sales whose profits will ultimately be used against them (AMD isn't an enemy yet, but will be the second their mobile SoCs hit). INTEL is NOT ahead. A $1B Q1 loss purely on mobile doesn't lie.
 
Technically, Intel's 14nm started production last year but ran into show-stopper complications and the schedule ended up slipping by over half a year.

BTW, the gate width in Intel's 22nm tri-gate/FinFET process is 8nm... so Intel has technically been shipping sub-10nm chips for nearly three years already.
 


ROFL... Thanks for making me feel better. I thought I was alone with a 2007 CPU. Mine's from late 2007, but it easily hits 3.6GHz when desired, so I'm barely surviving here. I used to upgrade the CPU or GPU yearly (one each year, usually just rotating the purchase), but today I replace board/mem/CPU once per cycle and only buy GPUs every 2-3 years, and I skipped an extra gen this time trying to get to 20nm GPUs. That likely wouldn't have happened if I was gaming a lot, but I've had IT crap to do, so not enough time to game to justify the purchase. Broadwell + Maxwell + Shield 2 (maybe 3... LOL) + a 13in+ 1080p tablet with K1 or M1 (basically for training vids in bed/on the couch, or gaming only) + a 1600p G-Sync 27-30in monitor. Come on people, get some crap out I REALLY want to buy :) I'm tired of waiting for awesome upgrades. 🙁

I'm tired of buying 3TB HDs every other month to make me a little happy... LOL. Up to six now and still have space problems... :) Where is my 5TB-6TB helium drive? 🙁 Argh!
 
BTW, the gate width in Intel's 22nm tri-gate/FinFET process is 8nm... so Intel has technically been shipping sub-10nm chips for nearly three years already.

That feature is not gate width. 8nm is the width of one fin at mid-height. Since fins have a triangular cross-section, that means they are wider at the base. As far as I know, a single transistor is implemented using several of those fins.

Once upon a time, the node size was defined as half the metal pitch (first-level metal); what defines a new node is increasingly unclear.
Intel 22nm metal pitch: 64nm
TSMC 28nm metal pitch: 64nm
According to the "traditional" definition, both are 32nm processes. However, metal has not been scaling well for the last 2 or 3 node shrinks. That effectively means Moore's Law scaling has been broken since around 2010: each new node announced delivers less than the expected 2x transistor density, at least if you want those transistors interconnected 🙂
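The half-pitch arithmetic above is simple enough to spell out. The 64nm M1 pitch figures are the ones quoted in the post; the "node = half the first-level metal pitch" rule is the traditional definition being discussed:

```python
# Traditional node naming: node = half the first-level metal (M1) pitch.
# Using the M1 pitches quoted above, Intel "22nm" and TSMC "28nm"
# both come out as 32nm-class processes under that definition.
metal_pitch_nm = {"Intel 22nm": 64, "TSMC 28nm": 64}
for name, pitch in metal_pitch_nm.items():
    print(f"{name}: {pitch // 2}nm half-pitch")
# Ideal scaling would shrink pitch ~0.7x per node (2x density);
# a flat metal pitch is exactly why density gains fall short.
```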
 


BTW, that has nothing to do with the data in my previous post 😉 You're still touting their process while not understanding that they are getting killed in mobile. x86 just can't take out ARM, so Intel needs to buy NV: they'd instantly get ARM chips to make on their own process, not to mention GPUs that take up 5-6x a SoC's die space, filling a 14nm fab between the two, which would get them back to growth instead of losses in mobile.

Regardless, my points still stand. They still can't get into a phone or tablet without bribing someone. The data in my post doesn't lie. A ten-year financial summary shows Intel peaked in 2011 ($12.94B profit, but only $9.62B TTM for 2013, down from $11B in 2012, sliding each year) and the party has been over since. Intel could be putting out 1nm for all I care; if they were losing $3.1B a year on it (moving to $4B this year), I'd say their 1nm is getting its ARSE handed to it by TSMC's 28nm.

Intel is behind if it lost $3.1B last year and is on schedule to lose another $4B trying to bribe others to use Bay Trail etc. Let me know when Intel starts making money instead of losing billions in mobile. The day you can do that, my post has been refuted :) Gate size etc. means nothing. Making money means everything, and it appears Intel is having problems making MORE of it (stuck treading water). The others are piling up billions in mobile profits in their fabs, while Intel keeps losing it. That's called losing, right? I hope they are in talks to pay Jen-Hsun whatever he wants so they can get busy making stuff that can take down Samsung/TSMC/GF/Qcom/ARM. Well, pretty much just Qcom... LOL, ARM makes almost nothing (~$650M), which I really couldn't believe when I investigated the stock and decided to pass.

You don't see Samsung saying they are halting the build of a fab. TSMC isn't saying it either; rather, they are booked all the way to the end of 2014. Meanwhile, Intel has a fab sitting empty here in the state of AZ because it would just lose money if they opened it. Fabs idle at Intel, profits down, 21% of notebooks lost to Chromebooks; next stop the desktop, and then they'll be talking about the fabs that ARE open losing money as ARM cuts into the number of chips Intel needs to produce. It will only get worse unless they change the dynamics of the situation. That means fabbing ARM, and the only way to do that is to make ARM chips themselves or buy NV (they can't buy Samsung, Apple or Qcom). If they attempt it themselves, it would take too long, and Samsung/Apple will be making $40B and $50B by then, not to mention Intel would be racking up mobile losses until the chip came out. And they can't really halt the chase without looking like losers while making the chip.

The fastest route to victory is an Nvidia purchase. Then again, Jen-Hsun may have already done the math and has no need to be bought, figuring he'll win in the end as he assaults Intel's desktops and servers with Denver/Boulder (or whatever comes after them) and attacks Wintel gaming via Android/Linux, etc. How many fewer chips will Intel need when ARM takes 20% of desktops by next Christmas? Will Intel be losing $5-6B on mobile in 2015? It just takes longer for him to win the ARM war without Intel's fabs. With Intel, the GPUs would take over ARM/Android even faster, but Jen probably wants to kill Intel more than he wants to speed up the ARM war. I doubt he thinks he'll lose a GPU/gaming war with Qcom or Intel, and Samsung has no GPU of its own yet. AMD has no modem or mobile SoC for ARM yet, so no assault on phones will happen any time soon; NV would really have to screw up their GPUs on their own to be stopped.

When CUDA starts getting used more in games, this will get even worse. Unlike Mantle, NV has 7 years of CUDA out there with billions invested in it. Nobody else in SoCs has a CUDA-like ecosystem, and it is now being aimed at games, not just pro apps (in NASCAR '14 this year). The only other SoC vendor that has done ANYTHING in gaming is MediaTek, and they have ONE title, Modern Combat 5; NV will have that soon enough too, so all players are miles behind NV in games.

http://www.geforce.com/landing-page/nvidia-shield-legendary-games-giveaway
Gabe Newell is now signing game packs for NV Shield. The rest of the field had better up their game (pun intended) or get ready to be run over by NV's gaming prowess. I can't wait for NV's idea of an ARM console box at 150W, or a 500W PC box. I'd take a $350 ARM/OpenGL console over a DirectX Xbone any day. I will not support Sony, as I'd rather buy USA-made as much as possible (our economy sucks today and needs all the help we can give it... LOL). So I'm stuck waiting on an ARM console to complement my PC gaming. To sell like crazy, though, it needs to be upgradable (SoC + GPU + HD; 2 out of 3 wouldn't be bad).
 