Intel's 10nm Cannonlake Chips Won't Arrive Until Second Half Of 2017

Status
Not open for further replies.

Xivilain

Reputable
IBM will more than likely sell the intellectual property rather than produce chips again.

Intel is rolling in the dough while they're ahead, with no competition in some markets.

As an Intel and Nvidia consumer... I want AMD to make a HUGE comeback.
 

jimmysmitty

Champion
Moderator
The thing is that Intel does a lot of its R&D in-house. It does utilize outside technologies, but it evaluates them for viability on its own process. Intel has known for quite some time that going beyond 10nm will require a new material, as silicon is being stretched about as far as it can go.

IBM, however, is working in a consortium to move to the next process node, because without one none of the other companies would have the R&D resources to keep up with Intel, let alone beat them to a node.

That said, what matters is the viability of the 7nm process node with a functioning CPU; right now it is just test transistors.

There are also rumors that even SiGe will need to be replaced beyond 7nm.
 

digitaldoc

Judicious
Ambassador
With not much competition, they are delaying the inevitable. Will there be anything smaller than 7nm? At some point the die shrinks will need to stop. That can be followed by adding cores, but core counts will max out too, at the point where moving data around to get it processed costs more than the extra cores gain.
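The diminishing returns from piling on cores can be made concrete with Amdahl's law. A minimal sketch; the 10% serial fraction below is purely an illustrative assumption, not a measured figure:

```python
# Amdahl's law: speedup from N cores when some fraction of the
# work is inherently serial (data movement, coordination, etc.).
def amdahl_speedup(cores, serial_fraction):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Illustrative assumption: 10% of the workload cannot be parallelized.
for cores in (2, 4, 8, 16, 64, 1024):
    print(f"{cores:5d} cores -> {amdahl_speedup(cores, 0.10):.2f}x speedup")
```

With even a 10% serial fraction, the speedup can never exceed 10x no matter how many cores are added, which is the wall the post is describing.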
 

jimmysmitty

Champion
Moderator


Well, they could come across some miracle technology that gets them to picometer scale. People probably thought nanometer scale was near impossible back in the old days.

That said, the best solution would probably be stacking, but that brings its own issues with heat dissipation.
 

aldaia

Distinguished
It's all about money; even Moore's law is not about technology but about economics. When Intel was flush with money from the PC industry, it managed to pull ahead of its competitors. AMD reached a maximum CAPEX of $3 billion, couldn't follow Intel beyond that, and in the end was forced to spin off its foundry.

In recent years, however, Intel's CAPEX peaked at about $11 billion and then started to decline. Meanwhile, TSMC and Samsung have been flooded with money from the mobile industry (Apple in particular has very deep pockets). Samsung Electronics is expected to reach an all-time-high CAPEX of US$15 billion in 2015, TSMC $10.8 billion, while Intel will spend a little above $8 billion. Just follow the money.
 
Die size:
At some point the effort becomes so great that even where a shrink is still possible, it is cost-prohibitive. We'll start to see a shift to using different materials.

Performance can't keep ramping up. Once die shrinks reach their limit, we're left with only architectural tweaks and raising frequency by using different materials.

I didn't include increasing the number of cores, because that stops being possible once die shrinking stops.

More efficient use of the hardware (software) or offloading to the cloud will go a long way, but in general we'll see diminishing returns on performance.

On the other hand, we can also just add a few more CPUs or GPUs to a system as they get cheaper and software supports them better.

I think the desktop PC will end up being "good enough", with net connectivity taking up the slack.
 


Aspiring techie

Reputable
It looks like the era of process node shrinks is coming to an end; they can't go on forever. 1nm is roughly the width of 10 carbon atoms, so the hard limit appears to be coming within the next decade or two. Still, we need to pursue the new technologies and enjoy the improvements while they are still coming.
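The "1nm is about ten atoms" figure can be sanity-checked with a back-of-envelope calculation. The bond lengths below are approximate textbook values, and note that modern node names no longer correspond to any single physical feature size:

```python
# Back-of-envelope: how many atoms span a 1 nm feature?
# Bond lengths are approximate published values (assumptions).
BOND_NM = {
    "graphene C-C": 0.142,   # nm, carbon-carbon bond in graphene
    "silicon Si-Si": 0.235,  # nm, silicon-silicon bond in the crystal
}

for material, bond in BOND_NM.items():
    print(f"1 nm ~ {1.0 / bond:.0f} {material} bond lengths")
```

Either way the answer is single digits, which is why sub-1nm features run into the atomic limit the thread is debating.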
 

Bartendalot

Distinguished
Sorry for the quote without anything added to it.

I meant to talk about the possibility of more cores giving Intel plenty of headroom to play with. We are already at 18 cores with the Xeon at 14nm. If they sell a 10-core consumer Skylake, it will be perceived as a better chip (Kaby Lake?). Assuming they can stretch the core count as we get to 10nm, I wouldn't be surprised if that is how they keep selling 14nm and then 10nm chips for several years.
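The core-count headroom from a shrink can be roughly estimated from area scaling alone. This is an idealized sketch assuming core area scales with the square of the node name, which real process nodes only approximate:

```python
# Idealized die-area scaling: core area ~ (node length)^2, so the
# same die area could hold cores_old * (old/new)^2 cores after a shrink.
def cores_after_shrink(cores_old, node_old_nm, node_new_nm):
    return int(cores_old * (node_old_nm / node_new_nm) ** 2)

# Example: an 18-core 14nm Xeon, ideally scaled down to 10nm.
print(cores_after_shrink(18, 14, 10))
```

Under this idealization a 14nm-to-10nm shrink would roughly double the cores per die, which is the kind of headroom the post is speculating about.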

I'm not convinced that 7nm will be functional by 2020. There are physical limitations that we are on the threshold of.

What we as consumers should start thinking about is perceived improvement vs. actual value. How much better is Cannonlake going to be than Sandy Bridge? Another way to look at it: if you have a car that can top out at 200 MPH, how much value is there in a car that can hit 210 MPH? At some point you aren't going to notice a difference unless you go looking for it with benchmarks.

For gaming, I'm sticking with my Nehalem x5650 until my mobo dies or there is a compelling reason to upgrade. There isn't anything Skylake is offering that makes an upgrade attractive.



 

Bartendalot

Distinguished


People talk about needing a technological breakthrough like quantum computing or utilizing carbon nanotubes as though it is inevitable.

There is a good possibility that we will be stuck right about where we are (and have been for the last five or so years) for quite some time.
 

none12345

Distinguished
Beyond 7nm, I would say, is in serious doubt at this point. Certainly not impossible, but they are going to need to bridge some pretty big canyons to get there.

Couple that with the fact that the last few process shrinks have brought very little performance (efficiency, sure, but not really much more performance), and we are pretty close to a wall for computing, short of a drastically new technology anyway. The industry has been shrinking silicon for 40 years, and that's about at its end.
 

Aspiring techie

Reputable

First of all, carbon nanotubes won't get us smaller than 1nm. Graphene is basically a layer of carbon one atom thick, while nanotubes are basically graphene rolled up into a tube shape. 1nm is the smallest we will get with conventional computing.

Quantum computers look like they have potential, just not for a long time. A fully functional quantum computer hasn't even been built yet. Even if the barriers of physics are worked around, the first quantum computers will probably cost millions and take up a whole room (just like computers from 75 years ago). If, and that's a big if, quantum computers are made affordable to the masses, it will take years (probably over a decade) for them to become more popular than conventional computers. Plus, maintenance and repair will most likely be so drastically different that a whole new set of skills (probably more advanced than today's) will have to be learned, putting us out of business, so to speak. However, this is (hopefully) unlikely and decades down the road.
 

InvalidError

Titan
Moderator

Nanometer scale is still dozens of atoms wide. Below 1nm you would be down to atomic-scale wires, where it becomes nearly impossible to guarantee electrical continuity along the traces, and electrical resistance becomes a major issue, along with leakage, electrical and magnetic coupling, etc.

There might be a breakthrough, but I'm not expecting to see it in a consumer-friendly format and price range in my lifetime.
 

turkey3_scratch

Polypheme
Ambassador


You have to remember that the more cores there are, the more each core has to be packed into a smaller space. So, theoretically, these smaller nodes already exist in a sense, if you consider each core to be its own processing unit.
 

Buwan

Reputable
Does this mean sockets will have a longer life, or is Intel still determined to follow the path of planned obsolescence?
 
Node shrinks are a nice technological advancement but they are not a panacea for performance, power, thermals and efficiency.

Intel's time to retail on node shrinks has been increasing: the 32nm > 22nm transition went months beyond two years.

I also suspect future node shrinks will be more 'hybrid' in nature -- combining elements of the smaller node with logic/design from its older brethren.

 

jimmysmitty

Champion
Moderator




I agree. We probably won't see anything.



This has nothing to do with Intel vs. AMD. This has to do with the fact that smaller nodes are harder to get to because of the limitations of the machinery and materials. Even if AMD were competitive, it wouldn't help Intel get to a more complex node any faster.



The sockets have nothing to do with the node size. The sockets change because Intel moves features from the chipset into the CPU, or adds new features to the CPU/chipset, and the proper way to move forward is to create a new socket designed for those features instead of trying to reuse the same socket indefinitely.

For example, an AMD 760G chipset can support a Bulldozer chip but cannot take advantage of all the features in the chip, so you are essentially losing potential performance to an older chipset.
 

Buwan

Reputable
The sockets have nothing to do with the node size.
I was not so much referring to the node size as to the six-month increase in cadence. If you are no longer following Tick-Tock, does that mean there will be an increase in the lifespan of motherboard and CPU compatibility?

Is it really necessary to change the socket when all you are receiving is a 10% performance increase anyway, and the feature benefits are of even less value? I am interested in how Intel expects to add value given the difficulty and diminishing returns of node shrinks.

It is obvious that Intel is in urgent need of a new business plan. Maybe they will start concentrating more on integrated GPUs, but that of course will be decided by the market and not by Intel.
 

jimmysmitty

Champion
Moderator


Look at it this way: AMD held onto the AM2/AM2+ socket for so long that its performance increases were very small. Also look at the fact that AMD has yet to even add PCIe 3.0.

Sure, they seem trivial now, but would you rather have a setup that can last five years, or one that lasts two years before you need a new CPU and probably a new board?

An i5 2500K is still a good CPU for a gaming system, but a Phenom II is not. I would rather have a CPU that is almost five years old and still viable than one that is almost five years old and is not.

And why does Intel need a new business plan? They are still making tons of money and outselling AMD in every market.
 

Justkeeplookin

Reputable
There is probably going to be a limit on how far they can go. Beyond 10nm, the use of a different material is probable. I am just going to stick with Haswell for the next 2-4 years and see if upgrading is worth it. I already don't think Broadwell is worth it because of the drastic changes I would have to make. Maybe integrated GPUs will be all that is left to work on after the limits have been reached.
 