Haven't they been saying this kind of stuff for years? Like how we would never get past 32nm.

Now, will chip makers eventually hit a barrier? Yes they will, although it'll probably be at a smaller node, and it will probably be quite a while before that happens.
 
Basically this has always been the case: they predict the wall, but then someone somewhere figures out a new way or a new tool to produce these chips economically. Progress might happen a little slower as they get smaller, but technology will always advance.
 

Kewlx25

Distinguished


Actually, I've been reading these kinds of things since the Pentium era and 350nm.

They're already looking at replacing silicon with another semiconductor. Some graphene-based stuff, and all of the companies are looking at it. They claim the current 32nm process on graphene could create a chip running between 10GHz and 100GHz.

I think we're still good. And if nano-tubes ever get figured out, chips will be even faster than that.
 

paranoidmage

Distinguished
Feb 9, 2008


Light and lasers are being looked at as a higher-bandwidth data transport system. It would be used as a die's interconnect, not for the transistors that do the computing.
 

capital_one

Distinguished
Feb 21, 2010



Ya, same space.

That's the word I was looking for: "same space".

That's what Moore's Law implied: "double in two years in the same space".

Thanks for adding clarity. Cool :sol:
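That "same space" point can be turned into a few lines of Python (a rough sketch with assumed figures, just to illustrate the arithmetic): if transistor count in a fixed area doubles every two years, then the linear feature size has to shrink by 1/√2 (~0.71x) per doubling.

```python
import math

def transistors(start_count, years, period=2.0):
    """Transistor count in the same die area after `years`,
    doubling every `period` years (Moore's Law as stated above)."""
    return start_count * 2 ** (years / period)

def feature_scale(doublings):
    """Linear shrink factor needed to double density in a fixed area."""
    return (1 / math.sqrt(2)) ** doublings

# Example: 1M transistors becomes 32M in the same space after 10 years
print(transistors(1_000_000, 10))   # -> 32000000.0
# One density doubling means features shrink to ~0.71x, e.g. 90nm -> ~64nm
print(feature_scale(1))             # -> ~0.707
```

Note the shrink is per linear dimension; area per transistor halves because 0.707 squared is 0.5.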
 
Wow. Unless I misunderstand, the prediction is stupid. He didn't say that 18nm would be a tech wall, he said it would be a money-vs-returns wall. Umm... yeah, and I'm sure in the 1990s people thought 45nm would be extremely expensive... and it probably was at first.
 

capital_one

Distinguished
Feb 21, 2010



We don't refer to it as jumping off the cliff. We refer to it as running into a brick wall. (Laughter). Our strategy is we just motor along at 200 mph, and when we hit the brick wall, we'll hit it at full speed.

We're not slowing down in advance of the brick wall. The brick wall is typically looked at as the smallest dimension that you can use to have a functioning transistor, and we kind of operate in the 90 nanometer range today.

Most theoretical guesstimates are that when you get down to the 5 nanometer range, these things physically don't work anymore, because you've run out of atoms.

You can't split an atom in half or you can't split a molecule in half. But to get to 5 nanometers, you go from 90 to 65 to 45 to 30 to 20 to 12 to 5. So, there's six or so generations between now and then. The next six generations, we'll run 200 miles an hour, full speed ahead.

It's an integral part of following Moore's Law. (Intel co-founder Gordon Moore noted three decades ago that the number of transistors on a chip doubles about every 18 months.) It's an integral part of our whole business strategy, get more functionality per unit area. That's what the integrated circuit is all about. Our intent is to continue to push that as hard as we can.

http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2004/09/26/BUGV88SI8T1.DTL

http://www.nytimes.com/2004/05/17/business/technology-intel-s-big-shift-after-hitting-technical-wall.html?pagewanted=1
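The "six generations" arithmetic in that quote checks out with a few lines of Python (a sketch; the node list is the one named in the quote itself):

```python
# The node ladder named in the quote above, in nanometers
nodes = [90, 65, 45, 30, 20, 12, 5]
steps = [b / a for a, b in zip(nodes, nodes[1:])]

print(len(steps))                    # 6 generations from 90nm down to 5nm
print([round(s, 2) for s in steps])  # most steps shrink ~0.7x linearly

# A ~0.7x linear shrink roughly halves the area per transistor
# (0.7^2 = 0.49), which is what doubling density in the same space requires.
```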

 
Thing is that people thought 65nm would be that brick wall, but Intel and AMD both broke through to 45nm without a problem. Now Intel is on 32nm, set for 28nm in its NAND flash area, and has working 22nm for CPUs. AMD will be breaking out 32nm in 2011.

There will be a harder wall to get around eventually, but by then it will have become something else. The thing with the semiconductor arena is that someone always innovates. Back in the early 1900s we used gigantic vacuum tubes and punch cards to compute. In the 50s we first got a transistor, albeit a huge one; the millions of transistors that a Core i7, a Phenom II or the 2 BILLION-transistor Itanium each have would have taken up enormous amounts of space.

But they got smaller, and then there was a breakthrough with silicon that gave us what we have today. Then there was CMOS, SOI and now HK/MG. Plus, that link I posted is one I have been looking for for quite a while. Back in 2006 Intel did say they had found a way to implement fiber in silicon, which would be a major breakthrough since it would give us optical interconnects. Currently in France there is a 155-channel fiber backbone in which each channel (each tiny strand of fiber) can run at up to 100Gbps.

So IF Intel can manage that, we could see a HUGE boost in overall performance, and of course GPUs, mobos and everything else that uses silicon or circuits would be able to utilize it.

Just to show this, Google is going to build its own fiber network that will run at 1Gbps to each of its 500K customers here in the US. No idea where it will be, but it puts even FiOS to shame, and FiOS is a completely fiber-based system.

Just so I don't look crazy, here is a link:

http://www.google.com/appserve/fiberrfi/

From Google themselves but they are one crazy company.

In the end though, I'd bet that most of the major tech breakthroughs will come from either Intel or IBM in the silicon world. No offense to AMD, but IBM has been their silicon-advancement pony for a while, which is why AMD uses SOI and will be using IBM's 32nm HK/MG.

Another example of fiber being the next step for pretty much anything is Intel's Light Peak:

http://techresearch.intel.com/articles/None/1813.htm

It is set to replace DVI, HDMI, USB and pretty much ANY port that uses wires to connect. It will start at 10Gbps but will scale, again like the French fiber backbone, to 100Gbps per connection. That would make HDMI look slow, since its top rated speed is 8.6Gbps per connector for video, and that's for version 1.4. Hell, the fastest Light Peak would make QPI/HTT look slow.
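To put those link speeds in perspective, here's a quick Python sketch. The 8.6Gbps and 10-100Gbps figures are the ones from this thread, not official spec numbers, and the 25GB disc size is just an illustrative assumption:

```python
def transfer_seconds(size_gigabytes, link_gbps):
    """Time to move `size_gigabytes` GB over a `link_gbps` Gbit/s link,
    ignoring protocol overhead (8 bits per byte)."""
    return size_gigabytes * 8 / link_gbps

blu_ray_gb = 25  # single-layer Blu-ray image, just for scale
links = [("HDMI 1.4 video", 8.6), ("Light Peak (launch)", 10),
         ("Light Peak (scaled)", 100)]
for name, gbps in links:
    secs = transfer_seconds(blu_ray_gb, gbps)
    print(f"{name}: {secs:.1f}s for a {blu_ray_gb}GB image")
```

At 100Gbps the whole 25GB image moves in about two seconds, which is why the scaled-up figure makes today's links look slow.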

Sorry for the long post, but I got a tad excited about the possibilities. I am one who knows we will always move forward in tech, and someday we'll probably end up like Star Trek, which would be cool if it happened in my lifetime...