IBM Teams Up With ARM for 14-nm Processing

Status
Not open for further replies.
Have they figured out how to deal with the quantum tunneling problem?
 
[citation][nom]accolite[/nom]Wow 14nm that's crazy I thought the lowest they could go is 20nm then they wouldn't be able to control the flow of electrons![/citation]

I think that's for high-power devices, or IBM didn't hear about that lol.

It's so strange that we're getting so close to the absolute minimum our manufacturing technology can reach. I think within the next 5-10 years we'll be seeing chips with circuitry created at the atomic level, if it isn't already done that way right now.
 
I remember reading an article in a technical journal about how pushing past 100nm would be impossible due to the laws of physics. That was only 15 years ago.
 
[citation][nom]accolite[/nom]Wow 14nm that's crazy I thought the lowest they could go is 20nm then they wouldn't be able to control the flow of electrons![/citation]
From what I understand, the limit is nearer to 3 nm, at which point quantum tunneling allows electrons to jump the barriers between circuits. At that point, I imagine our current technology path for computer design will reach a dead end.

Of course, quantum tunneling isn't necessarily a barrier to our progress. For decades now, scientists and engineers have been putting forward ideas to use quantum tunneling to our advantage, building three-dimensional processing units that would have even more performance potential than our current designs!
 
[citation][nom]Alchemy69[/nom]I remember reading an article in a technical journal about how pushing past 100nm would be impossible due to the laws of physics. That was only 15 years ago.[/citation]
I remember the same thing. Back when we were still measuring processing technology in micrometers, it seemed everyone was saying the limit was at 0.1 µm. I remember being very confused when we moved beyond that without hearing anything more about it.

The limitations we'll be facing in the near future seem a bit more fundamental than whatever difficulties were being discussed 15 years ago, though. I don't imagine we'll be going much smaller for a long time once we reach that point. There are always other routes for progress, however.
 
[citation][nom]bombat1994[/nom]imagine a 14nm dual 6970.[/citation]

Simple designs scale down easier, but more complex ones are tricky. You can't just shrink a die and use the very same mask.

It would be awesome, though. Might actually be workable in a laptop or phone.

I think that IBM/ARM is pushing hard this way to keep Intel out of the mobile CPU biz, or at least in AMD's old shoes. Might be interesting to see how this develops.
 
Looking back at the history in this field, I am 100% sure that they will find a way past every barrier.

Even if they won't be able to go further than 1, 3, or 10 nm for a while, they will find ways of improving the manufacturing process and lowering the price, so we'll be able to cope for a few more decades with multiple cores and GPUs, until quantum computers kick in.
 
[citation][nom]accolite[/nom]Wow 14nm that's crazy I thought the lowest they could go is 20nm then they wouldn't be able to control the flow of electrons![/citation]

The neighborhood of 10 nm is the foreseeable limit at this point in time.
 
[citation][nom]bv90andy[/nom]Looking back at the history in this field, I am 100% sure that they will find a way past every barrier.Even if they won't be able to go further then 1,3,10 nm for a while, they will find ways of improving the manufacturing process and lowering the price so we'll be able to cope for a few more decades with multiple cores, and gpus, until quantum computers kick in.[/citation]

Quantum effects must be mastered before going smaller than 10 nm. All bets are off regarding the rate of progress for CPUs once 10 nm is reached.

A few more decades??? Hardly! Intel expects to achieve 11 nm in 2015. That is only four years away. After that there will be no more die-shrink progress without amazing new science. I think we are about to see computer evolution take the form of fiber optics and chip consolidation (merging CPU/GPU, for instance) becoming more common than die shrinks.
 
Before Atomic or Chemical computing, we're gonna switch to graphite (carbon) because it can handle a lot more heat than silicon. Then we'll ramp up core speeds 5x.

So if Sandy Bridge can do 5GHz on air just by switching to 32nm, then 5nm (assumed limit) should be able to do around 10GHz (8GHz to 15GHz is my guess)? Then carbon will allow us to go near 40 to 75GHz on a 5nm process. So if we can get the carbon process down and just switch the silicon methods over to it (which is completely conceivable in the next 15 years), then we'll be running octocore (I'm totally guessing overhead makes more than 8 threads impractical) processors at 50GHz. That should give us about a 40x boost over OC'd Sandy Bridge quad cores even if they were running multi-threaded apps (50GHz/5GHz * 200% [improvement in per-cycle efficiency] * 2 [8 cores vs 4]).
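To make the arithmetic in that estimate explicit, here's a back-of-envelope version. Every input is the poster's guess (a 50GHz carbon-process chip, 2x per-cycle efficiency, 8 cores vs 4), not a measured figure:

```python
# Back-of-envelope speedup estimate. All inputs are guesses from the
# post above, not real benchmarks.

baseline_ghz = 5.0        # overclocked Sandy Bridge quad core
future_ghz = 50.0         # guessed clock on a carbon process
per_cycle_gain = 2.0      # "200% improvement in per-cycle efficiency"
core_ratio = 8 / 4        # octocore vs quad core

speedup = (future_ghz / baseline_ghz) * per_cycle_gain * core_ratio
print(f"estimated speedup: {speedup:.0f}x")  # -> estimated speedup: 40x
```

That's where the ~40x figure comes from: 10x clock, 2x per cycle, 2x cores.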

The point being--we've got a lot of improvement in the conceivable future without switching to an as-yet-unknown manufacturing process.
 
[citation][nom]joytech22[/nom]I think that's for high-power devices, or IBM didn't hear about that lol.It's so strange that we're getting so close to the absolute minimum our manufacturing technology can reach, I think within the next 5-10 years we'll be seeing chips with circuitry created at the atomic level, if it isn't already done that way right now.[/citation]
No, at 12nm you get electrons flowing through PNP gates, aka electrons going bye-bye into, well, nowhere.
 
I remember a handful of years ago someone stating 40nm was impossible.
I'm so happy they're wrong. :)

We keep pumping in billions of dollars and countless engineers. Maybe we will reach a limit soon, maybe we won't. For the time being, I'm more than satisfied just to know that they're trying.
 
That's insane. Being an Electrical and Electronic Engineering student at Manchester uni, I've been told that when a device gets smaller, the electrons can flow through a wall, although ohmic points can be used to prevent the electrons flowing through. But aren't we forgetting graphene? Graphene is only atoms thick, yet it can be made into a semiconductor. Imagine that.
 
The problem of the past was electrons getting "stuck" in the pathways and clogging them up; the smaller your pathways, the easier it was to clog. They alleviated this issue by using better materials and cleaner processing, but any impurities inside the pathways will still have electrons sticking to them.

The problem in the near future has more to do with quantum mechanics kicking in, make your path too small and electrons like to jump around and cross boundaries that they couldn't before. Possibly making better boundaries or finding some method to control the jumping would allow even smaller pathways.
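For a feel of why the boundaries matter so much: in the standard rectangular-barrier picture, tunneling probability falls off exponentially with barrier width, roughly T ~ exp(-2κL) with κ = sqrt(2m(V−E))/ħ. A quick sketch (the 1 eV barrier and 0.5 eV electron energy are illustrative assumptions, not real gate-oxide parameters):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per eV

def tunneling_probability(barrier_ev, energy_ev, width_nm):
    """Approximate transmission through a rectangular barrier, T ~ exp(-2*kappa*L)."""
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_nm * 1e-9)

for width in (3.0, 2.0, 1.0):  # barrier width in nm
    print(f"{width:.0f} nm barrier -> T ~ {tunneling_probability(1.0, 0.5, width):.2e}")
```

The takeaway is the exponential: halving the barrier width doesn't double the leakage, it multiplies it by orders of magnitude, which is why "make your path too small" bites so suddenly.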
 
[citation][nom]jack_6[/nom]thats insane , Being a Electrical and Electronic student at Manchester uni ive been told that when a device gets smaller the electrons can flow though a wall however giving ohmics points which prevents the electrons flowing though. however arent we forgetting graphine , graphine atoms thick but yet can be made into a semicondutor imagine that[/citation]

I think I was just talking about graphene, although I mistakenly said graphite.
 
I really don't understand why IBM or Intel haven't bought ARM yet.
Oh, I know: bigger means slower to think. Not able to imagine the future even a little bit.
:)
 