
IBM Proposes Carbon Nanotubes Instead of Silicon for Chips

No matter what, you can't shrink a transistor past one atom.

http://www.sciencedaily.com/releases/2012/02/120219191244.htm

Miniaturization has taken us quite far since the early '80s, but it's time to start looking toward other innovations to improve computing power. At some point, once nanostructures have bottomed out in size, we will need to go back to clock speed and IPC to improve performance on a classical (i.e. non-quantum) computer. Maybe photonics holds the key here, but it's getting to be time for Intel to pull something truly revolutionary out from behind the R&D curtain.
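The clock-speed-and-IPC point can be put in rough numbers: single-thread throughput on a classical CPU scales approximately as frequency × IPC. A minimal sketch (all figures are illustrative, not measurements of any real chip):

```python
# Rough model: single-thread performance ~ clock frequency x IPC.
# Numbers below are hypothetical, purely for illustration.
def perf(freq_ghz: float, ipc: float) -> float:
    """Billions of instructions retired per second."""
    return freq_ghz * ipc

baseline   = perf(3.5, 2.0)  # hypothetical current chip
better_ipc = perf(3.5, 3.0)  # +50% IPC at the same clock
print(better_ipc / baseline)  # prints 1.5
```

The point being: once shrinks stop buying you anything, that product is the only lever left on a classical machine.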
 
Intel has plans through 2018-2020 for shrinking down to 5nm, which, given their R&D budget, I'm sure they can accomplish. It's definitely a positive sign to have this tech already being worked on, and looking much closer to prime time, well before that time frame.

Kudos to IBM. You may be a dinosaur, but you've still got the R&D chops, and last I heard they were working on a 1Tbps optical interconnect for CPU bandwidth.

http://www.extremetech.com/computing/121587-ibm-creates-cheap-standard-cmos-1tbps-holey-optochip

Let the good times keep rollin 😀
 
IBM rules when it comes to research. Intel implements and markets tech better, but IBM wins hands down at creating it. Now if we could just weave that space elevator.
 
[citation][nom]ksampanna[/nom]At what point in the process shrink node do silicon transistors start creating problems?[/citation]

Around 5nm you start getting funny behavior beyond plain old electron leakage, such as quantum tunneling.
 
This research is critical because with silicon, Moore's Law will hit a physical limit in the next 20 years or so. Maybe more important is the Bell Labs research into organic technology, as this has the potential to extend Moore's Law by several hundred years.
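For scale, the Moore's Law trend is just repeated doubling, roughly every two years. A back-of-envelope sketch (the starting transistor count is an assumed, roughly 2012-era desktop CPU figure):

```python
# Moore's-law arithmetic: transistor count doubling every ~2 years.
# start=1.4e9 is an illustrative 2012-era desktop CPU figure.
def transistors(years_from_now: float,
                start: float = 1.4e9,
                doubling_years: float = 2.0) -> float:
    return start * 2 ** (years_from_now / doubling_years)

# 20 years = 10 doublings = 1024x, if the trend held that long:
print(f"{transistors(20):.2e}")
```

That 1024x over 20 years is exactly why a hard physical limit matters: the trend assumes the doubling never stops.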
 
At a certain point, research projects move from being "throw some $ at it" activities to being "bet the farm on it" business-changers. A company that thrives on risk (Intel, Boeing, Lockheed, AMD could probably be counted in this list) will deploy research sooner, while a company that is risk-averse will avoid making changes even when that course of action might lead to reduced profits or its eventual downfall.
Unfortunately, IBM doesn't quite fall into the Intel/Boeing/Lockheed camp. They don't bet the company each time they design a new jumbo-jet, or construct a new chip-Fab, or design a prototype warplane. And even Intel doesn't risk as much as these other companies. If they did, or IBM did, we might see commercial nanotube transistor-based ICs sooner.
 
[citation][nom]ksampanna[/nom]At what point in the process shrink node do silicon transistors start creating problems?[/citation]
They have already 'started' creating problems, which is why Intel started doing the '3D transistor' to help with leakage issues. On a hard structural level I believe the limit is 4-5nm, but for reliable signal flow through a transistor it is theorized that we can get down somewhere near 8-10nm using current materials.

Remember, the smaller you go, the easier it is for electrons to slip through, because there is less material to stop them. At the same time, the smaller you go, the easier it is to melt things, because the structures are so much more fragile. So you need substantially less power to keep things from burning up, but you have an ever-shrinking signal-to-noise ratio, which is why we will likely never make it all the way down to the hard limit of 5nm.
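The "electrons slip through" effect can be made concrete with the textbook rectangular-barrier tunneling estimate, T ≈ exp(-2κd). A minimal sketch, with an assumed 1 eV barrier (real gate stacks are far more complex, so treat the numbers as illustrative only):

```python
import math

# Textbook rectangular-barrier tunneling estimate: T ~ exp(-2*kappa*d),
# with kappa = sqrt(2*m*V)/hbar. Assumed 1 eV barrier; illustrative only.
HBAR = 1.0545718e-34   # reduced Planck constant, J*s
M_E  = 9.10938e-31     # electron mass, kg
EV   = 1.602176e-19    # joules per electron-volt

def tunnel_prob(width_nm: float, barrier_ev: float = 1.0) -> float:
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Leakage grows by many orders of magnitude as the barrier thins:
for d in (5.0, 2.0, 1.0):
    print(f"{d} nm barrier: T ~ {tunnel_prob(d):.2e}")
```

Even this toy model shows the exponential blow-up: shrinking the barrier from 5nm to 1nm raises the tunneling probability by many orders of magnitude, which is the leakage problem in a nutshell.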
 
I hope you people realize that carbon nanotubes were patented for usage in electronics along with means of production/implementation back in 1992 (using the technology from then).

This is old news.

Silicon is woefully inefficient as are numerous other materials used in computers (and throughout industry at large).
Switching to synthetic materials that can be made in abundance (and incidentally ones that are superior) would have solved a lot of our 'problems' - but Capitalism sooner milks outdated/old techs as much as possible for the sake of profits.

I loathe the technological obscurity the system forces onto everyone.
 
[citation][nom]caedenv[/nom]they have already 'started' creating problems, which is why Intel started doing the '3D transistor' to help with leakage issues. On a hard structure level I believe the limit is 4-5nm, but for reliable signal flow through a transistor it is theorized that we can get down somewhere near 8-10nm using current materials.Remember, the smaller you go, the easier it is for electrons to slip through because there is less area to slow things down. However, the smaller you go the easier it is to melt things due to heat because the structures are so much more fragile. So this means that you need substantially less power used to keep things from burning up, but you have an ever shrinking signal to noise ratio, which is why we will likely never make it all the way down to the hard limit of 5nm[/citation]

Not to mention things like solar flares affecting them...didn't Toyota have some issue with this?
 
[citation][nom]deksman[/nom]I hope you people realize that carbon nanotubes were patented for usage in electronics along with means of production/implementation back in 1992 (using the technology from then).This is old news.[/citation]

There is a world of difference between seeing and patenting a usage, and actually bringing it into production. This article is about a production improvement.
 
Doesn't matter.
Bringing carbon nanotubes into play back in 1992 would have produced the first prototypes then, which could have been hybridized with silicon, increasing efficiency across the board (and let's not forget using them in lithium-ion batteries, laptop/desktop casings, displays, etc.).

Synthetic diamonds could have followed suit in 1996.

Oh, and commercial companies like Intel use silicon and other inefficient materials in electronics because they are 'cheap' from a $$ point of view, and they DON'T create electronics with the BEST possible results the materials of the day are capable of. Instead they create much less efficient electronics/tools that they revise every 12 to 24 months, because that brings long-term profits.

We live in technological obscurity and have resource shortages because of this profit nonsense.
If profits weren't the goal, we would have switched to superior synthetic materials a long time ago and created the BEST of what is possible/efficient in line with our latest scientific knowledge; the children's toys we have today would be skipped and we'd be more advanced in turn.
 