Intel Has 5 nm Processors in Sight

[citation][nom]deksman[/nom]Actually, there is a big difference between the current system of 'consume and discard' and 'sustainability'. We are not using our technology to better our lives at all... we use it for 'profit'. And in case you hadn't noticed... we are doing quite a lot of damage to the environment (the very same environment we depend on) while recycling next to nothing and doing little that could minimize our footprint.[/citation]
I am an environmental scientist, so I know a little bit about these sorts of things. It is important to remember that all things are cyclical, even in the natural environment. The environment is not a static entity; all things are in a state of fluctuation, and adaptation is the way nature deals with that fluctuation. To say that the environment should be X, and it is now Y, and needs to be changed back to X and stay that way forever is a naive way of looking at the world. Eventually our consumptive nature will have the natural consequence of changing the environment so much that we will either have to adapt to the new environment or perish. That wouldn't be the first time a species has self-imploded and taken other species down with it; it has happened on this planet many, many times before.
The moral of this story... relax and have another slice of pie.
 
[citation][nom]blazorthon[/nom]Ivy Bridge was delayed. That's not spot on. Besides, as many others have said, getting to very small process nodes not only has diminishing returns on how much it impacts power consumption, but is also increasingly difficult. Even if Intel is ready for it, other issues may come up, such as building equipment that can work at such small scales. Whether or not Intel manages to be ready might not matter if something else comes up. Six to eight years is a very long time to try planning ahead in the tech industry; things tend to not be on time.[/citation]

Yeah, indeed. How long did they actually slip on both the 22nm AND the 3D-gate process that came in the same cycle? They landed a mere two months from their set date with both new techs on the table. That is surely within the 25 (or was it 20; I didn't pay much attention, as the numbers were laughably over-exaggerated even for an AMD fan) year range you claimed was common. I think the egg-heads who are developing the next gen have a fair idea of how long it will take, or perhaps you think you know better?
 
One of the things that I have not seen commented upon is the lifetime of the chips. The expected lifetime of a chip drops with every die shrink, and it is also reduced by the temperature and frequency the chip operates at. If you extrapolate that curve, by the time we get to 5nm the life of a chip would be only a few months.
 
[citation][nom]blazorthon[/nom]Making a single one-atom transistor and being able to make complex processors out of them are two different things. Given how difficult it was just to make the one single-atom transistor (which they only estimated was there; they didn't have concrete proof), I doubt that single-atom transistor CPUs with marketable performance are feasible yet.[/citation]

True, obviously we won't be seeing 1-atom transistors anytime soon, maybe never. But if they are making some progress right now, they might have it working 20-30 years from now. And Intel won't necessarily hit a roadblock, because there are many more studies like this going on, such as http://www.tomshardware.com/news/science-research-transistor-more-efficient,16256.html
http://mashable.com/2010/12/28/1000-core-chip-could-make-pcs-20-times-faster/
and loads more like these. Even if just 1 or 2 of these studies become feasible, transistor CPUs won't hit full maturity anytime soon.
 
[citation][nom]rantoc[/nom]Yeah, indeed. How long did they actually slip on both the 22nm AND the 3D-gate process that came in the same cycle? They landed a mere two months from their set date with both new techs on the table. That is surely within the 25 (or was it 20; I didn't pay much attention, as the numbers were laughably over-exaggerated even for an AMD fan) year range you claimed was common. I think the egg-heads who are developing the next gen have a fair idea of how long it will take, or perhaps you think you know better?[/citation]

Being both 22nm and Tri-Gate doesn't make it any more difficult to work on than if it had been only one or the other on top of otherwise current process technology. It's simply another process technology.

Whether or not their "egg-heads" are more intelligent than me is completely irrelevant. Anything and everything can go wrong between now and the 5nm projected date, and they aren't capable of doing something about everything. I'm not saying that it won't happen, only that I think the time frames are less than accurate, especially given that some of the other dates are already wrong anyway.

I'm not being a fanboy or whatever, simply stating facts as well as opinions that I formed from them and some reasoning behind them. Agree or disagree if you want to, that's your right, but making an ass out of yourself for disagreeing with me about something that you don't even seem to have a good grip on is a complete waste of your time, my time, and the time of anyone else who now has to read this worthless conversation.
 
[citation][nom]pacioli[/nom]I am an environmental scientist, so I know a little bit about these sorts of things. It is important to remember that all things are cyclical, even in the natural environment. The environment is not a static entity; all things are in a state of fluctuation, and adaptation is the way nature deals with that fluctuation. To say that the environment should be X, and it is now Y, and needs to be changed back to X and stay that way forever is a naive way of looking at the world. Eventually our consumptive nature will have the natural consequence of changing the environment so much that we will either have to adapt to the new environment or perish. That wouldn't be the first time a species has self-imploded and taken other species down with it; it has happened on this planet many, many times before. The moral of this story... relax and have another slice of pie.[/citation]

That comment you made, 'have another slice of pie', reminds me of the inherent stupidity of the monetary system.
The only reason people engage in rampant consumption in the first place is behavior and a lack of relevant general education (both of which can be changed).
There is no reason for us to discard technology... merely to use it differently (efficiently) than we do now.
We could use it to create much more advanced technologies than we have now (by eliminating the notion of 'cost' and planned obsolescence), repair the damage done to coral reefs, clean up mass pollution, and turn the landfills into superior synthetic materials and energy sources in abundance. Instead of growing food the way we do now (poisoning ourselves in the process and destroying land), we could have been growing it for decades in fully automated vertical farms employing hydroponics, aquaponics and aeroponics (minimizing our footprint), and we could even have transitioned to geothermal for baseload planet-wide (with wind used as a supplement) by 1929.

 
I'm not sure where I read this or if I'm just remembering wrong, but I remember something about GPGPU'ing AI's. :)

I get the joke, but CTO may be more applicable, I think. Just saying... Hehehe...

You've made a foolish, closed-minded comment, and IMO the kind of thinking shown by that comment is counter-innovative; it may be part of why the Phlogiston Theory and Aristotle's basic elements were so dominant and accepted for so long even though they have since been proven false.
Working around a limit by other methods, as mentioned by others here, is exactly the kind of innovation people who think that way tend to miss; that is what a "breakthrough" is.

I totally agree with you. Physics, objectively, is "immutable" (though I remember hearing something about black holes twisting the laws of physics, but that may be something within physics itself, I wouldn't know 😛). Some philosophers may have something else to say about that (the objectivity), though, if you know what I mean. :lol: The commenter also acknowledges the fact that we probably don't know everything. Mankind needs as many open minds as it can get IMO. :)

I totally agree with you as well, though there might be more to what a scientific law is, because the definition you gave sounds like what makes a theory to me. :) Speaking of scientists "tinkering" with things concerning the speed of light, I remember my brother mentioning scientists finding a way to slow down light, making its speed not so absolute anymore. Just sharing in case someone wants to look into it more. I think some scientists can be closed-minded, as history seems to have shown, so some may accept whatever laws are currently accepted as absolute facts and tell you they are. You've got a good mind about science. :)


Scientific laws are made laws by humans, and thus are prone to human error. They do have strict standards, but I would say there's always the possibility of a flaw or imperfection. For example, Newton's Laws of Motion do not apply to all situations. I just found that out when I was searching for disproven laws, though it isn't really an example of one, and I didn't dig too deep. Hehehe! Try looking up Einstein's Special Theory of Relativity and how it relates to Newton's laws.
 
[citation][nom]balister[/nom]Unfortunately, Intel is about to hit the wall in how far they can go. 1 nm is pretty much the wall, as that's about 5 atoms in width. Quantum effects start to take over once you get to that level, and they are not as easily dealt with due to things like Heisenberg's Uncertainty Principle and how the Strong and Weak forces start being a much bigger factor.[/citation]

Moore's law never said that transistors would halve in size every two years, only that the number of transistors on a chip would double. If line widths and the size of conductors become a limitation, then three-dimensional construction would be the most practical solution. Current processes are planar, with only the most recent Ivy Bridge chips using FinFET or "3D-gate" transistors. There is plenty of room for more transistors on a given substrate if you start to build up instead of shrinking features two-dimensionally. Besides, the electron valence jumps you are worried about show up starting at 14nm, and for now that hurdle has been adequately addressed by current process development. Don't worry, be happy.
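
To put rough numbers on the doubling (a back-of-the-envelope sketch; the two-year period and the 1.4 billion starting count are illustrative assumptions, not Intel roadmap figures):

```latex
% Moore's law as usually stated: the transistor count N doubles roughly
% every two years, regardless of whether that comes from smaller features,
% bigger dies, or stacking transistors vertically.
\[ N(t) = N_0 \cdot 2^{t/2} \]
% With an assumed (illustrative) starting count N_0 = 1.4e9 transistors:
%   t = 2  years  ->  N ~ 2.8e9
%   t = 6  years  ->  N ~ 1.1e10
%   t = 10 years  ->  N ~ 4.5e10
```

Nothing in that relation cares how the extra transistors are packed, which is why building upward is just as valid a way to stay on the curve as shrinking features.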
 
Interesting note about the speed of light in a vacuum (light travels slower through occupied mediums). It's not impossible to travel faster than light; it's just impossible to accelerate matter beyond the speed of light. It's a math problem created by E=mc^2. You end up needing infinite energy to go from sub-light to super-light speeds; something that is already at super-light speed could stay there, though slowing down below light speed would release infinite energy. No doubt there will eventually be a solution "around" that problem, but not until we know a whole lot more about gravity and the fabric of spacetime.
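
To spell out where the "infinite energy" comes from (a standard textbook relation, written out here just for illustration):

```latex
% Total energy of a particle with rest mass m moving at speed v:
\[ E = \gamma m c^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} \]
% As v approaches c the denominator goes to zero, so gamma (and E) blows up:
%   v = 0.9c    ->  gamma ~ 2.3
%   v = 0.99c   ->  gamma ~ 7.1
%   v = 0.999c  ->  gamma ~ 22.4
%   v = c       ->  division by zero, i.e. the "infinite energy" above
```

The divergence only bites when you try to push a massive object across the light barrier; it says nothing about something that never crosses it, which is the loophole mentioned above.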

Interestingly enough, it seems the speed of light is not as absolute as we thought it to be. We already know that gravity can bend light, and it turns out it seems to have other effects on particles. Neutrinos fired from one part of the Earth to another (through the middle) arrived faster than they should have; in fact, they seemed to have traveled faster than light. The gravity well of the planet seems to have had something to do with it, though they're double- and triple-checking all their experimental data to be absolutely sure before they proceed further.
 
The process is getting smaller and smaller, to the point where chips won't even be necessary; we will just plug into the universe.
Matrix, anyone????
 
[citation][nom]palladin9479[/nom]Interesting note about the speed of light in a vacuum (light travels slower through occupied mediums). It's not impossible to travel faster than light; it's just impossible to accelerate matter beyond the speed of light. It's a math problem created by E=mc^2. You end up needing infinite energy to go from sub-light to super-light speeds; something that is already at super-light speed could stay there, though slowing down below light speed would release infinite energy. No doubt there will eventually be a solution "around" that problem, but not until we know a whole lot more about gravity and the fabric of spacetime. Interestingly enough, it seems the speed of light is not as absolute as we thought it to be. We already know that gravity can bend light, and it turns out it seems to have other effects on particles. Neutrinos fired from one part of the Earth to another (through the middle) arrived faster than they should have; in fact, they seemed to have traveled faster than light. The gravity well of the planet seems to have had something to do with it, though they're double- and triple-checking all their experimental data to be absolutely sure before they proceed further.[/citation]

Those neutrino tests were proven to be inaccurate at best because the margin of error was larger than the amount of time that it'd take something to move at the speed of light between the testing sites. However, that doesn't prove that it's impossible, only that we need to use more accurate tech than was used in those tests.
 


It was a loose fiber optic cable in their timing system, which resulted in their clock being slightly off. The OPERA team found the problem themselves. I just looked up the news, as I hadn't heard about it in a while. We definitely need to learn more about gravity propagation; dark energy shouldn't exist, yet current theory can't explain the total mass-energy of the universe without it.
 


Meh, dark energy and dark matter are just excuses for the scientists having no clue as to why their math doesn't work, as far as I'm concerned. If they were real, then we should be able to see their effects. For example, even if it was invisible, dark matter should still be noticeable by the intense gravitational lensing effect (the greater the gravity, the greater the bending of light passing near or through it) it would have on visible matter behind it whenever the dark matter sits between us and that visible matter. Dark energy should have the opposite effect.
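
For reference, the bending being described is the standard weak-field light deflection from general relativity; the solar numbers below are the usual textbook example, quoted here only to show the scaling:

```latex
% Weak-field deflection angle of a light ray passing a mass M at impact
% parameter b (general relativity, small-angle limit):
\[ \alpha \approx \frac{4 G M}{c^2 b} \]
% Textbook example, the Sun: M ~ 2e30 kg, b ~ 7e8 m
%   alpha ~ 8.5e-6 rad ~ 1.75 arcseconds
% More mass along the line of sight gives a proportionally larger bend,
% which is why lensing is the standard way to look for unseen mass.
```

Whether the lensing actually observed around galaxy clusters is best explained by unseen mass or by a misunderstanding of gravity is, of course, exactly the disagreement playing out in this thread.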
 

HAHA! We've gotten so sidetracked with this talk of physics. :lol: Though it is interesting to me. I remember watching a few documentaries on dark matter. I vaguely remember stuff about it, but I think they were able to observe some of its properties somehow.
 


They observe that their math doesn't work and that they have to account for still-unknown variables to get anything seemingly accurate, and they claim the discrepancy must be something else out there rather than them not understanding how gravity works. As far as I'm aware, that's all that they've observed, although I don't claim to know for sure.

Yeah, we did get side-tracked :)
 
[citation][nom]Tomfreak[/nom]I can't imagine how they're going to make a proper chip @ 5nm. Isn't that the size where the silicon is at its limit and will fall apart? I assume they are trying new tricks to go smaller. A new thing isn't good until it's proven in the market. Being a conservative buyer myself, I'll buy a rig at 14nm and use it for more than 5 years, and see how they apply those new tricks in 5 years. I will not get anything below 10nm unless it has gone through the market without problems for 3-5 years.[/citation]

Carbon nanotubes and graphene.
 
Instead of constantly shrinking, they should return to the 32nm die and develop new, more efficient and faster architectures for a few years, then shrink it; that seems like a better IDEA.

Also, I would have really liked to see C2D make it down to 22nm and see how it overclocked;
both C2D chips I owned/own overclock to 160% pretty effortlessly.
 
[citation][nom]spentshells[/nom]Instead of constantly shrinking, they should return to the 32nm die and develop new, more efficient and faster architectures for a few years, then shrink it; that seems like a better IDEA. Also, I would have really liked to see C2D make it down to 22nm and see how it overclocked; both C2D chips I owned/own overclock to 160% pretty effortlessly.[/citation]

Why would Intel return to their 32nm process? The 22nm process has already proven to be better.
 


I agree it's a mathematical anomaly. The standard model works for everything we've observed so far; it can be used to describe everything in our solar system. Yet when used on a massive scale it falls short: there simply isn't enough observable energy/matter in the universe for the standard model to be right, or there are things we can't detect.

Personally I'm not above thinking back to aether theory, or rather having multiple dimensions of spacetime folded together into our four observable ones. Heck, we can't even really describe why gravity distorts spacetime, only that it does.
 


That's probably the entire cause of the issues with their math, IMO. They don't have a proper understanding of gravity, so when it behaves in a seemingly unexpected way, instead of admitting that they have no clue about what's wrong, they make up excuses so it looks like their funding isn't being wasted. Even though the models seem to work within our solar system, the problems with their math might simply be scaled down so much there that they don't see them and/or they fall within the margins of error.
 
[citation][nom]blazorthon[/nom]if AMD fails, then Intel will get killed by anti-trust lawsuits and we might have no major x86 CPU company. Intel wouldn't allow that and would quite literally bail AMD out if they had to in order to keep AMD running.[/citation]
So Intel will buy AMD CPUs to keep them in business? 😉
 
[citation][nom]belardo[/nom]So intel will buy AMD CPUs to keep them in business?[/citation]

Intel would do what they need to do. I think that they'd just invest in AMD or something like that rather than buy a bunch of AMD products if AMD was in danger of going under, but that's speculation on my part.
 