[citation][nom]rantoc[/nom]Considering Intel's track record with their manufacturing processes has been pretty much spot on during the last decade, I really doubt that statement will be true in this case.[/citation]
Ivy Bridge was delayed; that's not spot on. Besides, as many others have said, getting to very small process nodes not only yields diminishing returns in power consumption, but is also increasingly difficult. Even if Intel itself is ready, everything else has to be too; building lithography equipment that can work at such small scales, for example, is a serious problem in its own right, so whether or not Intel manages to be ready might not matter if something else comes up. Six to eight years is a very long time to plan ahead in the tech industry. Things tend not to arrive on time; they're usually either early or, more often, late, and Intel is no stranger to being late.
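To put rough numbers on the diminishing-returns point: dynamic switching power goes as P = C*V^2*f. In the Dennard-scaling era, every node shrink cut capacitance and supply voltage together, so power per transistor fell fast; these days supply voltage is stuck near its floor, so a shrink mostly only cuts capacitance. A quick back-of-the-envelope Python sketch (every number below is invented for illustration, nothing here is an actual Intel figure):

[code]
# Back-of-the-envelope: why each node shrink saves less power than it used to.
# All values are normalized, made-up illustrations.

def dynamic_power(capacitance, voltage, frequency):
    # Dynamic switching power: P = C * V^2 * f (leakage ignored for simplicity)
    return capacitance * voltage ** 2 * frequency

S = 0.7  # illustrative linear shrink factor per full node step
C, V, f = 1.0, 1.0, 1.0  # normalized starting point

p_base = dynamic_power(C, V, f)
p_dennard = dynamic_power(C * S, V * S, f)  # old-style shrink: C and V both scale
p_modern = dynamic_power(C * S, V, f)       # modern shrink: V stuck near its floor

print("baseline:            %.2f" % p_base)
print("Dennard-era shrink:  %.2f (%.0f%% less power)" % (p_dennard, (1 - p_dennard / p_base) * 100))
print("post-Dennard shrink: %.2f (%.0f%% less power)" % (p_modern, (1 - p_modern / p_base) * 100))
[/code]

Run it and the ideal shrink cuts power by roughly two thirds, while the voltage-limited one only manages about 30 percent. That gap, repeated node after node, is the diminishing-returns story in miniature.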
[citation][nom]deksman[/nom]5nm the lowest they can go? Guys... Intel is still using silicon for making consumer grade electronics. It's a cheap material that's woefully ineffective. Semiconductors from synthetic diamonds were patented in 1996 for one thing, and we had the ability to integrate them into computers by 1997. By the year 2000, CPUs could have mostly been made from diamonds as is, and computers would be about 40x more powerful, wouldn't require active cooling because of their inherent properties, and would draw 1 tenth of the power that computers draw now. Graphene could have been used wherever possible since 2006 alongside diamond, while a band-gap problem with graphene was solved in 2009. IBM even made/demonstrated a full graphene CPU in 2012. Intel is toying in technological obscurity because they use 'cheap' materials and means of production. Cost efficiency (which has NOTHING to do with the amount of resources at our disposal or our technological ability to do something, in abundance no less) simply means 'technical inefficiency' (it's good for capitalism because of profits, but it doesn't do a thing for 'innovation').[/citation]
You make some excellent points that are undeniably true, but do keep in mind that most people here already acknowledged that moving away from silicon would probably at least help get past 5nm, although how far past remains to be seen. As others have said, we might have ten, maybe twenty years before silicon in its current use is no longer feasible (depending on just how long it takes us to reach its practical limits), and at that point we'll either have to switch to some sort of 3D chips (different from the 3D transistors of Intel's 22nm process, as I'm sure you understand), move on to another material, or try something else entirely.
As for graphene, well, it's still a little new for that kind of thing relative to when the necessary breakthroughs came about (although by now it wouldn't be unreasonable to have used it if we'd wanted to). Diamond, as you said, has been a practical option for a long time, and we're arguably still on silicon strictly to slow down innovation and maximize profits. Diamond has been cheap to manufacture and use for something like this for years; Intel and the rest simply don't want to switch while they can still milk maybe another decade of profit out of silicon.
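For anyone wondering why diamond keeps coming up in these arguments, the case usually rests on a couple of bulk material properties. Here's a small Python comparison using approximate room-temperature textbook values (ballpark figures from memory, so treat them as illustrative, not as device specs):

[code]
# Approximate room-temperature bulk properties (ballpark textbook values).
# Listed only to show why diamond gets proposed for hot, high-power chips;
# real-world viability also depends on doping, wafer size, and cost.

materials = [
    # (name, band gap in eV, thermal conductivity in W/(m*K))
    ("silicon",  1.12,  150),
    ("diamond",  5.47, 2000),
    ("graphene", 0.00, 3000),  # pristine graphene has no band gap at all
]

print("%-10s %15s %12s" % ("material", "band gap (eV)", "k (W/m*K)"))
for name, gap, k in materials:
    print("%-10s %15.2f %12.0f" % (name, gap, k))
[/code]

The order-of-magnitude gap in thermal conductivity is where the "no active cooling" claim comes from, and the wide band gap is why diamond can run far hotter before it stops behaving like a semiconductor. Whether those advantages survive the doping and manufacturing problems is exactly what this thread is arguing about.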