[citation][nom]tiret[/nom]you know what gets me: is how convenient that every year they reduce the size of the transistor by say 30%. there hasn't been a year to my knowledge where they hit a road block and there hasn't been a year where they made a major improvement either.moral of the story: marketing and planned release of superior tech all to get the most out of us - the consumer.maybe I'm wrong but it all seems way too convenient.[/citation]
There are two things at work here:
1) Revolutionary releases are bad for everyone. They take consumers by surprise, which pisses off those who just purchased a product and will now no longer have 'the best', and it makes people buy defensively rather than when they actually want to. On the business side it makes for 'feast and famine' markets rather than a steady income stream, which makes it a lot harder to budget resources for long-term projects. Slow, steady, predictable releases are good for everyone, and Intel is king of that. We already know quite a bit about the next four generations of processors coming from Intel over the next 4-5 years. With AMD you simply never know until a month before launch, and even then you don't know what to expect from it until 'the next OS release fixes it'. Businesses and consumers would buy AMD if they simply knew what AMD was going to do ahead of time, even if it weren't the fastest or the best deal around; as long as they could plan their upgrade cycle around it, they would be happy.
The exception to this rule is if you can come out with a 'revolutionary' product every year, which is what Apple was doing under Jobs. But that is mostly a matter of marketing, so the consumer feels good about each release without actually getting something altogether better than the previous one. Then, when a truly revolutionary release does come (like the latest iPad), it pisses everyone off.
2) It is a shift of focus from Intel. With the Pentium 4, Intel was focused on clock speed. They ditched the really great P6 architecture and moved to NetBurst specifically to chase clock speed, and they lost horribly to AMD, who showed that you can go much faster with better design than with raw horsepower. Because of this we saw Intel move only from 180nm to 90nm over a six-year period ('00-'05), while power draw and die sizes ballooned into some rather huge chips. Then (finally), in '06, Intel's brain turned on and they decided to focus on efficiency. They went back to the P6 core architecture (a glorified Pentium 3), brought along everything they had learned over the previous six years, and found that a sub-2GHz CPU could beat their old ~4GHz processors. The focus since then has been core efficiency, which is why we have gone from 65nm to 22nm over the last six years, and if you exclude the iGPU, the CPU cores themselves draw remarkably little power compared to the old design.
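The "sub-2GHz beats 4GHz" point falls out of simple arithmetic: rough throughput is IPC (instructions per cycle) times clock speed. Here is a minimal sketch with illustrative, made-up IPC figures (not official benchmarks) chosen to mirror the NetBurst-vs-Core situation:

```python
def perf_gips(ipc, clock_ghz):
    """Rough throughput in billions of instructions per second (IPC x clock)."""
    return ipc * clock_ghz

# NetBurst-style design: very high clock, poor IPC from a deep, stall-prone pipeline.
# (IPC values here are assumptions for illustration only.)
netburst = perf_gips(ipc=0.9, clock_ghz=3.8)

# P6-derived Core design: much lower clock, much better IPC.
core = perf_gips(ipc=2.0, clock_ghz=1.86)

print(f"NetBurst @ 3.8 GHz: {netburst:.2f} GIPS")
print(f"Core     @ 1.86 GHz: {core:.2f} GIPS")  # the "slower" chip wins
```

With those numbers the 1.86GHz design comes out ahead, which is exactly why the clock-speed race was a dead end.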
But just like the GHz wall that was hit before, we are now fast approaching the nm wall. There is already talk of Broadwell (14nm) delays due to manufacturing problems, so a new paradigm needs to take focus. Be it materials, architecture design and extensions, or something else entirely, who knows. But do not mistake a company's obsession with a single focus (GHz or die shrinks) for proof that the company is 'out to get you' or any such silliness. It is true that they hold back just enough to keep a steady stream of buyers coming to their door, but they also pace releases because they do not have enough innovation to make something amazing every year. If they did, you can bet they would put it to market to fight off the ARM invasion that will be hitting desktops and laptops in the next year or two.
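As for the "conveniently ~30% every year" complaint in the quote: that figure is not a marketing coincidence. A full process node has traditionally been defined as a ~0.7x linear shrink, because 0.7 x 0.7 ≈ 0.5, so transistor area halves and density roughly doubles per node, which is what Moore's law cadence requires. A quick sketch (the node names are the familiar roadmap; the arithmetic is the point):

```python
# Each full node shrinks linear feature size by ~0.7x, halving transistor area.
SHRINK = 0.7

node = 180.0  # nm, circa 2000
nodes = [node]
for _ in range(7):
    node *= SHRINK
    nodes.append(node)

print([round(n) for n in nodes])
# -> [180, 126, 88, 62, 43, 30, 21, 15]
# Compare the actual roadmap: 180, 130, 90, 65, 45, 32, 22, 14 nm.
```

So a steady 30% linear shrink per node is the planned target the whole industry coordinates around, not evidence of tech being held in a vault.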