I'd add that as SoC designs become more and more common, outsourcing becomes more likely at Intel.
If Larrabee doesn't pan out for graphics rendering, that too could drastically change Intel's overall plans. If it's just good enough but can't keep up with traditional GPUs, they'll have to look elsewhere for a solution. I know it's a lot of maybes, but it's all very possible. That alone could put Intel behind, and outsourcing would be sped along at a greater pace as the need arises.
This of course depends mostly on AMD, and partly on nVidia as well, once Fusion happens, in both HW and SW solutions.
Here's what I'm talking about, or rather, this explains it much better than I can:
http://venturebeat.com/2009/05/22/interview-stanfords-bill-dally-leaps-from-academia-to-the-computer-graphics-wars/
Here's a quote:
"However you package it, the PC of the future is going to be a heterogeneous machine. It could have a small number of cores (processing units) optimized for delivering performance on a single thread (or one operating program). You can think of these as latency processors. They are optimized for latency (the time it takes to go back and forth in an interaction). Then there will be a lot of cores optimized to deliver throughput (how many tasks can be done in a given time). Today, these throughput processors are the GPU. Over time, the GPU is evolving to be a more general-purpose throughput computing engine that is used in places beyond where it is used today."
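Dally's latency-vs-throughput split can be sketched in plain code. A minimal illustration (hypothetical workloads and names of my own, not from the interview): the first function is a dependent chain that only a fast single core helps with, the second is embarrassingly parallel work of the kind a GPU-style throughput engine eats up.

```python
def latency_bound(n):
    # Each step depends on the previous result, so the work cannot be
    # split up; only a core optimized for single-thread speed (a
    # "latency processor") makes this faster.
    x = 0
    for i in range(n):
        x = (x * 31 + i) % 1_000_003
    return x

def throughput_bound(items):
    # Every element is independent, so the work can be spread across
    # many slower cores (a "throughput processor", e.g. a GPU).
    return [i * i for i in items]
```

The point of the sketch is just that future PCs would carry both kinds of cores and route each workload to the right one.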
If this becomes "the" change, then along with the higher costs of fabbing, Intel has to pursue both and hit their mark. I'm not saying they won't, or can't, but it could change the way they do things, especially, like I said, if LRB doesn't produce.