[citation][nom]dagger[/nom]Keep in mind the term "core" used for Larrabee is more analogous to, say, a stream processor in traditional gpu, rather than a gpu as a unit. In other words, it's more accurate to compare Larrabe's 16 to a 8800gts's 128.[/citation]
Not really true, my friend. Remember that a Larrabee core is essentially a die shrink of an old Pentium processor with a big ol' vector unit and Intel's newer instruction sets built on top of it.
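To make that concrete, here's a minimal sketch in plain C (not actual LRBni code; the lane count matches Intel's published 512-bit, 16-float vector width) of what each core actually is: a complete scalar x86 CPU whose throughput comes from one very wide vector unit. The 16 lanes are what invite the stream-processor comparison, but they sit on top of a full CPU core:

[code]
/* Minimal sketch, plain C, not actual LRBni intrinsics. VEC_WIDTH
 * matches Larrabee's published 512-bit vector unit: 16 x 32-bit floats. */
#include <stddef.h>

#define VEC_WIDTH 16

/* Scalar path: the ordinary x86 side of the core, one result per op. */
void scale_scalar(float *dst, const float *src, float k, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * k;
}

/* Vector path: the same core, where each inner-loop body stands in
 * for a single 16-wide vector instruction. */
void scale_vector(float *dst, const float *src, float k, size_t n)
{
    size_t i = 0;
    for (; i + VEC_WIDTH <= n; i += VEC_WIDTH)
        for (size_t lane = 0; lane < VEC_WIDTH; lane++)
            dst[i + lane] = src[i + lane] * k;
    for (; i < n; i++)            /* scalar tail for leftover elements */
        dst[i] = src[i] * k;
}
[/code]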
If we remember where Larrabee started, it was when Intel, contemplating the possibilities of modern silicon fabrication techniques, started to wonder how many old Pentium cores could fit on a die the size of a modern Core chip. Everything took off from there.
The moral of the story is that if we expect power consumption to scale the way it does on modern GPUs, on the assumption that a "core" is the same as a "stream processor," we will be mistaken. Larrabee's cores are actual CPU cores, Pentium cores to be precise, and will draw the kind of power you'd expect from something loaded down with general-purpose registers, Intel's own instruction sets (SSE, SSE2, etc.), L1 and L2 caches, and everything else you'd find on a CPU. Any direct comparison with a traditional GPU will be deeply flawed.
The problem is that Intel's marketing strategy pits Larrabee directly against offerings from nVIDIA and ATi, which for the reasons already listed is a recipe for disaster. Larrabee will match the GPUs of its day in neither power consumption nor actual performance.
Larrabee's supposed strength is its flexibility: it's proposed to combine many of the strengths of a general-purpose CPU with the rendering capability of a modern GPU. But calling something "flexible" is vague and abstract, and doesn't by itself imply any advantage in real-world performance comparisons.
The main problem will be implementation. It will come down to how well programmers understand and take advantage of this supposed "flexibility." There's also no margin for error when it comes to drivers: Intel will have to keep every core fed and manage resources well if it ever hopes to get any kind of performance out of Larrabee. Drivers matter all the more if this thing is supposed to be API-agnostic.
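To illustrate what "keeping all cores fed" actually means, here's a minimal sketch (C with pthreads; render_tile() and the tile/core counts are hypothetical stand-ins, not Intel's actual scheduler) of the kind of work distribution a software renderer has to do for itself, where a traditional GPU does it in fixed-function hardware:

[code]
/* Minimal sketch, not Intel's real scheduler. A software renderer
 * must hand out work to cores itself; if this loop ever starves,
 * cores sit idle and performance dies in software, not silicon. */
#include <pthread.h>
#include <stdatomic.h>

#define NUM_TILES 1024   /* hypothetical screen-tile count */
#define NUM_CORES 16     /* matches the rumored Larrabee core count */

static atomic_int next_tile;

/* Stand-in for the real per-tile work: rasterize + shade one tile. */
static void render_tile(int tile) { (void)tile; }

/* Every core runs this loop, pulling tiles until none remain. */
static void *core_worker(void *arg)
{
    (void)arg;
    for (;;) {
        int tile = atomic_fetch_add(&next_tile, 1);
        if (tile >= NUM_TILES)
            return NULL;  /* queue drained: this core goes idle */
        render_tile(tile);
    }
}

int main(void)
{
    pthread_t cores[NUM_CORES];
    for (int i = 0; i < NUM_CORES; i++)
        pthread_create(&cores[i], NULL, core_worker, NULL);
    for (int i = 0; i < NUM_CORES; i++)
        pthread_join(cores[i], NULL);
    return 0;
}
[/code]

And notice that every decision in there (tile size, queue design, load balancing) lives in Intel's software stack, which is exactly why there's no margin for error in the drivers.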
And all the programmability in the world means nothing without real-world performance. Yes, Larrabee could theoretically be programmed in software to run SM 5.0. It could theoretically be programmed to run SM 10.0. But none of that matters if it doesn't have the horsepower to run those advanced algorithms at playable frame rates.
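That's also why the "any shader model" claim is plausible in the first place. Here's a tiny sketch (plain C; pixel_shader_fn and the shader shown are hypothetical) of the idea: on a software renderer a shader is just a function the renderer calls, so new features are a software swap rather than new silicon. Easy to add, but not automatically fast:

[code]
/* Minimal sketch of "shaders are just software" on a CPU renderer.
 * pixel_shader_fn and flat_red are hypothetical illustrations. */
typedef struct { float r, g, b; } color;

typedef color (*pixel_shader_fn)(float u, float v);

static color flat_red(float u, float v)
{
    (void)u; (void)v;
    return (color){ 1.0f, 0.0f, 0.0f };
}

/* "Upgrading" the shader model means passing a different function
 * here; the hard part is making the call fast enough per pixel. */
static void shade_scanline(color *out, int width, pixel_shader_fn shader)
{
    for (int x = 0; x < width; x++)
        out[x] = shader((float)x / (float)width, 0.5f);
}
[/code]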
Anyway, that's my two cents.