Intel Kills Off Larrabee Discrete Graphics, For Real

Good riddance. They could have at least put out something, even if it SUCKED, as long as it was fairly priced; that would have been better than this. I've still got an i740 AGP, and it would have been nice to add another discrete Intel GPU to my collection. 12 years and nothing new.
 
I really thought it was officially dead months ago... I was not aware that Intel considered it "shelved." I suppose this announcement equates to putting a bullet into the head of a wounded animal lying on the ground, howling in pain, and yearning for a quick death. I never really had high hopes for Larrabee; the idea was a pipe dream at best and smacked of insanity at worst. Competing with the likes of ATI and Nvidia in the GPU market just isn't feasible unless Intel poached a ton of talent from its competitors. Apparently, Intel wised up and is sticking with dominating the processor market.
 
It's always good when the plague known as x86 doesn't encroach on any new areas. Why Intel thought an instruction set that was considered poor 32 years ago made sense for a modern GPU escaped a lot of people, especially since backward compatibility wouldn't be such an important consideration there.

They'd be better off going with a clean, efficient instruction set if they ever try to get into that market again. How could they expect to compete with well-established players with that handicap?

I still don't know exactly what "we are focused on processor graphics, and we believe media/HD video and mobile computing are the most important areas to focus on moving forward" means. Does "processor graphics" mean the IGP that now comes with the processor? Does the media/HD video and mobile stuff also mean that, even though discrete cards do those things too? That's got to be it, but it's really not that clear.
 
Does it mean there will be no ray-traced games in the near future? Quite sad if true; I was really looking forward to seeing that, and to the day when programmers find more and more new rendering techniques (and, of course, to reading about them on Tom's Hardware ^^).
 
Intel's Westmere IGP is already better than AMD's IGP and competes well with low-end discrete graphics. So with Sandy Bridge, Intel will probably be much better than AMD's and Nvidia's IGPs and compete with their future low-end parts. Maybe Intel will actually make something that competes with their mid-range.
 
Honestly, they stuck graphics into a CPU that lets me play 1080p video and bitstream TrueHD and DTS-HD MA without needing discrete graphics or a special sound card.

I'm happy :) Larrabee or not.
 
All I have to say is: HAHAHAHAHA. To Intel for talking all that smack before, and to all the people that were doing the same to nVidia and ATI.
 
Intel sucks at graphics. Intel GMA has been terrible, and it looks like it will continue to be. I had no expectations of Larrabee being anything spectacular, but I wasn't expecting it to be such a colossal failure either.
 
That really blows...

I wasn't looking for a gaming card; I was looking for a parallel task cruncher that worked the same way things work now.

This leaves us with CUDA or OpenCL... which is the lesser of two evils? A proprietary format from Nvidia, or an "open" (I call preemptive BS) format from Apple?
 
As a contractor for Intel testing Sandy Bridge, I can't say much at all, but Bill Kircos is indeed not lying when he says integrated graphics will be even better with Sandy Bridge...
 
[citation][nom]LORD_ORION[/nom]That really blows...I wasn't looking for a gaming card, I was looking for a parallel task cruncher that worked the same way things work now.This leaves us with CUDA or OpenCL... which is the lesser of 2 evils? Proprietary format for Nvidia? Or "open" (I call preemptive BS) format by Apple.[/citation]
We have more than that:
Nvidia has CUDA.
ATI has Stream.
Both have OpenCL and DirectCompute.

(Wow, just realized: OpenGL vs. Direct3D, and now OpenCL vs. DirectCompute. Déjà vu.)
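
For anyone wondering what that kind of GPGPU programming actually looks like, here's a minimal CUDA sketch of a vector add. This is just a generic illustration (not anything Larrabee-specific); the OpenCL, Stream, and DirectCompute versions have the same shape, just different setup boilerplate:

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each thread adds one element: the data-parallel model that
// CUDA, Stream, OpenCL, and DirectCompute all share.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers: copy inputs over to the GPU
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    // Copy the result back and check one element
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

Whichever API you pick, the mental model is the same (copy data over, launch a grid of threads, copy results back); the real fight is over tooling and portability.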
 
I was hoping Intel could at least deliver low-to-mid-range performance and release something. Competition is always a good thing for the consumer, and we could have expected Nvidia and ATI to respond with lower prices or more bang-for-the-buck performance. That would have benefited everyone, even those who wouldn't have bought Intel's product.
 
nVidia's boss just jizzed his pants upon reading this news. Hopefully those two companies can go back to playing nice now.
 
This makes sense, with PC gaming somewhat flagging and the real need for it already covered by ATI and Nvidia. I think Intel just realized that this market didn't need a third wheel.
 
[citation][nom]aaron686[/nom]It would be nice to see Intel in the GPU market, at least prices might be more reasonable.[/citation]
More reasonable? From Intel's marketing? Have you seen the price of their platforms? I could build a capable AMD system for the price of an Intel mobo and CPU =\
 