Intel Kills Off Larrabee Discrete Graphics, For Real

Status
Not open for further replies.

Pei-chen

Distinguished
Jul 3, 2007
Larrabee didn't have to be a GTX 480 or 5870 fighter. A 5770-level video card with a decent price and decent drivers would still have been welcome in the market.
 
Good riddance. They could have at least put out something, even if it SUCKED, at a fair price; that would have been better than this. I've still got an i740 AGP, and it would have been nice to add another discrete Intel GPU to my collection. 12 years and nothing new.
 

HalJordan

Distinguished
Jul 16, 2009
I really thought it was officially dead months ago... I was not aware that Intel considered it merely "shelved." I suppose this announcement equates to putting a bullet into the head of a wounded animal lying on the ground, howling in pain and yearning for a quick death. I never really had high hopes for Larrabee; the idea was a pipe dream at best and smacked of insanity at worst. Competing with the likes of ATI and Nvidia in the GPU market is just not feasible unless Intel bought out a ton of talent from its competitors. Apparently, Intel did wise up and is sticking with dominating the processor market.
 

ta152h

Distinguished
Apr 1, 2009
It's always good when the plague known as x86 doesn't encroach on any new areas. Why Intel thought an instruction set considered poor 32 years ago made sense for a modern GPU escaped a lot of people, especially since backward compatibility wouldn't be such an important consideration there.

They'd be better off going with a clean, efficient instruction set if they ever try to get into that market again. How could they expect to compete with well-established players with that handicap?

I still don't know exactly what "we are focused on processor graphics, and we believe media/HD video and mobile computing are the most important areas to focus on moving forward" means. Does "processor graphics" mean the IGP that now comes with the processor? Does the other stuff also mean that, even though discrete cards handle it too? That's got to be it, but it's not too clear, really.
 

bujuki

Distinguished
Jan 19, 2008
Does this mean there will be no ray-traced games in the near future? Quite sad if true; I was really looking forward to that, and to the day when programmers find more and more new rendering techniques (and, of course, to reading about them on Tom's Hardware ^^).
 

mowston

Distinguished
Jun 12, 2007
Intel's Westmere IGP is already better than AMD's IGP and competes well with low-end discrete graphics. So with Sandy Bridge, Intel will probably be much better than AMD's and Nvidia's IGPs and will compete with their future low-end parts. Maybe Intel will actually make something that competes with their mid-range.
 

invlem

Distinguished
Jan 11, 2008
Honestly, they stuck graphics into a CPU that lets me play 1080p and bitstream TrueHD and DTS-HD MA without the need for discrete graphics or a special sound card.

I'm happy :) Larrabee or not
 

yrmoma

Distinguished
Jan 27, 2009
All I have to say is: HAHAHAHAHA. To Intel for talking all that smack before, and to all the people who were doing the same to Nvidia and ATI.
 

tayb

Distinguished
Jan 22, 2009
Intel sucks at graphics. Intel GMA has been and looks like it will continue to be terrible. I had no expectations of Larrabee being anything spectacular but I was not expecting it to be such a colossal failure either.
 

LORD_ORION

Distinguished
Sep 12, 2007
That really blows...

I wasn't looking for a gaming card, I was looking for a parallel task cruncher that worked the same way things work now.

This leaves us with CUDA or OpenCL... which is the lesser of two evils? A proprietary format from Nvidia, or an "open" (I call preemptive BS) format from Apple?
 

danbfree

Distinguished
Jun 26, 2008
As a contractor for Intel testing Sandy Bridge, I can't say much at all, but Bill Kircos is indeed not lying when he says integrated graphics will be even better with Sandy Bridge...
 

ravewulf

Distinguished
Oct 20, 2008
[citation][nom]LORD_ORION[/nom]That really blows...I wasn't looking for a gaming card, I was looking for a parallel task cruncher that worked the same way things work now.This leaves us with CUDA or OpenCL... which is the lesser of 2 evils? Proprietary format for Nvidia? Or "open" (I call preemptive BS) format by Apple.[/citation]
We have more than that.
Nvidia has CUDA
ATI has Stream
They both have OpenCL and DirectCompute

(wow, just realized OpenGL v Direct3D, and now OpenCL v DirectCompute. Déjà vu)
 

NuclearShadow

Distinguished
Sep 20, 2007
I was hoping Intel could at least provide low-to-mid-level performance and release something at that tier. Competition is always a good thing for the consumer, and we could have expected Nvidia and ATI to respond with lower prices or more bang-for-the-buck performance. This would have been beneficial to everyone, even those who wouldn't have bought Intel's product.
 

Niva

Distinguished
Jul 20, 2006
nVidia's boss just jizzed his pants upon reading this news. Hopefully those two companies can go back to playing nice now.
 
Guest

Guest
This makes sense, with PC gaming kind of flagging and the real need for it covered by ATI and Nvidia. I think Intel just realized this market didn't need a third wheel.
 

rambo117

Distinguished
Jun 25, 2008
[citation][nom]aaron686[/nom]It would be nice to see Intel in the GPU market, at least prices might be more reasonable.[/citation]
More reasonable? From Intel's side of the market? Have you seen the price of their platforms? I could build a capable AMD system for the price of an Intel mobo and CPU =\
 