Intel's Larrabee Delayed Indefinitely

[citation][nom]tipoo[/nom]Oh well. Maybe the considerable R&D funds they threw into Larrabee will at least make future IGPs suck less.[/citation]
The problem is, as Nvidia has found out chasing ATI's HD 5000 series:

If you ain't first, you're nowhere!
 
[citation][nom]back_by_demand[/nom]Epic, epic FAIL. Intel, why don't you stick to your corner making CPUs and leave the graphics business to the grown-ups.[/citation]

To be fair, Intel also knows how to make solid-state drives, not just x86 processors.

But they have always been useless at dishing out even near-decent graphics cards.
 
As an ex-Intel employee I can sum this up quite easily....

Too many 20-year company people who get handed projects beyond their talents/capabilities
Not enough new cutting edge people being brought in to drive the advancement of technology
An attitude that purchasing other companies will allow them to leverage new tech (which has worked for them before)
Bureaucracy that would make any government blush

The fail here was human in nature; the tech had merit, but any team built in the Intel way is often doomed to fail. I am happy to be elsewhere... :)
 
oh, they're prolly just dumping their resources into the processor architecture and holding off on integrated graphics on-die due to AMD announcing the same thing, basically
 
I guess Intel needs at least two years to copy the ATI architecture from AMD via the cross-licensing agreement, just like they copied the K8 architecture to make the Core 2, and the Phenom to make the i7.
 
Larrabee is most likely targeted at the mass market rather than at gamers/enthusiasts. That's why I'm not worried that much.
 
IGPs as separate chips are going to be a thing of the past. An on-die GPU that also handles floating-point math looks to be the future direction, as evidenced by AMD's Fusion. Larrabee doesn't fit that scenario, so it makes sense to shelve it in favor of a Fusion-like solution. Once both Intel and AMD get on-die GPUs up and running, Nvidia is finished in the IGP space.

Nevertheless, Larrabee won't be a total loss. Many Larrabee concepts will carry over to multi-core embedded controllers, for things like MLC flash controllers, RAID controllers, etc., that can make great use of many x86 integer cores. The same Larrabee-like silicon could be made into any one of several types of embedded controllers with just firmware changes. This is much cheaper to manufacture than individual silicon for each controller type.
 
Dude, can you tone down your accent? I can hardly make out the words, lol! What same thing did AMD announce, basically?
 
Wondering about AMD's Fusion?

I hope it is still on its way. Maybe this time AMD will (finally) have something that makes Intel play catch-up, which I have not seen for years.
 
Intel still has the largest market share in graphics, but that's only because most PCs in the world are just basic office PCs with integrated graphics.

It was mentioned before here on Tom's that Larrabee was barely on par with current graphics cards. So the only surprise here is that it took them this long to realize they will need to aim higher, so that Larrabee can compete with whatever graphics solutions are available when it is finally released.

 
Larrabee was never designed with the intent of competing with high-end cards. As soon as Larrabee hit the wires three years ago, the Intel i740 ghost woke up and started annoying gamers again. If you remember, Intel tried this same thing about 11 years ago and got their asses handed to them by Nvidia. I guess they wanted to quit for now, or else epic-fail history repeats itself. I give them credit for making integrated graphics affordable, and it certainly has value for the casual user.
 
Not surprised. While I admit that the ideas behind Larrabee are interesting, I always felt it was impractical to use old Pentium cores as shaders and whatnot. Sure, it opens up some interesting computing opportunities, which is why it's likely to be a server and workstation curiosity, but as a GPU it would be too complex, with all the extra transistors needed to make it fully programmable, if it were to be competitive with the likes of the Radeon 5870, or even the 5770. As NVIDIA is finding out, complex GPUs don't have very good yields 😀. ATI already learned this painful lesson with the 2900.
 
People seem to be missing the point that this is a delay, not an end. Intel is still committed to it.

I'm not crazy about x86 plaguing video cards too though, so I have mixed feelings.

Nvidia should go belly up soon, and if Intel doesn't compete with ATI, we'll have only one player. It's really better to have these two banging heads. With Nvidia, there are too many problems with their products; with Intel and AMD, you can trust what you buy. So I hope Intel can make a competitive product.
 
In other news, I guess this pretty much finally squashes the PS fanboys' persistently regurgitated rumors about Sony using Larrabee for the PS4...


Unless, of course, Sony actually intends the launch date of the PS4 to be sometime in 2015...
 
[citation][nom]mindless728[/nom]This is unfortunate; I kind of wanted to use a GPU with an x86 instruction set, though I wouldn't use it for gaming, just programming[/citation]

We will... Nvidia's "Fermi", or whatever name it has now...
 
Nvidia should go belly up soon? Uh, I hate to break the bad news to you, buddy, but they already did that this morning. Upon hearing it, they fell to the floor, rolling on their backs laughing.
 
I think they can't compete. If they can't get a new and improved architecture done in six months or less, they'll never keep up with the release cycles of AMD and Nvidia, and they'll flat out fail at hardware sales and adoption rates. 😛
 