Cnet: "the first Larrabee products will be too slow"



Here:

http://www.dailytech.com/Intel+Sheds+Light+on+Larrabee+Dismisses+NVIDIA+CUDA/article12256.htm

and the response

http://www.dailytech.com/NVIDIA+Clears+Water+Muddied+by+Larrabee/article12585.htm





Anyway, regarding complete/incomplete libraries: I always appreciate new functions being added to any language.

I was more getting at missing basic functionality that would otherwise be available if you were using an x86 CPU instead of a GPU.
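
For instance (a rough sketch from memory, with illustrative names, not an exhaustive list), ordinary C like this ran fine on an x86 CPU but wasn't available in CUDA device code on the GPUs of this era:

#include <stdio.h>
#include <cuda_runtime.h>

__device__ int factorial(int n) {
    /* Recursion: trivial on x86, but unsupported in device code on
       pre-Fermi GPUs (compute capability < 2.0). */
    return (n <= 1) ? 1 : n * factorial(n - 1);
}

__global__ void kernel(int *out) {
    *out = factorial(5);
    /* Likewise missing on the early hardware: printf from a kernel,
       malloc/free inside a kernel, function pointers in device code,
       and (before the GT200 chips) double-precision math. */
}

int main(void) {
    int *d_out, h_out = 0;
    cudaMalloc(&d_out, sizeof(int));
    kernel<<<1, 1>>>(d_out);
    cudaMemcpy(&h_out, d_out, sizeof(int), cudaMemcpyDeviceToHost);
    printf("5! = %d\n", h_out);   /* expect 120 on CC >= 2.0 hardware */
    cudaFree(d_out);
    return 0;
}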
 
Hey, I didn't pin it on you, BM. I was just saying the decline in PC gaming isn't Intel's fault; it's more that game consoles are cheap. And they're usually cheap because they use older hardware that doesn't cost as much.

In my eyes, though, PC gaming is the best.

BTW, Gphoria just gave Halo 3 Game of the Year, and MGS4 somehow beat Crysis for Best Graphics. Just goes to show ya that consoles get the better treatment even when PCs are better.
 


I read them a few days ago.

Quoting DailyTech, the first link (Intel Sheds Light on "Larrabee", Dismisses NVIDIA CUDA):

According to Gelsinger, programmers simply don't have enough time to learn how to program for new architectures like CUDA. Gelsinger told Custom PC, "The problem that we've seen over and over and over again in the computing industry is that there's a cool new idea, and it promises a 10x or 20x performance improvement, but you've just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future."

CUDA isn't a new standard; it's essentially a library. It doesn't impose a whole new programming model, it just adds possibilities. I guess something is really off in the Intel camp. Anybody with a little background in C can pick it up: just #include and use it. The learning curve is small, like with any other library.
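
To put that "#include and use" claim in concrete terms, here is roughly the whole of a first CUDA program (a minimal sketch; the names are mine, not from any real codebase). In fairness to Gelsinger, the __global__ kernel and the <<<grid, block>>> launch are the genuinely new bits; everything else is ordinary C calling library functions:

#include <stdio.h>
#include <cuda_runtime.h>

/* The new part: a kernel that each GPU thread runs on one array element. */
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    /* Plain C from here on: allocate, copy, launch, copy back. */
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);   /* expect 3.0 */

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}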

Pat Gelsinger is quite a respectable fellow (no irony or sarcasm here), but he once stated that AMD64 would never become a standard, and if I recall correctly he announced 10 GHz CPUs back in the day. I hope history repeats itself once more in this case.

Another silly remark by Mr. Gelsinger:

The Sony Cell architecture illustrates the point according to Gelsinger. The Cell architecture promised huge performance gains compared to normal architectures, but the architecture still isn't supported widely by developers.

The Sony Cell architecture thrives on uncompressed, uncached, massive raw data. While it might be great for F@Home and even for gaming, it is completely different from x86, and you can't really expect programmers to be glad about returning to a RISC-style architecture. There is a reason DirectX and Visual Studio are so successful: they are fairly easy to learn and use, and they let you produce decent software in a small time frame.

Mr. Gelsinger has my respect for being a visionary and for being around for so long. But like every other professional, he makes mistakes; dunno why, but sometimes silly ones. I guess we all do.
 
^I have to agree with him on Cell, though. It is not as easy to write for, and I have a good example of this.

VALVe's Gabe Newell stated he did not like the PS3 because of how complicated Cell was to write for. Because of that, VALVe licensed The Orange Box to EA, and EA did a piss-poor job of porting it and hasn't supported it with any updates thus far.

Now in my opinion, if one of the biggest PC game companies does not like it and is not willing to put in the manpower to write for it, that has to mean it's too complicated a standard to pick up.
 
I've always preferred PC gaming to console gaming, but I can understand why consoles are popular with many.

In a value sense, the PS3 is not only a games console but also an entertainment system with a Blu-ray player. I've only played a handful of titles, and for the price of the system the graphics look pretty good.

The flaw with console-based gaming is that I find many of the games can be pretty brain-dead. I would rather pay expensive PC hardware prices and have a range of games that I prefer to play. I don't wish to flame, but PC gaming is more intelligent, in my opinion.
 
According to http://www.overclockers.com/index.p...ntels-industrial-ideology&catid=53:editorials:

"Larrabee no more needs to beat the highest-end nVidia or AMD GPUs in the hottest games when it comes out than the x3100 needs to beat nV/AMD notebook offerings today. What it needs to do is be (profitably) a good deal cheaper for X level of performance than the nV/AMD offerings.

To put it another way, if Intel can provide the same graphics power with Larrabees at half the chip cost of even lower-end nVidia/AMD GPUs, they may not get great reviews, but they'll have a real winner on their hands. Even if the first generation or two or three of these Larrabees aren't so good, so long as they're decent enough to be competitive in the low- to medium end, they can gut nVidia/AMD's lower-end product lines by making them unprofitable. Believe it or not, nV/AMD would go quickly broke if all they could sell was $500 video cards. The graphics industry works on economy of scale, too (though not as much as the CPU industry). High-priced items get the attention and make the profits, but the far larger lower-end sales pay the company's bills. If all nV/AMD could sell were high-end cards, that $500 video card would become a $1,500 card pretty fast, and the companies would price themselves out of existence. "
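
The fixed-cost arithmetic behind that last claim, with hypothetical numbers purely for illustration: if fixed costs F (R&D, tape-outs, drivers) are spread across n units sold, each with marginal cost m, the break-even price per card is roughly

p ≈ m + F / n

With, say, F = $500M and m = $100: at 10 million mostly low-end cards, p ≈ $150; at only 350,000 high-end cards, p ≈ $1,530. That is the quote's $500-card-becomes-$1,500 effect.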

So apparently Ed thinks Laughabee will be cheaper than either AMD's or NVidia's low- to mid-tier offerings.
 


No, he didn't say that.


He said that is all Intel needs it to be for it to count as a justifiable success.
 
Reading the article and judging by the tech specs, I can see Int-hell's Larrabee being a low-end component; Intel may have to enter the budget segment. Problem is, Larrabee could potentially be worse than ATI's and Nvidia's low-end GFX cards. It could be that Intel has only one place where Larrabee will go... the trash can.

Perhaps the author is right when he says it's a

science project that got out of hand

AMD4Life!!
 
^Ladies and gentlemen, his contribution to the discussion: nothing but anti-Intel this and that.

BTW, only BM calls Intel "Int-hell"... your Int-hell remark makes it seem like you are BM.

Either way, he's always a contributing member of the THG society, no?