Watch Intel's First Demo of Larrabee GPGPU

Status
Not open for further replies.

Guest

Guest
GPGPU Fail. When they drop the development codename, will the marketing name be Intel Failchip?
 

charlesxuma

Distinguished
Apr 16, 2008
25
0
18,530
"keep simple things simple" .......... i'd like to hear more about that...also...was expecting more to it then a real time ray-tracing demo, at least an fps counter on the screen.... i guess thats part of the keep simple things simple campaign.
 
Hmmm, interesting proof of concept, but I would have liked to see applications most people actually use: games such as Crysis, BioShock, or STALKER, or programs such as 3ds Max, Softimage, or ArcMap.
 

tektek

Distinguished
Aug 18, 2006
186
0
18,680
So it can't play Crysis? OK, joking aside, this demo doesn't sell well on the possibilities this could bring, but I think WoW players will love finding cheaper laptops with no dedicated video card that can still play with more detail. Heavy gamers... not the time, not the place... YET!
 

tntom

Distinguished
Sep 1, 2001
356
0
18,780
What kind of power consumption are we looking at? I'm sure this will never compete with high-end GPUs performance-wise, but I would like to see a performance-per-watt comparison.
 

charlesxuma

Distinguished
Apr 16, 2008
25
0
18,530
[citation][nom]tektek[/nom]So it can't play Crysis? OK, joking aside, this demo doesn't sell well on the possibilities this could bring, but I think WoW players will love finding cheaper laptops with no dedicated video card that can still play with more detail. Heavy gamers... not the time, not the place... YET![/citation]

Who said anything about cheaper? To me this looks like it's gonna be more expensive, plus I think they'd probably sell most builds, if not all, with discrete graphics only. However, this does depend on how well Larrabee actually benchmarks. "Give time, time."
 

Guest

Guest
@tntom: I could be wrong, but I believe prior reports indicated a significantly higher TDP than similarly performing GPUs (which is to say, the top GPUs from a couple of generations ago).
 

WheelsOfConfusion

Distinguished
Aug 18, 2008
705
0
18,980
The demo was for real-time ray tracing, not standard rasterization. RTRT is a pretty intensive task; that's why most people choose the raster route.
Of course, Larrabee will have to do rasterizing too, regardless of whether or not RTRT makes any headway.
 
So... Larafail ain't so fail after all. That's good to know.

If I heard the video correctly, they re-rendered a scene (map?) using ray tracing alone for the lights and reflections, adding another pass for it; that's pretty impressive. Too bad the FPS is too low for a *gamer* to care. It has some impressive rendering capabilities though; I hope Intel puts in more juice for gamers to care.
 

gaevs

Distinguished
Apr 27, 2009
59
0
18,630
Actually, that's not bad for real-time animation rendering. I'm thinking of movies and short films: render times are huge even with a lot of networked computers, so if you could use one or two of these, that would shorten render time to days instead of months.
 

gaevs

Distinguished
Apr 27, 2009
59
0
18,630
And as it uses x86 instructions, network renderers could recognize it as just another multicore CPU, with little extra coding.
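
To illustrate the point (a toy sketch of my own; render_tile() and the tiling scheme are made up, and a real network renderer is far more involved): a dispatcher that treats every x86 core alike can just hand frame tiles to a process pool, so a many-core x86 chip would simply show up as more workers.

[code]
# Toy sketch: tile-based frame rendering farmed out across x86 cores.
# Hypothetical illustration only -- render_tile() is invented; a real
# network renderer (for 3ds Max, Softimage, etc.) is far more complex.
from multiprocessing import Pool

TILE = 64                    # tile edge in pixels
WIDTH, HEIGHT = 1920, 1080

def render_tile(origin):
    """Pretend to render one tile; returns (origin, pixel data)."""
    x0, y0 = origin
    pixels = [[(x0 + x) ^ (y0 + y) for x in range(TILE)] for y in range(TILE)]
    return origin, pixels

if __name__ == "__main__":
    tiles = [(x, y) for y in range(0, HEIGHT, TILE) for x in range(0, WIDTH, TILE)]
    # Pool() sizes itself to the machine's core count, so a many-core
    # x86 part just looks like more workers -- no new code needed.
    with Pool() as pool:
        frame = dict(pool.map(render_tile, tiles))
    print(f"rendered {len(frame)} tiles")
[/code]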
 

eklipz330

Distinguished
Jul 7, 2008
3,033
19
20,795
You know what, despite Intel holding the majority of the market share, they still seem to be putting a whole lot into R&D, and even if they do mess up, they've been leading for quite a while, and unlike Nvidia, they haven't been slacking... I wish them the best.

I mean, they've already announced 22nm for 2011; that's really impressive.
 

tipoo

Distinguished
May 4, 2006
1,183
0
19,280
It's doing way more than 50 Xeons were a few years back, and you people aren't impressed? That's probably because you don't understand what you're seeing and what it entails for the future.

Rasterisation is great, but for effects like shadows and mirrors it's a real mess. Processing requirements also scale linearly with scene complexity, hence these ridiculous graphics cards. Rasterisation just lacks the realism of ray tracing. Developers have to do so much work with rasterisation to get all those nice effects, and they still can't do every surface.

Ray tracing engines will change all this: the developer will simply define transparency, reflectivity, etc., rather than having to create those effects by hand.

Essentially, ray tracing is a more physics-like approach, treating the simulation the way light actually behaves in the real world.
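
To make that concrete, here's a minimal toy sketch (mine, not Intel's demo code; the scene, material values, and shading are all made up): one sphere, one light, and a single recursive reflection bounce. Notice that reflectivity is just a material number the trace loop consumes; the engine bounces rays instead of the artist faking mirrors.

[code]
# Minimal toy ray tracer: one sphere, one light, recursive reflection.
# Illustration only -- real engines add acceleration structures, BRDFs,
# antialiasing, and much more.
import math

def sphere_hit(origin, direction, center, radius):
    """Return distance to the nearest sphere intersection, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c              # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-4 else None

def trace(origin, direction, depth=0):
    """Shade a ray: diffuse light plus a recursive reflection bounce."""
    center, radius, reflectivity = (0.0, 0.0, -3.0), 1.0, 0.5
    t = sphere_hit(origin, direction, center, radius)
    if t is None or depth > 2:
        return 0.1                                    # background level
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    light_dir = (0.577, 0.577, 0.577)                 # toward a fixed light
    diffuse = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    # The artist only set `reflectivity`; the bounce itself is automatic.
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    refl_dir = [d - 2 * d_dot_n * n for d, n in zip(direction, normal)]
    return (1 - reflectivity) * diffuse + reflectivity * trace(hit, refl_dir, depth + 1)

print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)))       # shade one ray down -z
[/code]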




Besides, Intel made no attempt to promote this as a high-end/enthusiast product, so why the presumption that it will be? It won't. They are aiming for the mainstream market, NOT the top end. (Don't be surprised when ATI and Nvidia own Intel on the performance side in 2010... in fact, I know they will.)

Don't get me wrong, AMD's 5870 demo with Crysis on Eyefinity was much more impressive to me. That, and the fact that AMD's card hits 2.72 teraflops on a single chip, while Larrabee is targeting a measly 1 teraflop.
 

Guest

Guest
eklipz330: I hate Nvidia, but they're doing a far better job than Intel is... Larrabee is crap compared to a first-gen 90nm (or whatever) 8800-series GPU; they could rebrand those for 5 more generations and still be better than Larrabee. Larrabee is the Titanic of the digital era.

Although, if you compare the TFLOPS of Larrabee vs. the GFLOPS of Nehalem, either every single Larrabee core is just as fast as a Nehalem core (at a lower clock speed), or Intel is completely full of shit. Take your pick.
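
For what it's worth, a back-of-envelope check with *assumed* specs (Intel never confirmed Larrabee's core count or clock; the numbers below are the commonly rumored ones) suggests the 1 TFLOP target can come entirely from vector width rather than per-core speed:

[code]
# Back-of-envelope peak single-precision throughput, using ASSUMED specs.
def peak_gflops(cores, simd_width, flops_per_lane_per_cycle, ghz):
    return cores * simd_width * flops_per_lane_per_cycle * ghz

# Rumored Larrabee: 32 cores, 16-wide vectors, fused multiply-add, ~1 GHz.
larrabee = peak_gflops(32, 16, 2, 1.0)   # ~1024 GFLOPS ~= the 1 TFLOP target
# Nehalem quad-core: 4 cores, 4-wide SSE, separate add+mul ports, ~3 GHz.
nehalem = peak_gflops(4, 4, 2, 3.0)      # ~96 GFLOPS
print(larrabee, nehalem)
[/code]

So neither extreme of the "take your pick" has to hold: under these assumptions each Larrabee core is slower than a Nehalem core, but there are far more of them and each vector unit is four times as wide.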
 

chaosgs

Distinguished
Sep 9, 2006
823
0
19,010
He had his hands in his pockets during a demonstration. Way to go. Shoulda paid me the six figures he got to do that demo; at least I WOULD/COULD do it professionally and not talk like a robot.
 