Watch Intel's First Demo of Larrabee GPGPU


enewmen

Distinguished
Mar 6, 2005
2,249
5
19,815
[citation][nom]Larrabee_Core_iFail_inside[/nom]I don't know about you guys, but I found AMD's year-old demo of raytracing to be far more spectacular:http://www.youtube.com/watch?v=7fz [...] ature=fvwpIntel's demo just looked pathetic. They are GPU-cursed, it's unreal...[/citation]
The demo looks good, but I didn't see anything saying the demo is actually real-time ray tracing.
But yes, I agree, the Intel demo was more like a proof of concept. I also think that as people start seeing more real-time ray tracing, photo-realistic rendering, and general-purpose processing, they'll appreciate it more.
 
Guest
enewmen: Feel free to google it and prove me wrong, but I'm 95% certain that was done in real-time, if I remember correctly.
 

enewmen

Distinguished
Mar 6, 2005
2,249
5
19,815
I'm sure it was done in real time; it's just not ray-traced. The technique the Ruby demo used looks like "scanline rendering", and there is a BIG difference (rough sketch below).
http://en.wikipedia.org/wiki/Rendering_(computer_graphics)
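Roughly, the difference looks like this. This is a simplified sketch only; the scene/camera objects are hypothetical placeholders, not code from either demo:

[code]
# Ray tracing: loop over pixels, cast rays into the scene, recurse for
# shadows/reflections. Scanline/rasterization: loop over triangles, project
# them to the screen, resolve visibility with a z-buffer.

def ray_trace(scene, camera, width, height):
    image = [[(0, 0, 0)] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            ray = camera.primary_ray(x, y)            # one ray per pixel
            hit = scene.closest_intersection(ray)     # search the whole scene
            if hit is not None:
                # secondary rays (shadows, reflections) would recurse here
                image[y][x] = hit.shade()
    return image

def scanline_render(scene, camera, width, height):
    image = [[(0, 0, 0)] * width for _ in range(height)]
    zbuf = [[float("inf")] * width for _ in range(height)]
    for tri in scene.triangles:
        for x, y, z, color in camera.rasterize(tri):  # pixels the triangle covers
            if z < zbuf[y][x]:                        # nearest surface wins
                zbuf[y][x] = z
                image[y][x] = color                   # lighting comes from shaders,
                                                      # not from traced rays
    return image
[/code]

The ray tracer's cost grows with rays times scene complexity, plus every bounce, which is why getting it to "real time" is the hard part.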
 

jacke

Distinguished
Sep 29, 2008
9
0
18,510
Looks really interesting. I've heard some comments from Nvidia about Larrabee, though they weren't so nice.

Jaak Dell
 

iocedmyself

Distinguished
Jul 28, 2006
83
0
18,630
ATI already showed that the 4870 was more than capable of real-time ray tracing, and given that the 5870 raises the compute power from about 1.2 teraflops to 2.72 teraflops, gee, I guess it's far-fetched that it could keep up.

As I've said in many posts on many forums, Larrabee just doesn't have the compute power to match up. It would need around 48 cores at 2 GHz per core just to equal the processing power of a 5870. And when their 24-core version was already drawing upwards of 300 W, while the 5870 draws a max of around 180 W, that doesn't seem like much of an incentive.
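For what it's worth, here's the back-of-the-envelope math, assuming the 16-wide vector unit with multiply-add that Intel has described for Larrabee; the 5870 figure is its quoted single-precision peak:

[code]
# Theoretical peak single-precision throughput only.
flops_per_clock_per_core = 16 * 2          # 16 SIMD lanes x (multiply + add)
clock_ghz = 2.0
hd5870_tflops = 2.72                       # 1600 ALUs x 2 ops x 0.85 GHz

def larrabee_tflops(cores):
    return cores * clock_ghz * flops_per_clock_per_core / 1000.0

print(larrabee_tflops(24))                 # ~1.54 TFLOPS
print(larrabee_tflops(32))                 # ~2.05 TFLOPS
print(larrabee_tflops(48))                 # ~3.07 TFLOPS
print(hd5870_tflops * 1000 / (clock_ghz * flops_per_clock_per_core))
                                           # ~42.5 cores to match, on paper
[/code]

Peak numbers on both sides, of course; actual game performance is a different question.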

So what if Intel has been pouring obscene amounts of money into R&D? It's far from the first time. After nearly $20 billion, that's $20,000,000,000, spent over a decade, they ended up with the Itanium. It sold 8,000 units and is THE failure of the hardware world.

Timna, whose development began in '97 and which was announced in '99, was to be the first CPU with an on-die GPU and IMC. It was launched, recalled, redesigned, and ultimately left unresolved and unpursued. Intel failed to succeed in that endeavor until they were able to arrange a cross-license agreement with AMD to use AMD's design for 64-bit code, the IMC, and the HT bus.

It's a niche project in their attempt to have their hand in every aspect of the computer platform. Given that it's Intel, Larrabee will probably cost 10x what the performance is actually worth.
 

speedemon

Distinguished
Mar 29, 2006
200
0
18,680
He said helicopters. How many times do we have to tell you noobs? It's an ANANSI. I think I saw a Bumblebee and a Tormentor in the background too.
 
Guest
Check out the original terascale article and the specs provided:

http://www.tomshardware.com/news/intel-80-core,4306.html


So, can we agree that Intel lies? 0.95 volts at 3.16 GHz across 80 cores, with only 100 million transistors, for the original Terascale.

Look at the (estimated) Larrabee today: 32 cores @ 2.0 GHz with around 2 billion transistors and supposedly 2.0 teraflops. But when you compare to Nehalem, that's 62.5 GFLOPS per core, or nearly what Nehalem delivers. Since they are essentially Atom-class cores, that means that somehow there's a 30x performance gain per core.
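Running the per-core arithmetic from those quoted specs, and assuming the 16-wide vector unit with multiply-add that Intel has described (which is where the per-core peak would have to come from):

[code]
# Per-core peak implied by the quoted specs; theoretical numbers only.
total_tflops = 2.0
cores = 32
print(total_tflops * 1000 / cores)     # 62.5 GFLOPS per core

# A 16-lane vector unit doing a multiply-add every clock at 2 GHz:
print(16 * 2 * 2.0)                    # 64 GFLOPS peak per core -- the vector
                                       # width, not scalar speed, carries it
[/code]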
 

annymmo

Distinguished
Apr 7, 2009
351
3
18,785
[citation][nom]Yuka[/nom]So... Larafail ain't so fail after all. That's good to know.If i heard correctly on the video, they re-rendered a scene (map?) using Raytracing alone for lights and reflections adding another process for it, that's pretty impressive... Too bad it has so low FPS for a *gamer* to care. It has some impressive capabilities for rendering though, hope Intel puts more juice for gamers to care.[/citation]

Those lights and reflections can be baked for simplified game models (see the sketch below).
For making games and 3D content, this is a very interesting development.
Such a shame they went with x86 and not x64.
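A minimal sketch of what baking means here. The mesh sampling and trace_lighting() calls are hypothetical stand-ins for an offline ray-traced lighting pass, not anything Intel showed:

[code]
# Bake expensive (e.g. ray-traced) lighting into a lightmap offline,
# so the game just samples a texture per pixel at runtime.

def bake_lightmap(mesh, lights, resolution, trace_lighting):
    lightmap = [[(0.0, 0.0, 0.0)] * resolution for _ in range(resolution)]
    for v in range(resolution):
        for u in range(resolution):
            # Map the texel back onto the model's surface (hypothetical helper).
            point = mesh.sample_surface(u / resolution, v / resolution)
            if point is None:
                continue                    # texel covers no geometry
            # The expensive step runs once here, offline, not every frame.
            lightmap[v][u] = trace_lighting(point.position, point.normal, lights)
    return lightmap
[/code]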
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
Real-time ray tracing is simply not easy. Ray tracing in general is not easy (and I mean easy on the hardware), so seeing anything even remotely "real time" is amazing. It was not that long ago that "real time" and "ray tracing" were thought to be mutually exclusive...

...as far as the demo goes, that was really cool, regardless of how it "looked" subjectively.
 

nachowarrior

Distinguished
May 28, 2007
885
0
18,980
The funny part is, no game dev currently lists this as a supported GPU... why would I buy one? I wouldn't, and I won't. Just before the release of the 48xx series of cards, Intel had some ridiculous amount of performance ground to make up before they even came close to catching up... now the HD 5870 is out, blowing most cards out of the water, and Intel still has no product, leaving yet another 1.5x leap in performance that they have to challenge with a nonexistent product. I think Intel is going to fail big time with their "GPU", simply because it's something they have no idea how to do. Their graphics now are soooo awesome... I have confidence that they will do well! Hahahaha /sarcasm.
 