Larrabee: Intel's New GPU

Status
Not open for further replies.
Guest
With the launch of Larrabee only a few months off, here’s a look at the architecture of Intel’s much-talked-about discrete GPU. How does it look on paper, and in what ways is it similar to and different from Nvidia’s and AMD’s GPUs?

 

IzzyCraft

Distinguished
Nov 20, 2008
Yes, interesting, but Intel already makes something like 50% of all GPUs. I'd rather not see them take more market share and push Nvidia and AMD out, though I doubt they will unless they can make a real performer. I have no doubt they can on paper, but with drivers etc. I have my doubts.
 
Guest
I wonder if their aim is to compete for the gamer market and run high-end games?
 

Alien_959

Distinguished
Aug 21, 2008
Very interesting, finally some more information about Intel's upcoming "GPU".
But as I said before, if the drivers aren't good, even the best hardware design is for nothing. I hope Intel invests more on the software side of things; it will be nice to have a third player.
 

Stardude82

Distinguished
Apr 7, 2006
Maybe there is more than a little commonality with the Atom CPUs: in-order execution, Hyper-Threading, low power/small footprint.

Does the dual-core NV330 have the same sort of ring architecture?
 
Guest
"Simultaneous Multithreading (SMT). This technology has just made a comeback in Intel architectures with the Core i7, and is built into the Larrabee processors."

Just thought I'd point out that, with the current AMD vs. Intel fight, if Intel takes away the x86 license, AMD will take its multithreading and HT tech back, leaving Intel without a CPU and with a useless GPU.
 

phantom93

Distinguished
Mar 23, 2007
Damn, I hoped there would be some pictures :(. Looks interesting. I didn't read the full article, but I hope it's cheaper, so some of my friends with regular desktops can join in some Original Hardcore PC Gaming XD.
 

JeanLuc

Distinguished
Oct 21, 2002
Well, I am looking forward to Larrabee, but I'll keep my optimism under wraps until I start seeing some screenshots of Larrabee in action playing real games, i.e. not Intel demos.

I wonder just how compatible Larrabee is going to be with older games?
 

tipoo

Distinguished
May 4, 2006
[citation][nom]IzzyCraft[/nom]Hope for an Omega Drivers equivalent lol?[/citation]


That would be FANTASTIC! Maybe the same people who make the Omega drivers could make alternate Larrabee drivers? We all know Intel's drivers are terrible.
 

armistitiu

Distinguished
Sep 1, 2008
So this is Intel's approach to a GPU: put lots of simple x86 cores in it, add SMT and vector operations, and hope they can do the job of a GPU. IMHO Larrabee will be a complete failure as a GPU, but as a highly parallel x86 CPU this thing could screw AMD's FireStream and NVIDIA's CUDA (OpenCL too), because it's x86, and programming for this kind of architecture is already familiar.
 

wicko

Distinguished
May 9, 2007
[citation][nom]IzzyCraft[/nom]Yes interesting, but intel already makes like 50% of every gpu i rather not see them take more market share and push nvidia and amd out although i doubt it unless they can make a real performer, which i have no doubt on paper they can but with drivers etc i doubt it.[/citation]
Yeah, but that 50% includes all the integrated chips that most consumers don't even realize they're buying; it doesn't include discrete cards. I'd like to see a bit more competition on the discrete side.
 

B-Unit

Distinguished
Oct 13, 2006
[citation][nom]wtfnl[/nom]"Simultaneous Multithreading (SMT). This technology has just made a comeback in Intel architectures with the Core i7, and is built into the Larrabee processors." just thought i'd point out that with the current amd vs intel fight..if intel takes away the x86 licence amd will take its multithreading and ht tech back leaving intel without a cpu and a useless gpu[/citation]

Umm, what makes you think that AMD pioneered multithreading? And Intel doesn't use HyperTransport, so they can't take it away.
 

justaguy

Distinguished
Jul 23, 2001
Now we know what they're trying to do with it. There's still no indication whether it will work or not.

I really don't see the 1st gen being successful. It's not like AMD and Nvidia are goofing around waiting for Intel to join up and show them a real GPU. Although I haven't seen any numbers on this, I'm thinking Larry is going to have a pretty big die to fit all those mini-cores, so it had better perform, because it will cost a decent sum.
 

Pei-chen

Distinguished
Jul 3, 2007
Can't wait for Larrabee; hopefully a single Larrabee can match the performance of a 295. Nvidia and ATI are slacking, since they know they can fix prices and stop coming out with better GPUs, just more cards built on the same old GPUs.
 

decapitor

Distinguished
Jun 18, 2008
Is it likely that one could compile existing C or Fortran code that uses MPI for parallelization on Larrabee, or would memory-addressing issues require some code modification? As it stands, porting existing parallel code to CUDA is more trouble than it's worth for many people (not all, I know...).
 

PrangeWay

Distinguished
Nov 21, 2008
I worry about actual performance. Last summer it was "as fast as the current high-end parts from Nvidia and AMD"; then in the fall, with the 260 and 4870 out, it was "75% of the performance of Nvidia's and AMD's parts". By the time this is out, Nvidia will be on its GTX 300s and AMD on its 5000s, with significant work on the 400s and 6000s already in place. I think Intel is trying to hit a very, very fast-moving target.
 

TheFace

Distinguished
Jul 25, 2008
I wouldn't expect these to compete on a graphics level. They seem aimed more at massively parallel processing, for work like supercomputing and the @home projects. I'm sure they'll have some graphical capability, but I wouldn't expect it to be on the level of AMD/NVIDIA (unless someone actually programmed them "direct to metal").
Graphics is the one area of Intel's business where it is not at the forefront of the field. It has been a thorn in their side, and they will come out with something respectable, but probably not a good rasterization card.
 

velo116

Distinguished
Sep 5, 2008
I agree with the post above (TheFace). I was under the impression this isn't a "gaming" card but something intended more for work like 3D movie production. Someone jokingly asked "but will it play Crysis?", and I think the answer is "no", because it's not designed to.
 