Larrabee: Intel's New GPU


quantumrand

To me, it sounds like Intel is aiming to give Larrabee the performance of its "next gen" integrated graphics chips, but also the edge of massively parallel computing.

When I hear Larrabee, I think of Intel's 5xxx series with CUDA.
 

ph3412b07

[citation][nom]tipoo[/nom]That would be FANTASTIC! Maybe the same people who make the Omega drivers could make alternate Larrabee drivers? We all know Intel sucks balls at drivers.[/citation]

Well, I wouldn't generalize Intel's drivers as trash; the guy working on Larrabee is Michael Abrash, and he's pretty experienced. Additionally, even though it's a new architecture to code for, it doesn't have the complexity of, say, a PS3 (we all know how piss-poor PS3-to-PC ports are). Intel is going to be able to create decent drivers.

Just a hunch, but I'm thinking Intel's first line is going to be geared towards workstation graphics. The architecture kind of lends itself to vector processing and CAD work, but when it comes to DirectX and 3D game rendering, I doubt it'll be anything close to groundbreaking. While it would be amazing to see an HD 4870 or GTX 295 go up in smoke against Larrabee, that's not a reality. Intel has already hooked the gaming sector with its CPUs.
 

boudy

I wonder how this is going to affect the GPU market. ATI aims at the budget market, and Nvidia aims at the high-end, so where will Larrabee stand?
 

falchard

I think it's a mistake to make the GPU something more than just a GPU. What makes a GPU perform so well at rendering is that it concentrates solely on rendering. Even for the workstation market, what you want is something that can generate polys and generate them en masse.
 

tipoo

[citation][nom]crockdaddy[/nom]I would mention ... "but will it play crysis" but I am not sure how funny that is anymore.[/citation]


Somewhere between as funny as small children dying and puppies being slaughtered.
 

traviso

[citation][nom]IzzyCraft[/nom]..but intel already makes like 50% of every gpu i rather not see them take more market share[/citation]
I'd rather see those 50% of consumers get a better graphics solution than the crap Intel pushes now, which can barely run the latest Sims at the lowest settings.

Gaming on the PC is limited to people who blindly spend a lot or those in the know who build their own box. This is why the PC gaming market is in the gutter (and why so many major companies are leaving it). We need a solution that consumers can just "buy and use," and I think the system rating (Windows Experience Index) in Vista is the most important piece of this; it's just a shame companies don't publish those numbers in their sales material or on their websites.

Consumers need a simplified way to gauge a machine's speed; they have no way to know the difference between an Nvidia 8600 and an ATI 4830, and that's never going to change. But tell them the CPU is a 4.2 and the graphics are a 4.9, and that makes sense.

I've gone off topic here, but if a complex CPU can replace a GPU, you'll see more PCs become gamer-friendly, and that's good for everybody. Who cares who produces the best GPU? As long as it's the best, that's all that matters.

You could have made this same argument about Nvidia or ATI back when 3dfx was king. The only company that deserves the #1 spot is the one providing the best hardware (and drivers).
 

Aerobernardo

[citation][nom]crockdaddy[/nom]I would mention ... "but will it play crysis" but I am not sure how funny that is anymore.[/citation]

Oh man... I was just about to write that, but probably won't. Maybe the Radeon HD 6870 will (since the era of just renaming a product might endure)...
 
Heh, now we will see the same thing on the GPU side as with CPUs: rapid process scaling (90nm - 65nm - 45nm, etc.) allowing easy performance scaling.

Perhaps we will see "Celeron" Larrabees? Perhaps a Larrabee "Extreme Edition"?

I hear that with this design, supporting new stuff like DX11 and so on is just a software (driver) update rather than a hardware update - that could be good!
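
Purely as a toy illustration (hypothetical names, not Intel's actual driver code), the point of a software pipeline is that a "shader" or pipeline stage is just code the x86 cores run, so a new API feature can ship as a driver update:

[code]
/* Toy sketch of a software rasterizer stage in C (hypothetical, not Larrabee's
 * real driver): the per-pixel "shader" is an ordinary function pointer, so
 * supporting a new shading feature means swapping in new code, not new silicon. */
typedef struct { float r, g, b, a; } color_t;

/* A pixel shader is just a function the pipeline calls for each pixel. */
typedef color_t (*pixel_shader_fn)(float u, float v);

/* Example shader: an 8x8 checkerboard pattern. */
static color_t checker_shader(float u, float v)
{
    float c = (((int)(u * 8) + (int)(v * 8)) & 1) ? 1.0f : 0.0f;
    return (color_t){ c, c, c, 1.0f };
}

/* Shade a w x h tile; a "driver update" would plug a different shader or
 * pipeline stage in here. */
static void shade_tile(color_t *out, int w, int h, pixel_shader_fn shader)
{
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            out[y * w + x] = shader((x + 0.5f) / w, (y + 0.5f) / h);
}
[/code]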

Another interesting point - Intel would surely be working on a better-scaling multi-GPU setup - better than the current average of ~30% (not counting 3DMark benchmarks, etc., and that's at twice the GPU cost plus a more expensive motherboard).

Interesting times...
 
Guest
The memory architecture of the Cell is what really makes it shine. It's all about giving control back to the programmer. When doing high-performance optimization you have to second-guess cache behavior, and that is more trouble than programming Cell SPEs. The Cell also forces the programmer to think about main-memory operations and start them early.

The Cell gave us the best of both worlds: a general-purpose CPU (the PPE) for the OS and "normal" software, while leaving the hard, repetitive work to the SPEs. If the PPE had been a better performer (out-of-order, etc.), the Cell would have been great in PCs (from netbooks to workstations).
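
To illustrate what "starting main memory operations early" looks like in practice, here is a rough double-buffering sketch for an SPE, assuming the IBM Cell SDK's MFC intrinsics from spu_mfcio.h (process() and spe_stream() are just made-up stand-ins for whatever the kernel actually does):

[code]
/* Rough sketch (assumes the Cell SDK SPU toolchain): stream data from main
 * memory into the SPE's local store with explicit DMA, double-buffered so the
 * next transfer is started before the current chunk is processed.
 * "process" and "spe_stream" are hypothetical names; ea is assumed to be a
 * 128-byte-aligned effective address in main memory. */
#include <stdint.h>
#include <spu_mfcio.h>

#define CHUNK 4096  /* bytes per DMA transfer (must be <= 16KB) */

static float buf[2][CHUNK / sizeof(float)] __attribute__((aligned(128)));

extern void process(float *data, unsigned count);  /* the actual work */

void spe_stream(uint64_t ea, unsigned chunks)
{
    unsigned cur = 0;

    /* Kick off the first transfer before we need the data. */
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);

    for (unsigned i = 0; i < chunks; i++) {
        unsigned next = cur ^ 1;

        /* Start fetching the next chunk early, overlapping DMA with compute. */
        if (i + 1 < chunks)
            mfc_get(buf[next], ea + (uint64_t)(i + 1) * CHUNK, CHUNK, next, 0, 0);

        /* Block only on the buffer we are about to use. */
        mfc_write_tag_mask(1 << cur);
        mfc_read_tag_status_all();

        process(buf[cur], CHUNK / sizeof(float));
        cur = next;
    }
}
[/code]

No cache to second-guess: the programmer decides exactly what sits in local store and when it arrives.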

I really hoped Larrabee could fill that gap. On one hand they want to make a GPU, for which the x86 and cache crap are useless, unless it is crucial to write drivers in x86 assembly :) On the other hand, the design stinks of threaded programming across too many cores for the average Joe programmer; this will never work!!!

They could start fixing it by:

- dropping the cache-coherence crap, or at least making it possible to turn it off and go the Cell route :)
- dropping x86 if that reduces die size.

Then they'd have a GPU that can accelerate calculations as well (like CUDA). Considering how many Cells could fit into the 1.4 billion transistors of the GT200, this could be a killer.

Adding a modern x86 CPU to the die would make it a wonderful PC processor.
 
Guest
Remember when everybody was all "OMG" at Intel's 80-core teraflop chip? At the time, I said it wasn't much of an accomplishment, because they had to come up with a special benchmark to hit that figure; real-world performance couldn't have come close for a chip with almost no cache, an FSB, and a non-integrated memory controller (of course, the Intel fanboys declared blasphemy). Enter 2009, where this is going to be a real product and the expectations have been watered down considerably. If I dare make a(nother accurate) prediction, I say this will be a pretty good product after three more major revisions, on the 10nm process node. The reason I say that is because this looks to be the shiteness of Atom times 20. (P.S. I also predicted that Atom would suck.)
 
Actually, I do want to know if it can play Crysis, so I'll ask: "but will it play Crysis?" :D Anyway, I'm not looking to get overexcited about Larrabee. Intel hasn't exactly encouraged me, given how their previous promises about integrated GPUs compared with the actual performance (X3100, anyone?). I suppose we'll just have to wait and see what happens.
 

Spathi

If a 1000MHz 24-core Larrabee can play FEAR @ 60fps, I imagine a 2000MHz 48-core would be fine with any game for a while. If they were testing these games years ago, I imagine the drivers may be fine too; plus, the drivers could probably be coded to emulate any GPU or work with any DirectX version, which is the whole point.
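
Back-of-the-envelope, and assuming perfectly linear scaling with both clock speed and core count (optimistic, but that's the assumption), that would be roughly 60 fps x (2000 / 1000) x (48 / 24) ≈ 240 fps.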
 

sonofliberty08

That card looks ugly and old-fashioned; even the GeForce 2 looks better than it. I bet it can't keep up with Nvidia and AMD/ATI cards, for sure; maybe it's just a toy for the fanboys.
 

kato128

I'll believe the performance claims when we can buy units to test. Intel is known for exaggerating how well their graphics chips perform.
 
[citation][nom]wtfnl[/nom]"Simultaneous Multithreading (SMT). This technology has just made a comeback in Intel architectures with the Core i7, and is built into the Larrabee processors." just thought i'd point out that with the current amd vs intel fight..if intel takes away the x86 licence amd will take its multithreading and ht tech back leaving intel without a cpu and a useless gpu[/citation]
HAHA

HT (HyperTransport) and HT (Hyper-Threading) are different things; Hyper-Threading is Intel's own SMT technology, and AMD has never implemented anything like it.

This is the BEST article I have seen in a while. Keep it up.
 

iwod

I think another advantage of Larrabee is that it is multi-core scalable. Compared to Nvidia SLI and ATI CrossFire, which get less than a 60% improvement from two cards, two Larrabee cards could give an 80%+ performance boost.

I can see huge success for Larrabee if Intel decides to make part of its engine open source.
 

Spathi

I doubt they would exaggerate that much.
When you extrapolate it, it looks cool anyway; I can't wait to see what it is really like. If they could have released it last year, they would probably have faced years of court cases for destroying everyone else.

FEAR 1600x1200
 

Spathi

I doubt they would exaggerate that much.
When you extrapolate it, it looks cool anyway; I can't wait to see what it is really like. If they could have released it last year, they would probably have faced years of court cases for destroying everyone else.

sorry my post got cut.. illegal char

Larrabee 2000MHz 48 cores
Min 240fps Average 480fps Max 960fps

ATI HD4870
Min 32fps Average 70fps Ma
 
Guest
It wouldn't surprise me at all if Larrabee is not up to par with AMD's and Nvidia's current-gen offerings.

And as someone else said: neither AMD nor Nvidia is resting on its laurels, so Larrabee will have an extremely hard time facing the R800 and GT300.
 