Intel's 'Larrabee' to Be "Huge"

Status
Not open for further replies.

dman3k

Distinguished
Apr 28, 2009
715
0
18,980
Intel's Larrabee will also be a graphics library rivaling Microsoft's DirectX???

Yes! The monopolies at war! The small guys win!
 

daeros

Distinguished
Jan 29, 2009
34
4
18,535
"...Larrabee may be close to 650mm square die, and to be produced at 45nm. If those measurements are normalized to match Nvidia's GT200 core, then Larrabee would be roughly 971mm squared--hefty indeed."

I'll say; 971mm is ~38 inches. The editing on Toms is certainly not what it used to be.
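For what it's worth, the article's number is recoverable if you assume the normalization is against a 55nm process (e.g. the GT200b die shrink; the article only says "GT200," so the 55nm target is an assumption) and that die area scales with the square of the feature size. A quick sketch:

```python
# Hypothetical reconstruction of the article's normalization.
# Assumptions: area scales with (feature size)^2, and the comparison
# target is a 55nm process (e.g. GT200b) -- neither is stated outright.
def normalize_area(area_mm2, from_nm, to_nm):
    """Scale a die area between process nodes, assuming quadratic scaling."""
    return area_mm2 * (to_nm / from_nm) ** 2

larrabee_45nm = 650  # mm^2, per the article
print(round(normalize_area(larrabee_45nm, 45, 55)))  # -> 971 mm^2
```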
 

td854

Distinguished
Jun 7, 2009
107
0
18,680
[citation][nom]Daeros[/nom]"...Larrabee may be close to 650mm square die, and to be produced at 45nm. If those measurements are normalized to match Nvidia's GT200 core, then Larrabee would be roughly 971mm squared--hefty indeed."I'll say; 971mm is ~38 inches. The editing on Toms is certainly not what it used to be.[/citation]

Wouldn't 971mm squared be roughly 31x31mm...?
 
Guest
[citation][nom]td854[/nom]Wouldn't 971mm squared be roughly 31x31mm...?[/citation]
My thoughts exactly - the readership on Tom's is slipping a bit recently too.
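td854 has it right: 971 is an area in mm², not a length in mm, so the side of a square die of that area is the square root. A one-liner confirms the "roughly 31x31mm" figure:

```python
import math

area_mm2 = 971          # die area from the article, in square millimeters
side_mm = math.sqrt(area_mm2)
print(round(side_mm, 1))  # -> 31.2, i.e. roughly a 31x31mm square
```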
 

apmyhr

Distinguished
Apr 23, 2009
258
0
18,780
[citation][nom]dman3k[/nom]Intel's Larrabee will also be a graphic library rivaling Microsoft's DirectX???Yes! The monopolies at war! The small guys win![/citation]
By "small guys" I assume you're talking about us consumers. How do we win if game developers have to spend more money and time developing their games to work on two competing standards?
 

apmyhr

Distinguished
Apr 23, 2009
258
0
18,780
I think Tom's measurements of the size are based on flawed assumptions. If Larrabee is not released until 2011 (a full 2 years from now), I strongly doubt they will be producing it on a 45nm process. More likely, it will be a 32nm core. I'm not going to try to do the math for fear of being pwned by the next comment, but I'll go ahead and assume that would shrink the chip by a lot.
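The math isn't too scary: under the same ideal quadratic-scaling assumption as above, a 45nm-to-32nm shrink roughly halves the die area. A sketch (idealized; real shrinks rarely scale this cleanly):

```python
# Idealized process-shrink estimate: area scales with (feature size)^2.
# Real-world shrinks scale worse than this, so treat it as a lower bound.
def shrink_factor(from_nm, to_nm):
    return (to_nm / from_nm) ** 2

area_45nm = 650  # mm^2, the article's Larrabee estimate
area_32nm = area_45nm * shrink_factor(45, 32)
print(round(area_32nm))  # -> 329 mm^2, about half the 45nm size
```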
 
Guest
I would just like to say that I was part of that .1% of people who called bullsh1t back when Larrabee was the "Terascale Project," and they claimed to get 1 TFLOP of performance out of a 65W, 200-million-transistor chip... The key to catching these things is to assume that every time Intel makes a ridiculous claim, they are just lying to try to sell a product...
 

Ciuy

Distinguished
Feb 26, 2009
565
0
18,980
hahaha, "To be HUGEEE" Now I get it :)))

anyway, by 2011 we'll have something new and it will be called XCGPU, a 1cm SOI incorporating 60x 495GTX+++ plus 120 8890IceQ9+ in Xfire with 221 i9 Intels and 223 Phenomenom XVI CPUs, and it will be able to play Crysis at a whopping 60+ fps on an anti-matter screen of 80000" diameter. And it only needs 128 SD RAM to work, so it's not a cost burden. Works with a Nokia Power Adapter. Get the optional ThinkPad so you can move everything in Windows 9 without moving a muscle. Also recommend a gun to shoot yourself after buying it cause it's out of this worlddddddddd.

 
32 cores @ 45nm, 600+ mm squared sounds about right. Power shouldn't be a problem, but it depends on how high they crank the GHz.
I'm hearing they'll have trouble getting the drivers to work in all games, meaning a lot of the older games won't work so well.
Doing everything in software may cost them in some games as well, and it'll be interesting to see when their software approach wins and when it just adds a lot of latency.
As for the libraries eliminating DX, etc.: DX itself is moving away from a fixed-hardware scenario, so by then (2011?) the new DX may be totally library/software dependent anyway, which coincides with the article's statements, though I wouldn't give the credit to Intel here.
 
Tom's Bullshit Hardware - same news BS as the Inquirer - who knows what to expect.

They fail to mention here that this baby isn't primarily for gaming, etc. - it's more for scientific/ray-tracing work - a lot of extra horsepower to cram into a system/workstation/server.
 
While this may be true, its CGPU ability may turn out to be huge, though it won't be alone in that; Intel itself is promoting LRB as a rendering device, and it isn't true RT hardware.
To spend all that money and sell to just a few niche markets doesn't make sense, since, as I said, they won't be alone to dominate, nor should we discount Nvidia's and ATI's responses in this particular area.
 
Basically, it's the same argument Nvidia fans use, but in reverse. They'll say it's got PhysX and GPGPU abilities, but what really matters there, and here, is: how well does it play games? That's the market, and there'll be a lot more money to be made there than in GPGPU, for now anyway. This could change in the future as we see fusion happen between CPU and GPU tech, but just like MT, it takes time.
 