Larrabee: Intel's New GPU


drealar

Distinguished
Sep 12, 2008
38
0
18,530
Very, very interesting. Looks like the war of integrated GPUs will keep heating up with one more contender. I wonder if we'll see some impact in the gaming market, since most consumers with integrated GPUs prefer MMOs and evergreen (can't think of a better word :p) games like Warcraft 3, Quake 3, and emulators (SNES, anyone?). I don't have the figures, but don't integrated GPUs have a bigger market share worldwide? Correct me, please.

-=GPU_NewBiE=-
 
Intel still has the biggest graphics market share today by far, yes - around 50% of the total market, I believe. That is because of all the OEM systems using Intel chipsets with Intel IGPs - anything from Intel Atom netbooks to Dell towers.

This also means 50% of the world may not know what "performance" means....
 

harshahorizon

Distinguished
Apr 17, 2008
52
0
18,630
Intel's Larrabee is going to completely revolutionize 3D rendering, and with their massive funding and the knowledge they've built up over 35 years in the field, eventually they're going to succeed. Don't worry about drivers - Intel will invest millions in software optimization. So in the future Intel will dominate the gaming world like NVIDIA does today. Intel completely destroyed AMD two and a half years ago with the introduction of Core 2 Duo, and this time they're going to wipe out NVIDIA and ATI (AMD).

I'm a big fan of NVIDIA, so it's kind of sad.

By the way, another great article from Tom's. Thanks.
 

justaguy

Distinguished
Jul 23, 2001
247
0
18,680
Holy moly, man! If you're not going to waste your time with the shift key, at least use the space bar after a period.

Your analysis couldn't be more inaccurate either. And to the guy who keeps throwing out 2,000 MHz, 48-core numbers:
a) They're completely made up.
b) Even if Intel could build something like that, do you have any idea what it would cost? A lot. A lot more than a lot.
 
G

Guest

Guest
CELL was ahead of its time. It's been a real product in consumer hands since 2006, with developers having hands-on access since 2005 (and in this article it's being compared to a late 2009/2010 product). I'm glad Intel is making new progress building on that success.
 

hannibal

Distinguished
Well, Larrabee seems to be like the ATI approach taken to the power of ten. When an application has been written for it, it can be really fast! But when there is only a generic code path (read: NVIDIA-oriented), it's not so impressive.
It's very programmable, because it's actually a CPU, so we can expect to see some situations where it will be blindingly fast and some where other solutions are much better (like NVIDIA versus ATI today, but on an even bigger scale...).

But when we see real products, we will see how it really ticks.
 

vouslavous

Distinguished
Jul 6, 2008
175
0
18,680
If they make some type of GPU that will run any game out there at decent settings, normal people aren't going to upgrade to a high-end GPU. Most average consumers don't look for upgrades; they just want something that can run the games or applications out there. If Intel's GPU is like this, they will take a large percentage of the market from NVIDIA and AMD.
 

backlashwave

Distinguished
Nov 24, 2005
7
0
18,510
I hope Intel fails miserably with its one-solution-for-all (x86) approach, though the author sounds very pro-Intel. There are many things that are better only on paper; I hope Larrabee is one of them. Why did AMD not think of such a solution?
 

cruiseoveride

Distinguished
Sep 16, 2006
847
0
18,980
[citation][nom]IzzyCraft[/nom]Yes, interesting, but Intel already makes something like 50% of all GPUs; I'd rather not see them take more market share and push NVIDIA and AMD out. Although I doubt they will, unless they can make a real performer - which I have no doubt they can on paper, but with drivers etc. I doubt it.[/citation]

Intel drivers are pretty good, actually. Competition is healthy. I'm sure NVIDIA and AMD will hang around.
 

offy

Distinguished
Mar 25, 2009
3
0
18,510
Larrabee, as expected of Intel, will be expensive, even when its first batch of production chips comes out. I don't see Intel getting anywhere with them until one or two years later, when prices come down and Larrabee 2 comes out with, I bet, a smaller die and more cores.

I also don't think the chip can boot an OS on its own. If Larrabee does end up somewhere on the motherboard, there will have to be some dual arrangement:
either Larrabee on startup will by default run a core or two as a CPU to get the BIOS started, or some separate chip will act as a CPU just to get Larrabee started.

I also read somewhere that Larrabee may come on a card, like one of their old Pentiums did way back. If that's so, then they could just be turning into GPU-like cards, and it does make sense if they do that: instead of just making CPUs, they would open up the mass market that GPU cards have.
 

Spathi

Distinguished
Jan 31, 2008
75
0
18,630
RE: being expensive.

Probably true enough for a few months or maybe even a year, but in a global recession people lose jobs, so they can't claim things on tax or afford as much, and that forces prices down quickly.
 

zodiacfml

Distinguished
Oct 2, 2008
1,228
26
19,310
This is what I have to say too:
we should expect that the GPU won't be that good, and will only perform like NVIDIA's or AMD's future entry/mainstream-level cards.
Yet this will prevent NVIDIA from dominating CUDA/GPU-compute applications.


[citation][nom]armistitiu[/nom]So this is Intel's approach to a GPU... we put lots of simple x86 cores in it, add SMT and vector operations, and hope that they will do the job of a GPU. IMHO Larrabee will be a complete failure as a GPU, but as a highly parallel x86 CPU this thing could screw AMD's FireStream and NVIDIA's CUDA (OpenCL too), because it's x86 and programming for that kind of architecture is pretty familiar.[/citation]
 

Spathi

Distinguished
Jan 31, 2008
75
0
18,630
AMD and NVIDIA are sometimes 5x slower than their specs because they have things like 5-wide MADD units. They also have unshared memory, so they are forced to duplicate objects in memory, which hurts real-world performance against their specs even further. Not many people understand this; when Larrabee comes out, more people might.

"So this is Intel's approach to a GPU" - it is simple and removes many of these problems GPUs have. I have argued that a high-end Larrabee could be 7.5x faster than current cards... even if it were only 2x faster it would own. I also think it could be more than 35x faster at moving a 2D bitmap. I don't see the point of condemning something before it even exists in the market. The problem is that these days the net is full of people eager to bag anything new.
 

faresdani

Distinguished
Feb 26, 2009
11
0
18,510
It's a very interesting card.
But if Intel chooses this path, will the others stay with the GPU?
NVIDIA has no other choice.
AMD/ATI have more options, because AMD already makes CPUs and can implement the same thing as Larrabee, and they also have one of the most powerful GPUs, made by ATI.
Why do you think AMD/ATI can't do the same, or better: implement several x86 processors and a GPU on a single chip?
Competition between all three will be fierce.
But from what I see, NVIDIA is stuck with GPUs and Intel is stuck with CPUs, while AMD has both, which makes them the most flexible in the market for attempting a true fusion. Not the Fusion they have talked about - I'm talking about a true GPU/CPU fusion. The way AMD describes the upcoming Fusion, it puts a GPU and a CPU core on a single chip, but they will run exactly as they do when separate. I'm talking about a Larrabee-style architecture with a GPU at its heart. That's what I would call the ultimate fusion design.
What do all of you think about what I said?
 

Spathi

Distinguished
Jan 31, 2008
75
0
18,630
1. I have a 5-year-old Intel GPU in my PC at work and it is more than 10x faster at moving 2D than my new ATI card at home, and it plays some sorts of games better. Different markets need different things. Maybe Intel did fail in a noisy attempt at entering the 3D market in the past, as people say, but I never noticed.

2a. You will still be able to buy a cheaper alternative; I doubt many people will look hard even if it is way faster. It is just a cycle... if someone goes broke, someone else will take their place with something better or cheaper - this has been happening since the dawn of man. NVIDIA and ATI were selling rubbish years ago; they would have been a third or fourth choice for most people, until NVIDIA flooded the market with cheap, slow cards, took over 3dfx, and then released a few good cards every now and again. Before 3dfx there was Tseng Labs, who went broke and got sold to ATI, and ATI got sold to AMD... see the pattern.

2b. Even if it is a better GPU, as you point out most people are sheep, and by the time they realize Intel is OK, NVIDIA or ATI will probably pass them again for a while.

3. I am not trying to be a fanboy; hell, if it is crap I ain't getting it. ;oP For me it is Larrabee from "Get Smart", but who the hell cares - it is a GPU, not a T-shirt!

4. I would pay $1000, but yeah, I might see if I could sniff it out for less. They will just set the price to create the highest return.

I am sure one of them will... maybe both... maybe one will become an Intel reseller selling a modified version, maybe Intel will tank - who knows.
 

Spathi

Distinguished
Jan 31, 2008
75
0
18,630
faresdani, Larrabee is a GPU; if it were to be used as a CPU, Intel would need to make a radically new chipset.

NVIDIA and ATI currently make "fixed-function" GPUs, meaning every time DirectX changes you need a new GPU to use the new features properly.

Larrabee has real cores, so it can do complex calculations without tying up five stream processors for one calculation. Each core is hyper-threaded. Each core can be programmed individually, so you could have a whirlpool of leaves traveling in different directions, independent of each other (i.e. realistic). It has shared memory, so it only has to store everything once. It scales 1:1, so as you increase cores the speed increases linearly. The cores can turn themselves on and off to save power. It will be on a card, and you will probably be able to get a motherboard with Intel's version of a HyperTransport bus to join it to the CPU.

Like people say, it could be a lot of hot air, but if you think about it logically, computers almost always get better. I think Intel saw the others locking us into fixed-function GPUs, which don't really get better because stuff is sacrificed... so they are taking a swing.
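
To picture the "program each core individually" point, here is a minimal sketch using plain C++ threads as stand-ins for cores - nothing Larrabee-specific and not any real Intel API, just the general idea of independent per-core work:

[code]
// Minimal sketch of "one independent program per core": each worker updates
// its own group of leaves with its own direction and speed, no lock-step.
// Plain C++ threads standing in for cores; not an actual Larrabee API.
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Leaf { float x, y; };

void update_leaves(std::vector<Leaf>& leaves, float dx, float dy, int steps) {
    for (int s = 0; s < steps; ++s)
        for (Leaf& l : leaves) { l.x += dx; l.y += dy; }  // each group drifts on its own path
}

int main() {
    const int cores = 4;  // pretend core count
    std::vector<std::vector<Leaf>> groups(cores, std::vector<Leaf>(1000));

    std::vector<std::thread> workers;
    for (int c = 0; c < cores; ++c)
        // every "core" gets its own direction, so the groups move independently
        workers.emplace_back(update_leaves, std::ref(groups[c]),
                             0.1f * (c + 1), -0.05f * c, 100);

    for (auto& w : workers) w.join();
    std::printf("group 0, leaf 0 ended at (%.1f, %.1f)\n",
                groups[0][0].x, groups[0][0].y);
    return 0;
}
[/code]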
 
G

Guest

Guest
Will those processors run DirectX API calls, or will they need some new API? I'm curious, because I'd think DirectX or OpenGL wouldn't run out of the box on this architecture.
 

Spathi

Distinguished
Jan 31, 2008
75
0
18,630
Everything except the texture units is programmable, i.e. software released as drivers by Intel. These drivers will implement DirectX API calls to do fancy stuff like fog, lens flares, particle systems, lightning, shadows, clouds, fire, reflections, water, and dirt.

BUT... developers could do it themselves, or Intel could provide you with options like "pick the water you like", "alter the particle system behavior", "change gravity" (I don't know if they will, but it would be possible).
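
Just to make "fog implemented in driver software" concrete, here is a toy per-pixel routine of the kind a software renderer could expose as a tweakable option - a textbook exponential fog blend, not anything taken from Intel's actual drivers, with made-up constants:

[code]
// Toy software fog: blends a pixel's colour toward the fog colour based on
// depth - the kind of per-pixel routine a software renderer could expose as
// a tweakable option instead of baking it into fixed-function hardware.
#include <cmath>
#include <cstdio>

struct Color { float r, g, b; };

Color apply_fog(Color pixel, Color fog, float depth, float density) {
    // classic exponential fog factor: 1 near the camera, approaching 0 far away
    float f = std::exp(-density * depth);
    return { pixel.r * f + fog.r * (1 - f),
             pixel.g * f + fog.g * (1 - f),
             pixel.b * f + fog.b * (1 - f) };
}

int main() {
    const Color ground = {0.2f, 0.6f, 0.2f};
    const Color fog    = {0.7f, 0.7f, 0.8f};
    const float depths[] = {1.0f, 10.0f, 50.0f};
    for (float depth : depths) {
        Color c = apply_fog(ground, fog, depth, 0.05f);
        std::printf("depth %5.1f -> (%.2f, %.2f, %.2f)\n", depth, c.r, c.g, c.b);
    }
    return 0;
}
[/code]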
 

hurbt

Distinguished
May 7, 2008
76
0
18,630
In asking this, I'm making my lack of technical expertise in the area obvious, but I'm wondering if Larrabee would be much better at ray tracing, given its x86 architecture. I see ray tracing as part of the future of 3D gaming... but I could be wrong :)
 
G

Guest

Guest
Do I see a PS3 emulator coming our way, or is that just me? From the sound of it, it is aimed more at the workstation side rather than gaming.
It's the new trend of using the GPU to perform CPU-style work at higher speeds with easier portability, i.e. CUDA. It also sounds like this type of unit could emulate just about anything without much fuss if programmed properly. Here's a thought: instead of having a CPU and this thing, why not just use one or two of these? It's a CPU acting as a GPU, so why not have it serve both roles as primary functions? Have a single unit handle both CPU and GPU duties. Or use two units, one primarily for GPU work and one for CPU work, although they are the same piece of hardware and can swap roles if needed. The only thing that makes this a GPU instead of an additional CPU is the texture module(s), so why not use it as such?
 

Spathi

Distinguished
Jan 31, 2008
75
0
18,630
Larrabee can do ray tracing, sure, but that is not why Intel made it; other graphics methods are still more efficient.

Something like Larrabee, if not Larrabee itself, will be in the PS4. The Cell processor in the PS3 is just a simplified CPU core with eight glorified math co-processors. As Fredy suggests in the article, Intel has other instruction sets in the works, so Larrabee may be a stopgap for a few years. Big companies have competing teams making different things.

GPU means Graphics Processing Unit; in my mind, if that's what it is used for, that is what it is. A Central Processing Unit is still needed for fast sequential calculations. If you need to do x operations in a sequence (i.e. most software), with a bit of branch prediction, a few threads, and more operations per second on one thread, nothing beats a CPU.

Mind you, as the processing power of GPUs and CPUs increases toward the limits of what we can see or perceive, high-end hardware will start to lose its purpose, so they will merge. I don't think we are there yet, though.

You may even see a comeback of older art styles; I prefer complex cartoony stuff to photo-realistic, and I am sure I am not alone. Even 2D dungeon scrollers can be more fun than some of the games released today.
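
On the "nothing beats a CPU for sequential work" point, the usual Amdahl's-law arithmetic shows why piling on cores only speeds up the parallel part - a generic illustration with an invented serial fraction, not a claim about Larrabee itself:

[code]
// Amdahl's law: if some fraction of the work is strictly sequential, extra
// cores give diminishing returns. Numbers here are invented for illustration.
#include <cstdio>

double speedup(double serial_fraction, int cores) {
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores);
}

int main() {
    const double serial = 0.20;  // assume 20% of the work cannot be parallelised
    const int core_counts[] = {1, 4, 16, 48};
    for (int cores : core_counts)
        std::printf("%2d cores -> %.2fx speedup\n", cores, speedup(serial, cores));
    return 0;
}
[/code]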
 

offy

Distinguished
Mar 25, 2009
3
0
18,510
I don't think Larrabee will be in the PS4. One key point is cost: as with the PS3, the price when it was new put a lot of people off because of its Blu-ray drive, and by the time prices come down enough to compete, which could take two years, something else could replace it.

Looking at the CUDA forums, it seems CUDA has limits until the next update comes out, and I wonder what hidden limits Larrabee has.

It seems NVIDIA and Intel talk big about their new toys but never say anything about their limits. They always go on about having no limits, but that's false, as everything has some limit.
 