Intel's 'Larrabee' to Shake Up AMD, Nvidia

I stopped reading the article. I know Intel can deliver Larrabee. The question here is: IS INTEL CAPABLE OF DELIVERING GOOD DRIVERS TO SUPPORT THEIR HARDWARE? Methinks... well, GMA X3500, GMA X4500, Larrabee... oh wait, driver fails, never mind -_-"
 
Intel makes some decent drivers when they really want to... (they just need to get motivated).
I suspect the driver team was probably not too motivated to work miracles on any of Intel's past hardware. I mean, think about it: if you had to look at what ATI or Nvidia was doing, and then you had the job of writing drivers for some integrated crap, you would die a little inside every day lol
 
Larrabee sounds horridly inefficient compared to GPUs.

25 cores... and still not as fast as a regular GPU? Sounds like I'm going to be adding "400 more watts and a custom cooling system" if it's going to live up to my expectations, which are ridiculous.
 
Let's not jump to conclusions, jaragon... I'm going to wait until more information is released before I come to a decision. Even if it's 50 cores, as long as it gets the job done and doesn't require large amounts of power.
 
Sounds like a jack-of-all-trades product (which I guess the CPU is now, except it sucks at graphics, where Larrabee might be decent, I suppose).

However, Intel has yet to prove that they can make a graphics chip that moves away from a rating of "total suck"... so I'm not exactly holding my breath.
 
[citation][nom]eklipz330[/nom]Let's not jump to conclusions, jaragon... I'm going to wait until more information is released before I come to a decision. Even if it's 50 cores, as long as it gets the job done and doesn't require large amounts of power.[/citation]
How can you not? If Intel's conclusions are exactly what they say, then I'm pretty much right: it's not going to be a good GPU at all. Unless that includes 24x anti-aliasing, 16x anisotropic filtering, 2560x1600 resolution, and maxed-out graphics with 2048x2048 textures, it doesn't sound great at all.

I myself want to run my graphics on a graphics card, not all on a CPU.
 
Sounds like jaragon doesn't want any possible innovation in the GPU sector. I welcome it. I hope Larrabee knocks AMD and Nvidia on their asses.

Larrabee + a Lucid chip sounds like a killer combination.
 
God, I'm sick of the half-assed or flat-out wrong assessments given in these THG articles.

Larrabee's performance was described in classic misleading Intel fashion: each core will be able to achieve UP TO 16 floating-point operations per clock. No one seems to pay attention to the fact that they can't even guarantee 16 FLOPS per clock; it could be anything between 2 and 16. They have already said that the first Larrabee they launch will have 8 cores, so the best it could do running at full performance is...

1 GHz x 16 FLOPS/clock = 16 GFLOPS per core x 8 cores = 128 GFLOPS
2 GHz x 16 FLOPS/clock = 32 GFLOPS per core x 8 cores = 256 GFLOPS
3 GHz x 16 FLOPS/clock = 48 GFLOPS per core x 8 cores = 384 GFLOPS

The $270 ATI 4870 already does more than 1 teraflop on a single GPU. By the time Larrabee launches, in another year or more, those $270 cards will be under $100. Intel has continually badmouthed AMD for putting all the cores of a multi-core CPU on one piece of silicon, along with an IMC and the caches (they haven't launched a native quad-core chip yet for that very reason), but they won't have any issues with yields on 8-, 16-, 24-, 32-, 40-, or 48-core Larrabee chips? On top of the fact that they notoriously suck at any and all things GPU-related.

Intel said they want to have a 32-core Larrabee out by the end of 2010, so in another 16 months they will supposedly be capable of producing a graphics card that does:
1 GHz x 16 FLOPS/clock = 16 GFLOPS per core x 32 cores = 512 GFLOPS
2 GHz x 16 FLOPS/clock = 32 GFLOPS per core x 32 cores = 1024 GFLOPS
3 GHz x 16 FLOPS/clock = 48 GFLOPS per core x 32 cores = 1536 GFLOPS

Wow, those numbers look great, huh? They'll be able to do 336 GFLOPS more than a single-GPU, $270 ATI 4870... within the next 16 months, and only IF they can actually get the clock speeds up to 3 GHz, which would mean a 250-300 watt card.
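For anyone who wants to check the arithmetic, here is a minimal sketch of the same peak-throughput math, assuming Intel's best-case figure of 16 FLOPS per clock per core and the clock speeds and core counts guessed at above:

# Back-of-the-envelope peak throughput for hypothetical Larrabee configurations.
# This only reproduces the arithmetic above; real performance depends on how
# often the full vector width is actually kept busy.

def peak_gflops(clock_ghz, cores, flops_per_clock=16):
    """Peak GFLOPS = clock (GHz) x FLOPS per clock per core x core count."""
    return clock_ghz * flops_per_clock * cores

for cores in (8, 32):
    for clock_ghz in (1.0, 2.0, 3.0):
        print(f"{cores:2d} cores @ {clock_ghz:.0f} GHz -> {peak_gflops(clock_ghz, cores):.0f} GFLOPS peak")

By the same yardstick, the roughly 1200 GFLOPS quoted for the 4870 is where the 1536 - 1200 = 336 GFLOPS gap comes from.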

The X850 XT PE is not quite 4 years old; it did around 225 GFLOPS.
The X1950 XTX in 2006 did 400-450 GFLOPS.
The 3870 in 2007 did 550-600 GFLOPS.
And the 4870, released 7 months later, does 1000-1200 GFLOPS.

In 3.5 years the processing power of a single GPU has increased roughly five-fold. The RV870 is slated to launch in 2009, before Larrabee, which will probably mean an increase to around 1500 GFLOPS, on a 40nm core.
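A quick sanity check of that five-fold claim, using the rough GFLOPS figures quoted above (these are the estimates given in this thread, not official specs):

# Sanity-check the "5x in 3.5 years" claim from the GFLOPS figures quoted above.
history = {
    "X850 XT PE (late 2004)": 225,
    "X1950 XTX (2006)": 425,   # midpoint of the 400-450 range above
    "HD 3870 (2007)": 575,     # midpoint of 550-600
    "HD 4870 (2008)": 1100,    # midpoint of 1000-1200
}
baseline = history["X850 XT PE (late 2004)"]
for card, gflops in history.items():
    print(f"{card}: ~{gflops} GFLOPS ({gflops / baseline:.1f}x the X850 XT PE)")

The 4870 comes out at roughly 4.9x the X850 XT PE, which is where the "five times in 3.5 years" figure comes from.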

The fact that Intel is also going to leave Larrabee's API undefined makes it a three-way gamble: competing with ATI/AMD and Nvidia out of the gate, making a huge leap in multi-core density, and relying on non-existent software to get to a place where it can even compete. None of these are Intel's strong suits. As it stands now, games would have to be coded specifically to run on Larrabee (didn't that work out well for Ageia with PhysX?).

At a time when GPUs are becoming more and more programmable, capable of offloading work from the CPU, this is the complete opposite of what the industry needs in order to make progress. Running Quake 4 at some bastard widescreen resolution of 1280x890 or something is hardly a breakthrough, whether it's ray-traced or not. AMD demoed real-time ray tracing on a 4800-series GPU several months before.

In the past two months, the news on Larrabee has changed from "it's going to provide 1.2 to 1.5 times the performance of current Nvidia and ATI cards," to "it will perform as well as or slightly slower than Nvidia and ATI cards, but it will be low power and easier for programmers to code games for," to "it won't target any specific graphics API, so games will need special coding to use it; to see high performance it will draw a lot of power and produce A LOT of heat; and oh yes, in another year or so it should be performing on par with what's available today." So really, by the time it's released, the cards that are out today will cost 50-75% less... but hey, they're only proven to work great already, and why would you buy something that old when you can pay the Intel preferred-customer price that's 5-10 times what the competition charges for equal performance?

New and different doesn't mean innovative; innovative implies major performance increases, lower power consumption, and so on. This is more like what I would expect to see from a group of speed freaks with good financing.

Kind of like someone who has the really great idea of shaving their hair off and making it into a wig, so their hair... will be a hat too! And it'll be easier to wash, and they won't have to brush it or get it cut either. True? Sure. Practical? Well, you'd think so if you were hopped up on speed.
 
Keep in mind that the term "core" as used for Larrabee is more analogous to, say, a stream processor in a traditional GPU than to a GPU as a unit. In other words, it's more accurate to compare Larrabee's 16 to an 8800 GTS's 128.
 
In terms of raw GPU horsepower this thing will definitely suck. There is no doubt about it. Looking at iocedmyself's calculations it is already bad, but it gets even worse... being the jack of all trades that it is will hurt performance and bring the results much lower.

Furthermore, if the clock speeds are that high, the power consumption will go through the roof. For example: to double the clock speed you need to (approximately) double the voltage, which means doubling the current, which means quadrupling the wattage. By that reasoning, taking a 600 MHz card and clocking it to 3000 MHz (by increasing the voltage by a factor of 5) will have the following effects:

5 times the performance
25 times the power consumption (squared relationship)

At that rate they could create a 1000-watt card at 45nm and it still couldn't keep up with the HD 4850.
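Here is a minimal sketch of that scaling model. It assumes, as the post above does, that performance scales linearly with clock and power with the square of voltage, with voltage raised in proportion to clock; the 40 W baseline is a made-up number chosen only so the result lines up with the 1000-watt figure above, and real silicon behaves far less cleanly (dynamic power also rises with frequency itself).

# Simplified scaling model used in the post above:
#   performance ~ clock frequency
#   power       ~ voltage squared, with voltage assumed to rise in proportion
#                 to clock. Rough approximation only.

def scale(base_clock_mhz, target_clock_mhz, base_power_w):
    ratio = target_clock_mhz / base_clock_mhz
    perf_gain = ratio                        # performance assumed linear in clock
    new_power = base_power_w * ratio ** 2    # power assumed quadratic in voltage
    return perf_gain, new_power

# Hypothetical 40 W baseline card at 600 MHz, pushed to 3000 MHz:
perf_gain, new_power = scale(base_clock_mhz=600, target_clock_mhz=3000, base_power_w=40)
print(f"~{perf_gain:.0f}x the performance at ~{new_power:.0f} W")   # ~5x at ~1000 W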

Also, with that little horsepower, support for the newest technologies will be useless. I own an Nvidia 7600 GS, and even though it supports HDR lighting I never use it, to spare my poor FPS. BTW, I am getting a new computer in a month with an HD 4850!!!

Regardless, dedicated graphics will always have an advantage over integrated graphics.

On the plus side, Larrabee could improve the CPU side of things. But that will likely take a long time, as most programs cannot make use of more than one core.

It will cost Intel money in the short run, but with several attempts they may get it right. Only time will tell.
 
lol, these aren't full x86 cores... I love all the assumptions; it means the hype machine is running exactly the way they want. This is good for everyone, and I hope they keep pouring it on; marketing is everything. Even if they don't have the product, they make AMD/ATI and Nvidia think.
 
"Intel?s Quake 4 demonstration was mightily impressive " well that's one thing drivers going well

 
I hope that one day I can build a computer with an NVIDIA x86 CPU, an AMD enthusiast chipset, and an Intel graphics card. Then we can see how Windows 7 runs on a computer as wrong as that.
 
Dear iocedmyself,

You obviously have no idea what you're saying -- have you heard of the fused multiply-add instruction?

Do you actually know whether the 4870's 1.2 TFLOPS figure counts a MAD instruction as 1 or 2 operations?

Obviously not. And you had the gall to criticize THG for its so-called wrong assessments! Bah!
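For context on the MAD point: headline peak-FLOPS numbers for GPUs of this era are usually computed by counting a multiply-add as two floating-point operations per clock. A rough sketch of that bookkeeping, using the commonly cited HD 4870 configuration of 800 stream processors at a 750 MHz core clock (those specifics are not from this thread and are only for illustration):

# How a headline peak-FLOPS figure is typically derived, depending on whether a
# multiply-add (MAD) is counted as one instruction or as two operations.
stream_processors = 800   # commonly cited HD 4870 shader count (illustrative)
clock_ghz = 0.75          # commonly cited HD 4870 core clock (illustrative)

mad_as_two_ops = stream_processors * clock_ghz * 2   # multiply and add counted separately
mad_as_one_op = stream_processors * clock_ghz * 1    # MAD counted once

print(f"MAD counted as 2 ops: {mad_as_two_ops:.0f} GFLOPS")   # 1200 GFLOPS
print(f"MAD counted as 1 op:  {mad_as_one_op:.0f} GFLOPS")    # 600 GFLOPS

Counted one way the same chip reads as 1.2 TFLOPS, counted the other way as half that, which is exactly the ambiguity the comment above is pointing at.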
 
That said, THG (and some other usual suspects) did get a few assessments wrong in the article.

"According to a recent paper from Intel, simulated Larrabee performance would have us believe that with 25-cores, each running at 1GHz, we would be able to run both F.E.A.R. and Gears of War at 60 FPS."

Have you actually read and understood the Intel Larrabee paper? A 25-core Larrabee would be able to run both of these games at MORE THAN 60 FPS, IN THE WORST CASE. If you look at the graph, the average frame rates from the samples they obtained would be around 120 FPS.
 
My god. The ray-traced Quake 4 was run on a Nehalem machine. When did Intel ever say it was running on Larrabee???

Is this the actual THG website? Or have I been DNS poisoned???
 
Who said Larrabee is a jack of all trades?

And blahfdfdfd,

Nvidia and ATI/AMD both have their shader cores running around the 2 GHz mark.

You guys make me laugh! Really.
 
Looks interesting; hope new info comes soon.

And holy crap, can you guys hold off on the slamming? It's not even out for testing and you're totally slamming it.
 
[citation][nom]stupid iocedmyself[/nom]That said, THG (and some other usual suspects) did get a few assessments wrong in the article. "According to a recent paper from Intel, simulated Larrabee performance would have us believe that with 25-cores, each running at 1GHz, we would be able to run both F.E.A.R. and Gears of War at 60 FPS." Have you actually read and understood the Intel Larrabee paper? A 25-core Larrabee would be able to run both of these games at MORE THAN 60 FPS, IN THE WORST CASE. If you look at the graph, the average frame rates from the samples they obtained would be around 120 FPS.[/citation]

Dear stupid "stupid iocedmyself"...
Do you really believe Intel when they say "it's going to be 2x faster" or something? How many times have they said "our next-gen IGP will be *so many* times faster"?
 
The hard part is writing the drivers. Where are they going to get the talent? ATI and Nvidia have teams that have spent years learning how to get it right. Intel hasn't gotten it right for the G45 IGP yet.

It's a lot more complicated than a printer or mouse driver. Programmers use DirectX or OpenGL, which abstract the hardware from the programmer. The drivers must translate the DirectX or OpenGL interfaces into the actual code needed by the hardware.
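As a toy illustration of why that translation layer is the hard part (every name below is made up; this is not any real driver's API): the game talks to a hardware-neutral API, and the driver has to turn each call into whatever that particular chip actually understands.

# Toy illustration only: an application codes against a generic graphics API,
# and a vendor driver translates each call into chip-specific commands.
# Every name here is hypothetical; this is not a real driver interface.

class HypotheticalVendorDriver:
    """Turns hardware-neutral API calls into commands one specific chip understands."""
    def submit(self, command, arg):
        # A real driver also builds command buffers, manages video memory,
        # compiles shaders, and schedules work, which is why this layer is hard.
        print(f"chip-specific command stream: {command}({arg})")

class GenericGraphicsAPI:
    """What the game programmer codes against (think DirectX or OpenGL)."""
    def __init__(self, driver):
        self.driver = driver
    def draw_triangles(self, vertex_count):
        # The API call itself is hardware-neutral; the driver does the real work.
        self.driver.submit("DRAW_TRIANGLES", vertex_count)

api = GenericGraphicsAPI(HypotheticalVendorDriver())
api.draw_triangles(vertex_count=3)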

They have a year or so to figure it out. Good luck with that!
 
Drivers are everything! I have seen too many S3 cards with good hardware design but very badly written drivers. I owned a DeltaChrome S8 card and trust me, I had very bad experiences. In some games performance was acceptable, but other games I couldn't even start.
If Intel keeps writing drivers the way it has until now, I am not very enthusiastic. Still, more competition doesn't hurt, so I hope they make an interesting product.
 
I agree, there have been some poor drivers in the past. I hope they focus on that part as well. It should be interesting to see a third big player on the market, if only for consumer pricing. Don't expect Intel to come out with a blockbuster product overnight; let them put out some stable products first, and then we will see if it holds up to the hype. I trust it will.
 
Most of you people are complete morons. If you hate Tom's reviews and articles, what the heck are you doing on their site? You know you are only supporting their advertising revenue every time you click a link.

zpryd said:
What is the point of writing an article about a product without an engineering sample to test or at least see working? It's all speculation and assumptions. Scuttlebutt.

It's called a preview, or product news, you freaking simpleton. So by that mentality, we should be kept in the dark about upcoming AMD and ATI products as well. Would that make you happy?
 