Intel's 'Larrabee' to Shake Up AMD, Nvidia

Status
Not open for further replies.
I have high hopes for this thing. Just so I can go out and buy one and laugh my ass off. Plus, if it actually is crap, I'll use it as a heater in winter.
 
Why is someone posting such a misleading article? Sheesh. Everyone knows Intel is trying to harp on Larrabee, but current graphics cards handle raytracing far better than Larrabee. Has the media forgotten that?
 
I have to point out that since the release of Core 2, Intel has under-hyped its product line. They have the best and most honest track record of all the major players. I'd personally take them at their word, at least until there's proof otherwise.
 
One of the most hilarious comments I've ever read on a post. Keep the good comments coming.
 
[citation][nom]travish82[/nom]It's called a review or product news you freaking simpleton. So with this mentality. we should be kept in the dark about upcoming AMD & ATI products as well. Would that make you happy[/citation]

Seeing as there have been few or no articles written about developments in Deneb/Shanghai or Bulldozer, or any future AMD products... maybe it would. It certainly wouldn't hurt. Just drop the Intel hype machine. My god... how many paid advertis... err, I mean articles do I have to read about this worthless "new" technology? It's like Intel hired Barack Obama's press agents. Larrabee for President!
 
Dear Malovane,

You neglected to quote the first part of my post about using a different review site if you think Tom's is biased.
If you hate Tom's reviews and articles, what the heck are you doing on their site? You know you're only supporting their advertising revenue every time you click on a link.
You must find something of value here if you're actually reading the articles they publish. But I suppose you'll just say that Tom's Hardware is paying me to advertise for them. :)
 
No, you're just the obligatory troll who wanders in and says "if you don't like it GTFO". I'm just the obligatory troll who wanders in and says "This site's going downhill!"

No really, I am just a very long-time reader of Tom's Hardware, and I still feel a little loyalty to it, which is why I'm disgruntled about the state of their "news": it ends up being very biased, with obviously lopsided coverage. Tom's news and articles used to be very technically informative, and they covered pretty much any new technology coming out. Not so anymore. At least Anandtech is informative these days, even if the bias there is just as evident.

Did anyone learn anything from this article that wasn't already out there? No, it's just another fluff piece with a sensationalist title. Tom's used to be better than that. I'm disappointed.
 
[citation][nom]dagger[/nom]Keep in mind the term "core" used for Larrabee is more analogous to, say, a stream processor in traditional gpu, rather than a gpu as a unit. In other words, it's more accurate to compare Larrabe's 16 to a 8800gts's 128.[/citation]

Not really true, my friend. Remember that a Larrabee core is essentially a die shrink of an old Pentium processor with a big ol' vector unit and all of Intel's newer instruction sets built on top of it.

If we remember where Larrabee started: Intel, contemplating the possibilities of modern silicon fabrication techniques, started to wonder how many old Pentium cores they could fit on a chip the size of a Core die. Everything took off from there.

The moral of the story is that if we expect power-consumption scaling similar to that of modern GPUs on the assumption that a "core" is the same as a "stream processor," we will be mistaken. Larrabee's cores will be actual CPU cores, Pentium cores to be precise, and thus will draw the kind of power you'd expect from something loaded down with general-purpose registers, Intel's own instruction sets (SSE, SSE2, etc.), L1 and L2 caches, and the other things you'd see on a CPU. Any direct comparison with a traditional GPU will be quite flawed.
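To see why counting "cores" against "stream processors" misleads, a rough lane-count sketch helps. This is back-of-envelope arithmetic only: the 16-lane (512-bit single-precision) vector width is the widely reported figure for Larrabee's design, not a confirmed spec, and the helper function is hypothetical.

```python
def peak_sp_lanes(cores: int, lanes_per_core: int) -> int:
    """Total single-precision SIMD lanes available per clock.

    Ignores clock speed, issue width, and utilization -- this only
    shows why raw "core" counts aren't comparable across designs.
    """
    return cores * lanes_per_core

# Larrabee: reportedly 16 x86 cores, each with a 16-wide vector unit.
larrabee_lanes = peak_sp_lanes(cores=16, lanes_per_core=16)

# GeForce 8800 GTS: 128 scalar stream processors (1 lane each).
g80_lanes = peak_sp_lanes(cores=128, lanes_per_core=1)

print(larrabee_lanes, g80_lanes)  # 256 vs. 128 lanes per clock
```

By this crude measure, 16 Larrabee "cores" would actually field twice the lanes of the 8800 GTS's 128 "cores" — which is exactly why the two terms can't be equated, in either direction.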

The problem is that Intel's marketing strategy is pitting Larrabee directly against offerings from nVIDIA and ATi, a strategy which, for the many reasons already listed, is disastrous. Larrabee will compare with the GPUs of its time in neither power consumption nor actual performance.

Larrabee's supposed strength will be its flexibility: it's proposed to combine many of the strengths of a general-purpose CPU with the rendering capability of a modern GPU. But simply calling something "flexible" is vague and abstract, and doesn't, in and of itself, imply any real advantage in real-world performance comparisons.

The main problem will be implementation. It will come down to how well programmers understand and take advantage of this supposed "flexibility." Also, there is no margin for error when it comes to drivers. Intel will have to keep all cores fed and manage resources well if they ever hope to get any kind of performance out of Larrabee. Drivers are especially important if this thing is supposed to be API-agnostic.

Also, all the programmability in the world doesn't mean anything without real-world performance. Yes, theoretically Larrabee could be programmed in software to run SM 5.0. It could theoretically be programmed to run SM 10.0, but this means nothing if it doesn't have the actual horsepower to run these advanced algorithms at playable frame rates.

Anyway, that's my two cents.
 
[citation][nom]rang0046[/nom]i would like to know how much intel pays toms to write an preview why are u guys are so bias towards amd amd release the 4800 u guys jump on the band wagon same thing again intel larrabee to shake up amd nvidia q2009 when amd release something better than 4800 u guys switch again to amd up until now intel hasent produce a native quad coreintel is like the damn hiena so like anandtech intel pays the buck for u tom make false coment on amd read the guy article above me bydesign u guys hipe intel for what intel is just a bully they got money and thats it [/citation]
That is so much horse shit packed into one comment, I don't know where to begin.
 
[citation][nom]travish82[/nom]Dear Malovane,You neglected to quote the first part of my post about using a different review site if you think Tom's is biased. You obviously must find something of value to you here if you are actually reading the articles they publish. But I suppose you'll just say that Tom's Hardware is paying me to advertise for them. [/citation]
You just sit here and troll the site? And you only look at the article's comments without reading it, yet you clearly have knowledge of the article? Bullshit statement. I wish I had the ability to chop off trolls' arms; that would be nice, and Tom's Hardware would be free again.
 
I'm simply telling it like it is. They have been lowballing or on target with the performance of their chips. The real-world apps are out there for anyone to look up. AMD, on the other hand, has not been forthright and up front. Their chips have been defective, underperforming, and late on a consistent basis. So which company should we trust more? Sure, Intel could be hyping this; they've had a history of it. But for the last couple of years they have been delivering on their word.
 