Larrabee: Intel's New GPU


justaguy

Distinguished
Jul 23, 2001
247
0
18,680
I don't know what the 5-year-old Intel GPU would be, but I'm calling BS on the claim that somebody has a modern ATI GPU that can't handle 2D. I'm sure there are some modeling apps out there that require nice hardware, but then you're not talking 2D anymore. I guess I could be wrong, but I don't think there's a modern ATI retail product that's not up to what Intel was doing in 2004. Actually, strike that: that statement cannot be wrong.
 

azxcvbnm321

Distinguished
Oct 13, 2008
175
0
18,680
"I have an 5 year old Intel GPU in my PC at work and it is more than 10x faster at moving 2D than my new ATI card at home and plays some sorts of games better."

I am also very skeptical of the 10x-faster claim. And as for games, are they text-only games? It just seems impossible that an ATI 4870 would be beaten by a 5-year-old Intel GPU. Five years is an eternity; I believe my Nvidia 7600GT would have been the fastest card on the market back then, offering unheard-of performance for Quake III and Unreal Tournament. I admit I'm no expert here and could be wrong, but it seems impossible to me.
 

Spathi

Distinguished
Jan 31, 2008
75
0
18,630
"it seems impossible to me"... That's why they get away with it... nobody thinks it can be slower.

I just checked: it's 20x faster, considering the ATI card is overclocked. I think the ATI 4870 has the same 2D performance as the 3850.

Continuum (aka Subspace):
Intel GMA 900 (2004): 260-326 fps
HD 3850 (2008): 30-60 fps
 

Spathi

Distinguished
Jan 31, 2008
75
0
18,630
Oops, I meant "I just checked: it's 10x".

I missed the 1 ;oP

I asked about this a lot last year on various forums and nobody could give me a good answer, so I did my best to learn about it all and find out why for myself.

RE: "Nobody uses 2D and bitmaps anymore..."
wtf? Windows XP: try moving a big window fast with your glorious 4870.
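
If anyone wants to measure this for themselves, here is a rough sketch that just times raw GDI BitBlt calls on Windows, which is the classic 2D path being discussed. It's my own illustration, not any benchmark from this thread, and the bitmap size and iteration count are arbitrary assumptions.

[code]
/* Rough sketch: time raw GDI BitBlt throughput (the classic 2D path).
   Assumptions: Win32, a 1600x1200 off-screen bitmap, 500 iterations.
   Build with: cl blit_test.c user32.lib gdi32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    const int w = 1600, h = 1200, iterations = 500;

    HDC screen = GetDC(NULL);                       /* desktop DC */
    HDC mem = CreateCompatibleDC(screen);           /* off-screen DC */
    HBITMAP bmp = CreateCompatibleBitmap(screen, w, h);
    SelectObject(mem, bmp);

    /* Fill the source so the blit actually copies pixels. */
    PatBlt(mem, 0, 0, w, h, WHITENESS);

    DWORD start = GetTickCount();
    for (int i = 0; i < iterations; i++)
        BitBlt(screen, 0, 0, w, h, mem, 0, 0, SRCCOPY);
    GdiFlush();                                     /* drain the GDI batch queue */
    DWORD elapsed = GetTickCount() - start;

    printf("%d blits in %lu ms (%.1f blits/s)\n",
           iterations, elapsed, elapsed ? iterations * 1000.0 / elapsed : 0.0);

    DeleteObject(bmp);
    DeleteDC(mem);
    ReleaseDC(NULL, screen);
    return 0;
}
[/code]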
 
Guest
I'm kind of surprised they didn't mention anything about ray tracing. Maybe that is all software-side stuff and will come in the next article. But, as I recall, Larrabee was being billed as a ray-tracing monster. I would think the writer could show some hardware evidence for this Intel propaganda.

It looks like it is going to be an interesting design. I think it will advance graphics in a new direction, the same way the Cell processor did. Although, after this article, I'm finding it hard to believe that it is going to be the world beater that Intel's CEOs tout it as being.

When push comes to shove, when Larrabee makes its big splash next to Nvidia's and ATI's next-gen processors, it will be interesting to see where the performance chips fall. All in all, worth my time following these articles.
 
Guest
So what does all this mean to an end user? Will it run Crysis at 1920x1080 on maximum detail with fps above 30, or is it another S3 Chrome (all talk, no performance)?
 

I800C0LLECT

Distinguished
May 16, 2007
262
0
18,780
"and heavy hitters who seemed to think that 3D acceleration was just a gadget (Matrox, S3, and ATI before AMD purchased it). "



Ati before AMD purchased it? You've got to be kidding me. ATI was in the game a few years before that happened. Don't let AMD take any credit for that.
 
Guest
I don't think this is going to be a breakthrough at all. They can't just go from nothing to a revolutionary technology overnight that's capable of beating out all the competition. I mean, is Intel indirectly calling Nvidia and ATI dumb? Those companies specialize in making graphics cards; this is like a chef telling a doctor what to do. Intel is not going to invent a breakthrough so big that it collapses all the other companies; it's like saying Nvidia is going to release a processor so powerful it will beat Intel's Core i7. People, don't get hyped about this. I really think it will not outperform today's Nvidia and ATI products in any way.
 

gpuguru

Distinguished
Mar 31, 2009
2
0
18,510
I agree with Armistitiu that Larrabee will most likely be a flop as a GPU. I think it has some very interesting multi-CPU-like possibilities, and the fact that you could offload computations to Larrabee and still have the CPU free for other things has some very nice potential. I don't know how viable that will be once AMD introduces Fusion, where it could dedicate one CPU core to graphics and leave two for other processes. Time will tell.

I do have some beefs with the article, however. It is a valid attempt to talk about Larrabee, but it is written as if many things were new to Larrabee that are not new to other graphics chips. Vector-plus-scalar for vertex shading, virtualization of memory and caches, bidirectional ring architectures, sequencing, arbitration, etc. are not new. R500 was vector-plus-scalar, and R600 had all the other parts, as did Nvidia. SMT, or the process of sequencing and thread control, is inherent in command processing, and arbitration and branch prediction are all part of current architectures.

The author states that instead of predicating, Larrabee can execute both paths "of the program without loss of performance..." and that "this is more efficient because it avoids the risk of wrong branch predictions." Of course there can be a loss: you are executing both sides when one might not be necessary at all, and when you execute what isn't needed you lose the cycles spent calculating an answer that gets thrown away. Predication is important during the entire process, and to state otherwise is foolish. Z-culling is exactly if-then-else in nature: you are determining depth and whether texturing, lighting, and so on are needed, which is where early predication is most efficient.
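
To make that point concrete, here is a tiny sketch of the "compute both sides, then select with a mask" pattern a wide vector unit uses instead of branching; the wasted work is exactly the lanes whose results get discarded. This is my own illustration in plain C, with an arbitrary 8-lane width, not anything taken from the article or from Larrabee's actual ISA.

[code]
/* Sketch: branchless "execute both sides, select by mask", the way a
   wide SIMD/vector unit handles an if/else. Plain C, with 8-element
   arrays standing in for vector registers. Illustrative only. */
#include <stdio.h>

#define WIDTH 8

/* Every lane computes BOTH a*b and a+b, then the per-lane mask picks
   one result. Whatever the mask rejects was still paid for in cycles. */
static void predicated_select(const float *a, const float *b,
                              const int *mask, float *out)
{
    for (int i = 0; i < WIDTH; i++) {
        float if_side   = a[i] * b[i];   /* "then" side, all lanes */
        float else_side = a[i] + b[i];   /* "else" side, all lanes */
        out[i] = mask[i] ? if_side : else_side;
    }
}

int main(void)
{
    float a[WIDTH]  = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[WIDTH]  = {8, 7, 6, 5, 4, 3, 2, 1};
    int mask[WIDTH] = {1, 0, 1, 0, 1, 0, 1, 0};  /* per-lane branch outcome */
    float out[WIDTH];

    predicated_select(a, b, mask, out);
    for (int i = 0; i < WIDTH; i++)
        printf("%g ", out[i]);
    printf("\n");
    return 0;
}
[/code]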

My favorite is the AGP graphics card image on the first page... enough said.
 

sonofliberty08

Distinguished
Mar 17, 2009
658
0
18,980
Look at that card. It's ugly, with an old-school design. What kind of heatsink does it use? We have heatpipes now. What kind of memory chips does it use? We have GDDR5 now. WTF, only one display output? And it's analog VGA; we have dual digital DVI plus HDMI now. You call that a high-end card? Don't make me laugh.
 

nukemaster

You know that's a picture of Intel's first video card from 1998-1999, right?
 

gpuguru

Distinguished
Mar 31, 2009
2
0
18,510
Yes, that looks like a 740 AGP card. GMA is built using the technology in the 740, but, and this is a big but, that still does not mean Intel will have a leg to stand on for the first several rounds. Gesher does not look to be much more promising than Larrabee in terms of graphics performance. ATI moved away from vector-plus-scalar (so did Nvidia) to pure scalar because you can get better throughput and keep more units filled (ATI being superscalar). Yet here in this article it is touted as some great unique feature.

As for having new information, this article from almost two years ago had as much info (http://www.tgdaily.com/content/view/32282/137/1/1/). That was back when THG and TGDaily were working together. I give them props for trying to make the information fresh again, but the article has too much love for Intel and not enough dialogue with Intel PMs, engineers, or technical marketing people. I would love to see the author talk to an Intel graphics DevRel (if they even have anyone filling the developer relations role yet).
 
Guest
All of this reminds me of one thing: Fusion, anyone? AMD's vision of stream computing? Only Intel did it the other way around.

"What’s more, despite the flexibility GPUs have gained, their functionalities remain heavily oriented towards raw calculation. For example, there’s no question of performing I/O operations from a GPU. Conversely, Larrabee is totally capable of that, meaning that Larrabee can directly perform printf or file-handling operations. It’s also possible to use recursive and virtual functions, which is impossible with a GPU."

And what I/O operation do you want to perform from a GPU? Printing a fax? Creating virtual RAM on a hard drive? Uh... please. As for recursive functions, doesn't Folding@home use them? Vijay Pande did say that he needed GPUs to handle complex branching and recursive functions before Folding@home could run on a GPU. So if, as you say, GPUs still can't do that, yet F@h runs on current GPUs, be they ATI or Nvidia, then Vijay Pande must have been lying. Oh, and they can hardware-encode video too... Badaboom and AVIVO must be liars then. That was not GPU-encoded video; it must have been Intel-CPU-encoded video, because a GPU cannot execute recursive functions.
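
For what it's worth, the usual workaround is to rewrite recursion as a loop with an explicit work stack, which is effectively what GPU ports had to do. Here is a generic sketch of that transformation in plain C; it is my own illustration and has nothing to do with the actual Folding@home code.

[code]
/* Sketch: removing recursion so code can run on hardware without a
   real call stack (as pre-Larrabee GPUs lacked one): keep an explicit
   stack of pending subproblems. Naive Fibonacci, purely illustrative. */
#include <stdio.h>

/* Recursive version: needs a call stack. */
static long fib_recursive(int n)
{
    return (n < 2) ? n : fib_recursive(n - 1) + fib_recursive(n - 2);
}

/* Same computation with an explicit stack: no function recursion at all. */
static long fib_explicit_stack(int n)
{
    int stack[64];              /* pending subproblems */
    int top = 0;
    long sum = 0;

    stack[top++] = n;
    while (top > 0) {
        int cur = stack[--top];
        if (cur < 2) {
            sum += cur;                 /* base case contributes directly */
        } else {
            stack[top++] = cur - 1;     /* push both subproblems */
            stack[top++] = cur - 2;
        }
    }
    return sum;
}

int main(void)
{
    printf("recursive: %ld, explicit stack: %ld\n",
           fib_recursive(15), fib_explicit_stack(15));
    return 0;
}
[/code]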

"A second point is that the choice of CISC has advantages as well as disadvantages. while RISC instructions are of similar size and constructed in the same way to make decoding easier, CISC instructions are of variable size. Also, while decoding is complicated, x86 code is traditionally more compact than the equivalent RISC code. Here again you might tend to think that factor is negligible, but in this case, these are processors with very small caches, where every kilobyte counts."

Really? Can you find a pure CISC processor these days? LOL. Complete BS. Ever since AMD pioneered wrapping a RISC core behind a CISC decoder with the K6, there hasn't been a purely RISC or purely CISC CPU in the x86 arena. Even IBM, who pioneered the RISC architecture, uses complex vector instructions (read: CISC) in its Power processors. Thinking that Larrabee will be a plain CISC processor is one of the stupidest ideas I've heard.
 

sonofliberty08

Distinguished
Mar 17, 2009
658
0
18,980
[citation][nom]nukemaster[/nom]You know thats a picture of Intels first video card from 1998-1999 right?[/citation]
Yeah... Intel has always sucked at graphics. That card just ended up discontinued; that's why you can't find any Intel graphics card with heatpipes, GDDR5 memory, or dual-link DVI + HDMI output on the market today. The upcoming Larrabee won't be far from that either; maybe it will be the same, lame and discontinued... and the name "Larrabee" is ugly too when you compare it to "GeForce" and "Radeon", which are cool and awesome names. :p
 

option350z

Distinguished
Apr 3, 2009
22
0
18,510
[citation][nom]crockdaddy[/nom]I would mention ... "but will it play crysis" but I am not sure how funny that is anymore.[/citation]

Still funny to me, if you're looking for an answer.
 

exbliss

Distinguished
Apr 8, 2009
1
0
18,510
Very good article! I was actually hoping it would be something more like the NVIDIA ION; now this sounds like an ambitious and expensive approach. I hope it all turns out well for the good of everyone, especially the consumer.
 

critofur

Distinguished
May 27, 2007
41
1
18,535
I'm tired of Windows. What I want are top-notch Linux drivers that don't require a modified kernel, are open source, and are not a pain to install.
 

critofur

Distinguished
May 27, 2007
41
1
18,535
The next generation of gaming consoles (PlayStation 4) will finally have "photorealistic" graphics, meaning game graphics will look just as good as what your eyes see looking at the "real world".

Who will deliver this technology?
 

amd freak

Distinguished
Apr 20, 2009
1
0
18,510
I hope this is just a late April Fools' joke. lol
Intel doesn't stand a chance against Nvidia or AMD GPUs.
IMO this is a very BIG mistake, but it's Intel's money and they have plenty to throw away. I hope that AMD CPUs will one day blow Intel's overpriced crap off this planet. I wouldn't waste a dime on anything that says Intel on it anyway! AMD RULZ!
 
Guest
Well, some results are probably given here: http://www.techarraz.com/2009/05/16/intel-larrabee-51-things-you-did-not-know/
 