Larrabee versus ATI/Nvidia: are we getting more choices?


Upendra09

Distinguished
So Larrabee is coming soon and Nvidia and ATI don't seem to mind. Does Intel actually pose a threat, or is Larrabee just going to be advanced integrated graphics?

And what does Larrabee have over Nvidia/ATI, and vice versa?

I know this might come across as trolling, but it gets me some good info.
 
Solution
Larrabee could be very interesting, especially when it comes to real-time ray tracing and being able to upgrade to any future DirectX version without having to buy a new video card. These are the things that kept me focused on Intel Larrabee, but Larrabee's performance might be slower than current ATI and NVIDIA video cards. It would surprise me if Larrabee really allows real-time ray tracing and if there will be real-time ray-traced games in the near future that put the rasterization graphics era to an end. (I know that is unlikely to happen soon.) However, I heard that Intel will release the PC game Project Offset/Meteor, which could potentially enable real-time ray tracing and act as a demo to show what Intel...
He meant it's a DX7-level example of raster graphics; reflections, shadowing and depth can be done far better than that example shows.

You can show all the same things in a still from both, really. Where they differ is the level of maneuverability you get with RT versus raster as the bodies get more complex and things like varying window transparency change with angle and approach.

Right now raster graphics look much, much better in real time, because the processing power for RT is so demanding. Raster tricks help give us awesome-looking images (let's say 93% realistic/accurate) at 30 fps, whereas RT can give us 99.44% accurate images with few visual errors, but at 30 seconds per frame.
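To put a rough number on "so demanding", here's a quick back-of-envelope sketch in Python; every figure in it (resolution, rays per pixel, frame rate) is just an illustrative assumption of mine, not a measurement from any actual card.

# Rough ray-count math for "real-time" ray tracing -- all numbers here are
# illustrative assumptions, not benchmarks of any real hardware.
width, height = 1280, 1024     # a typical desktop resolution of the day
rays_per_pixel = 3             # one primary + one shadow + one reflection ray
fps = 30                       # the frame rate raster already hits

rays_per_second = width * height * rays_per_pixel * fps
print(f"{rays_per_second:,} ray-scene intersection queries per second")
# -> roughly 118,000,000 per second, every one of which has to be tested
#    against the scene geometry (or an acceleration structure), which is
#    why offline RT settles for seconds per frame instead.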

Showing the images you did undersells rasterization as much as only showing RT-Quake would undersell ray tracing.

To the OP: Google is your friend for the "what is..." questions. Try to move the conversation along with "what do you think of..." instead, and Google the stuff in the answers you don't get. ;)
 

Techno-boy

Distinguished
Dec 5, 2008
I remember there is also the ray-traced Ferrari demo, which looked amazingly real, but I forgot the link. I think it is also on YouTube, if I remember correctly.

I hope something that looks that real will exist in video games in the near future. That was awesome and incredible! :eek:

Also, 60 fps should be enough, since LCD monitors will not allow us to go above 60 fps and human eyes will not recognize the difference between 45 fps and 60 fps, based on eye doctors' information. I hope that Intel Larrabee will support real-time ray tracing and give about 40-60 fps in ray-traced games; that would be enough to satisfy most of us. I also hope that Larrabee comes in around Q4 2010 to compete with ATI and NVIDIA DX11 cards in time, instead of Q1 2010, so we could compare them and see the advantage of Intel Larrabee over ATI's and NVIDIA's DX11 offerings.
 



That is more or less true. Don't forget that nVidia has CUDA, which is getting a bit of a hold in the CAD/CFD and maybe even the video/photo editing market.

... people who think you can't see anything above 20-60 fps don't deserve a voice on the internet(s).
+1000000. Agreed! You can (at least I can) tell the difference between 40-60 fps and something like 120+ fps.

As far as real-time ray tracing goes, nVidia/ATI have the best chance of doing something (especially AMD, as they have both GPU AND CPU tech).

Another possibility is that if this takes off, nVidia could be bought out by, or sold to, some big company such as IBM. I'm not sure on this, but I think nVidia could get an x86 license from IBM or VIA, correct?
 

Techno-boy

Distinguished
Dec 5, 2008
LCD monitors support 60 Hz, which is only 60 fps max. Some LCD monitors might offer a 120 Hz option, but you still would never go above 60 fps no matter how powerful your video card is, even if it were powered by nuclear energy. Lol! :D

Right, now I'd better shut my mouth or else I'll never get to see anything below 60 fps. Lol! (just kidding) :D
 

darkvine

Distinguished
Jun 18, 2009
I still say this all comes down to the die-hard fanboys and whether Intel can do something to make them switch. Also, I think it won't happen at first; people will want reviews, benchmarks and some hands-on time before they will drop the money Intel is likely going to be asking.

And from what I have heard, the Intel cards won't be out until Q1 of next year. By then ATI and Nvidia will likely have a foothold in sales with the 300 and 5000 series. That will hurt Intel's sales quite a bit, I think, as most people don't have the money to buy another high-end card.
 


Yes, but that's old too. Look at last year's ATI demo day and the car and cut-scene there, with reflections, depth of field and everything; compare that Ferrari to those two (which are being rendered in real time) and you won't care too much about 'playing' the early RT stuff, just hope they get the hardware soon that will start getting us toward parity. The latest best RT demo car I remember was a Bugatti Veyron, but admittedly that was a few months ago; stuff in this field changes quickly for both sides (just look at the DX11 demos with full tessellation [check the edges to see the advantage of creating real relief, not just parallax illusions]).


Also, 60 fps should be enough since LCD monitors will not allow us to go above 60 fps and human eyes will not recognize the difference between 40 fps and 60 fps based on eye doctors' information...

Stop now, you're becoming one of those people who should be disconnected from the internet(s).

Also, Larrabee will not usher in real-time ray tracing; that is a while out for anything but small-resolution/low-object-count games and demos. Real-time, IMO, is about a 2012-15 thing depending on requirements. Just think of the RT workload of a foliage-infested Crysis or Oblivion; it's not going to be handled live by Larrabee or anything near term.

Don't confuse yourself: Larrabee is not all about RT, and it's still going to be primarily for low-end usage at first. JDJ is talking about future iterations, not current ones, and you will likely see a lot of shortcuts, tricks and DX/OGL fallbacks for this first-generation hardware. But having another entrant in the market with the clout of Intel helps to shake things up and make the other two work harder, and hopefully this is a good thing, as long as there isn't only one left standing.

Anywhooo, I gotta go to the lake, just thought I'd pop in for a sec, Ciaola! :sol:
 

Upendra09

Distinguished
The ray-traced pics look extremely real, especially JD's... but how do I know that's not a photograph from a 12.3 MP camera?

Not that I am doubting you, but my point is: where does ray tracing have to end before everything looks so real that you might as well take a picture of everything and make a game out of that?
 
Before I go, just something to look into that has always been seen as the bridge between raster and RT, and which I think the compute shaders and Larrabee may work on and start to implement: ray-casting. Think of it as the missing link between the two. Early use of ray-casting was limited due to power, but now it should be easier to do, especially with the very vector-oriented processors in unified GPUs, which are massive in number compared to the old vertex shaders. When combined with deferred shading at setup rather than Z clean-up, I think you're going to see more of that implemented nearer term to achieve RT-type images at raster-type speeds. The HD4K did a demo earlier in the year, and it was pretty impressive, although misnamed as true RT.
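If it helps to picture what I mean by ray-casting, here's a bare-bones Python sketch. Everything in it (scene, camera, light) is a made-up placeholder, not what the HD4K demo or Larrabee actually does: one primary ray per pixel, nearest hit, one lighting pass, and no recursive bounces, which is exactly what separates it from full RT.

import math

# Bare-bones ray-casting: one primary ray per pixel, nearest hit, one direct
# lighting pass, no recursive reflection/refraction rays (those are the RT part).
# The scene, camera and light below are made-up placeholders.

SPHERES = [((0.0, 0.0, 5.0), 1.0), ((1.5, 0.5, 6.0), 0.7)]   # (center, radius)
LIGHT_DIR = (0.577, 0.577, -0.577)                            # points toward the scene

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def cast(width=320, height=240):
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # One primary ray through this pixel (pinhole camera at the origin).
            dx, dy = x / width - 0.5, 0.5 - y / height
            inv_len = 1.0 / math.sqrt(dx * dx + dy * dy + 1.0)
            d = (dx * inv_len, dy * inv_len, inv_len)

            nearest, hit = None, None
            for center, radius in SPHERES:
                t = ray_sphere((0.0, 0.0, 0.0), d, center, radius)
                if t is not None and (nearest is None or t < nearest):
                    nearest, hit = t, (center, radius)

            if hit is None:
                row.append(0.0)            # background
            else:
                center, radius = hit
                p = tuple(di * nearest for di in d)
                n = tuple((pi - ci) / radius for pi, ci in zip(p, center))
                # Single diffuse term against one light -- no bounces at all.
                row.append(max(0.0, -sum(ni * li for ni, li in zip(n, LIGHT_DIR))))
        image.append(row)
    return image

Full RT would take that shading step and fire more rays from the hit point (reflections, refractions, shadows), and that's where the cost explodes.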

But like JDJ mentions, it's still dependent on the devs and where the money and motivation are.
 


OK, one last one: RT can be manipulated, the picture cannot. In order to play a game with 2-4 megapixel images of every view, you would need terabytes of data. Even rotating around an object with the exact same spacing through 360 degrees, and saying you didn't require images for fractional degrees, would require 360 pictures for each object at whatever bit depth you needed for the game. That's an incredible load just for that one object, let alone a bunch in a room at various distances; then add effects like smoke, and you need a whole different folder/batch for pre- and post-explosion/fire, etc., and then you need pictures for lighting changes, and so on. It's just impractical even for a one-room game, let alone a whole game world. If you create render rules (either for raster or RT) then you don't need a ton of pictures; you can just say "render this object and its lighting based on it being a pot, with this wavelength of colour, these reflective properties," and then have it inserted into the view/reflection calculations.
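Quick back-of-envelope on that in Python; every figure here is just an assumption I picked to show the scale, and picking your own numbers only makes it worse.

# Storage for the "just photograph every view" approach -- all figures
# below are illustrative assumptions, not from any real game.
megapixels = 3          # ~2-4 MP per view, as above
bytes_per_pixel = 3     # uncompressed 24-bit colour
views_per_object = 360  # one photo per degree, a single orbit, no fractions
objects = 100           # one modest room
states = 4              # e.g. pre/post explosion plus two lighting changes

total_bytes = (megapixels * 1_000_000 * bytes_per_pixel
               * views_per_object * objects * states)
print(f"{total_bytes / 1e12:.1f} TB")
# -> ~1.3 TB, and that's still only one orbit at one elevation,
#    with no zoom, no smoke, and no motion.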

BTW, RT graphics are usually what you see in movies, but they are using render farms that take seconds to generate one frame, so trying to get that level of realism down to real time on a desktop is a little difficult, and that's the thing holding us back.

Anywhoo, gotta fly...

 

Dekasav

Distinguished
Sep 2, 2008
Just to mention, I think it took 6 hours to render one frame of "Cars" on the render farm, and Cars wasn't completely ray-traced; they still used rasterization for non- or less-reflective surfaces.

I think TGGA's prediction of 2015 is reasonable, at least for the first real-time ray-traced things to come out.
 

xaira

Distinguished
I think I read somewhere that LRB is roughly the power of a current GTX 285, and when it releases the 485 might already be available. Ray tracing looks pretty cool, but I was under the impression that the main reason for LRB was that Intel saw the CPU's days as the PC's powerhouse coming to an end, with GPU-based processing taking over. Whatever the reason, more competition in any market is better for the consumer.
 

xc0mmiex

Distinguished
Dec 3, 2008
As much as I love ATI, I hope Intel really does some damage with the first series of LRB... if they don't really stand out from the other two there is no reason to switch... they don't have a fan base... they have to flat-out outperform and out-price ATI/Nvidia to get their name in there... because we consumers really don't care whether it's red, green, or blue... as long as it plows through Crysis at a reasonable price.
 
It'll be interesting, as nVidia carries a huge marketing name with it, and it shows, especially with the 4xxx series and the slow inroads ATI has made with it.
ATI has basically controlled the pricing, led in that department, and doesn't show a lot for all they've done.
Then you have the ding dong ding ding thing (the Intel jingle), and Intel owns there.
 

darkvine

Distinguished
Jun 18, 2009



It will be hard to out-price ATI while still offering a great card. And again, this is Intel... not the best known around for graphics. And the taste of their last failed card is still strong in the mouths of those who know better.

Should be fun to see what happens, but I know I am sticking with ATI unless the performance AND price are both better, because if even one of those falls short I still want to support ATI as a company I believe in, one offering both power and price.
 

leon2006

Distinguished
It's a wait-and-see how this new hardware will pan out.

With ATI and NVIDIA pushing out new-generation cards every 6 or 8 months, Intel's solution won't necessarily address high-end gaming applications.

Intel will strengthen its hold on the average or mid-range segment.

Math-intensive calculation will provide a great advantage for Intel. Users won't need to pay for an additional physics card; it's already there with the CPU.

Software support will play a big role in this.