Larrabee versus ATI/Nvidia: are we getting more choices?

Upendra09

Distinguished
Mar 4, 2009
3,174
0
20,960
52
So Larrabee is coming soon and Nvidia and ATI don't seem to mind. Does Intel actually pose a threat, or is Larrabee just going to be advanced integrated graphics?

And what does Larrabee have over Nvidia/ATI, and vice versa?

I know this might read like trolling, but it gets me some good info
 
Yeah, the last time Intel tried their hand at dedicated graphics cards they flopped, and they flopped big. Of course this time they actually intend to put some effort into it and not do it half-assed. Anyway, I wouldn't expect too much out of Larrabee. At best it will be adequate. What's important is what Intel learns from it and what they put into what comes after.
 

darkvine

Distinguished
Jun 18, 2009
363
0
18,810
7
I think that these cards are aimed more at people who will buy them simply because they know the brand, the kind of people who either just started getting into computers or are buying from an OEM like Dell or HP.

I am not saying these cards won't be powerful, no one can say for sure, but I think they would have to be amazingly powerful or bring something big to the table in features before they win over any of the ATI/Nvidia fanboys.

Then we also have to take into account that this is Intel we are talking about. They are not the best at pricing, so when it comes down to it we could see more people sliding to ATI simply because "oh look, another overpriced company". I don't think ATI and Nvidia are worried, because they know people will buy their cards, and even if Intel does come out with something amazing, ATI and Nvidia will already have you hooked with their new lineups.
 

Techno-boy

Distinguished
Dec 5, 2008
357
0
18,810
10
Larrabee could be very interesting, especially when it comes to real-time ray tracing and being able to support future DirectX versions without having to buy a new video card. These are the things that keep me focused on Intel Larrabee, though Larrabee's performance might be slower than current ATI and NVIDIA video cards. It would surprise me if Larrabee really delivers real-time ray tracing, and if there will be real-time ray traced games in the near future that put the rasterization graphics era to an end (I know it is unlikely to happen soon). However, I heard that Intel will release the PC game Project Offset/Meteor, which could potentially use real-time ray tracing and act as a demo to show what Intel Larrabee is capable of.

Another interesting thing about Larrabee is that it is a video card for general use, and it lets users program it for specific purposes. Larrabee's functionality could be defined in software, rather than being a set of fixed functions that are permanent and cannot be changed, like the ones in ATI and NVIDIA cards.
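A rough way to picture "programmable instead of fixed function" is a renderer that accepts the shading step as an ordinary function the developer supplies, so swapping effects is a software change rather than a hardware one. This is a toy sketch in plain Python, not actual Larrabee code; the `render_row`, `gradient` and `threshold` names are made up for illustration.

```python
def render_row(width, shader):
    """Shade one row of pixels with a caller-supplied shading function.

    In a fixed-function pipeline the equivalent of `shader` is baked into
    the hardware; here it is just a callable, so developers can replace it
    at will -- the point being made about Larrabee's software pipeline.
    """
    # u runs from 0.0 at the left edge to 1.0 at the right edge.
    return [shader(x / (width - 1)) for x in range(width)]

# Two interchangeable "shaders": a linear gradient and a hard threshold.
gradient = lambda u: round(255 * u)
threshold = lambda u: 255 if u > 0.5 else 0

print(render_row(5, gradient))   # [0, 64, 128, 191, 255]
print(render_row(5, threshold))  # [0, 0, 0, 255, 255]
```

Swapping `gradient` for `threshold` changes the output without touching `render_row`, which is the kind of flexibility a fully software pipeline would give devs.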

Anyway, let's hope that it will offer many new good things that benefit us.
 

cjl

Splendid
I'm really optimistic about Larrabee actually, and if it performs comparably to the higher end cards when I'm looking to replace my 4870x2s, I'd definitely try it. Right now, it's hard to say though, since there hasn't been a ton of info.
 

Helloworld_98

Distinguished
Feb 9, 2009
3,371
0
20,790
4
^ Nvidia would be out of the business because they have nothing to fall back on, unlike AMD; if AMD gets owned they can fall back on their CPUs, chipsets and IGPs. Nvidia can't do that, as they've pretty much been kicked out of the Intel chipset market and have no x86 license.

x86 seems like a downgrade, but because Larrabee is so powerful it should be more of an upgrade, although the rumoured 300W TDP could put off customers, because who wants essentially a small heater in their case?
 

JAYDEEJOHN

Champion
Moderator
@ SS, naw, just a slow drawn-out one.
LRB is intended to do everything CUDA claims to do, and may do it better, as well as being a gfx card; that's why there's more pressure on nVidia here with LRB.
I agree with helloworld here: if LRB indeed picks up a lot of x86 devs, and finds its way into a major console, there's trouble ahead for both nVidia and ATI.
The difference here is, LRB and whatever ATI/AMD brings to the table in 2012-ish is going die-side, or fusioned, which will leave nVidia out in the cold like the ugly red-headed stepchild.
Another scenario, and the one I see as most likely, is that lower-class cards simply disappear as fusion happens, leaving only the higher/high end open for discrete. Though, listening to devs lately, they all sound like making a game is becoming too expensive, so progress there will be stalled for quite some time, and it'll only be the renegade rogue camps that push the bubble.
The devs are already outsourcing their artwork for games, and making them cheaper this way has stalled as well. Unless the adoption of DX11, and being able to use it, reduces man-hours while still bringing higher eye candy etc., it'll stall, though I still see an end to DX9.
One positive thing I am hearing is that devs may actually start putting in better story lines with better, larger worlds, but all this is speculation, and we won't know till it happens.
Keep in mind, for every LRB sold, that's one less sale for nVidia or ATI.
 

Helloworld_98

Distinguished
Feb 9, 2009
3,371
0
20,790
4
My guess would be:

Larrabee succeeds > Nvidia is out of the business. AMD either A) has a 28nm competitor which is more powerful and can execute x86-64 instructions within a year, or B) creates a fused CPU/GPU which is 75%+ as powerful as Larrabee and costs 20% less.

Larrabee fails > Nvidia is saved for the moment, and everything else carries on as usual.
 

JAYDEEJOHN

Champion
Moderator
A lot is being put on Intel's competence with LRB, since it's x86.
It kills me to hear some devs say gaming has to change, it's too costly, talking about DX9, praising the SW approach of CUDA and x86, while not mentioning DX compute, DX10 or DX11, or the whole process of gpus slowly going from completely fixed-function units to non-fixed-function compute shading on a much more open SW solution using DX11.
I think their point is, the process has been slow, it's actually cost them, and M$ isn't really involved monetarily like Intel will be, thereby getting a better commitment from Intel, and that's why they're wanting Intel to drive the direction of gaming instead of M$.
Problem is, Tim Sweeney has wanted this since before it was possible; now that it's getting close, he and his ilk are foaming at the mouth for it, and totally disowning the path that's made them their fame and fortune.
 

Techno-boy

Distinguished
Dec 5, 2008
357
0
18,810
10


Ray tracing means rendering graphics by tracing light rays. In physics, light rays bounce/reflect off the 3D objects we see, which is what gives an object its colour, and shiny 3D objects also reflect other 3D objects. Ray tracing gives photo-realistic graphics that are many times better than the rasterized graphics in current video games; it would look very real, and it would be a major step forward in graphics. :bounce:

However, the idea of ray tracing isn't new. It is used heavily in realistic Hollywood animation, but it is not used in video games yet because it would run very slowly on current video cards; doing it interactively takes an enormous amount of GPU power, which is what "real-time ray tracing" means. Real-time ray tracing is the idea of using ray traced graphics in a real-time 3D virtual world, like in video games where you can move around the place freely with the protagonist character, but like I stated earlier, it would require a lot of GPU processing power. :bounce:
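The core operation behind all of this is casting a ray from the eye and testing what it hits. Here is a toy sketch of that one step, a single ray against a single sphere, in plain Python; this is an illustration of the math, not Larrabee code, and the `ray_sphere_hit` name is made up. A real renderer repeats this for millions of rays per frame, plus bounces, which is why it needs so much GPU power.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    which reduces to the quadratic a*t^2 + b*t + c = 0.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None      # only hits in front of the eye count

# Cast one ray straight down the z-axis at a sphere 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

The ray enters the sphere one radius before its center, so the hit distance is 5 - 1 = 4. Shading, shadows and reflections all come from firing more rays from that hit point.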

Real-time ray tracing is something we could not easily ignore, and it would change the graphics era dramatically. It is also possible that Intel Larrabee will support real-time ray tracing, and this is why I am keeping my eyes on it. Anyway, this doesn't depend only on Intel but also on game developers actually making real-time ray traced games. :D

I just hope it won't be that right after we buy an ATI DX11 card or an NVIDIA GT300 DX11 card, news suddenly pops up saying Intel has released a real-time ray tracing card, it is the beginning of the end of the rasterization era, and we are stuck with obsolete rasterized DX11 video cards. :cry:


Example: [image comparing a ray traced scene to a rasterized one; not preserved]

 

JAYDEEJOHN

Champion
Moderator
But then, there are many here who don't think things like SSAO or HDAO mean much when it comes to eye candy, and still want DX9 over DX10 or 10.1, or don't have much liking for DX11.
It's about what the devs want to put into it.
 

Techno-boy

Distinguished
Dec 5, 2008
357
0
18,810
10


I got that comparison from somebody's blog, but I am not sure what you meant by no shadowing and no depth.

Is that still a true ray traced graphic, or did you mean it was badly done by some noobs? :??:
 

JAYDEEJOHN

Champion
Moderator
Things to bear in mind while looking at that RT image: it's too perfect.
Where are the fingerprints? Doing things like fingerprints, or a smudge, in RT is going to really cost.
Imagine a racing game, with crunching and dirt flying...
 

JAYDEEJOHN

Champion
Moderator
The reflections on raster are much more difficult, especially in concave scenarios, but can be done.
Silver can still be silver, and not some butt-ugly gray. There's simply no depth to the saucer, and the placing of a few things is poor, but hey, the cup looks OK heheh.
 

Techno-boy

Distinguished
Dec 5, 2008
357
0
18,810
10


That has more to do with texture rendering or texture mapping, like when you see rust on an old steel door. Even 3D Studio Max allows textures like dirt, rust, fingerprints and so on. However, game devs can program Larrabee to do whatever they want, since Larrabee would allow that, so it is going to be more specific, and perhaps also more general-purpose, than the current cards from ATI or NVIDIA. Different game devs could have different specific functions on Larrabee; they just have to program them.

It is still going to be closer to photographs than current rasterized graphics, but it cannot be as perfect as reality for the time being. :)
 
