GeForce GTX 580 And GF110: The Way Nvidia Meant It To Be Played

Nice cards, but I think everybody is missing a really important point: why upgrade to the newest video cards when the games aren't even taxing the old ones completely? I'm running tri-SLI GTX 260s (a two-year-old card) and I can still max out almost every game at 60 FPS at 1920x1080... For the moment, I see no point in upgrading. (I love these cards... I'm definitely getting my money's worth out of these things. I thought they'd be outdated by the time they were 2.5 years old. :)
 
[citation][nom]jednx01[/nom]Nice cards, but I think everybody is missing a really important point: why upgrade to the newest video cards when the games aren't even taxing the old ones completely? I'm running tri-SLI GTX 260s (a two-year-old card) and I can still max out almost every game at 60 FPS at 1920x1080... For the moment, I see no point in upgrading. (I love these cards... I'm definitely getting my money's worth out of these things. I thought they'd be outdated by the time they were 2.5 years old.[/citation]

I've got a three-LCD + projector setup, and my cards (also from 2008) are definitely taxed. Try playing Metro, or maxing out the graphics on a demanding game without SLI support.
 
A GTX 580 would not be the answer for Nvidia, as it will anger those who bought 480s, since those will be shelved soon after... tsk tsk... Nvidia only has so many get-out-of-jail-free cards before fans get upset. It just shows the many failures of this current-gen GPU, unlike ATI, which released its new 6xxx cards because Cypress and its other chips did so well, and is simply revamping the line. Nvidia, on the other hand, needs to bring out a new line just to salvage a failure.
 
An iray GPU renderer benchmark would be nice.

That is a totally awesome idea! I would love that to happen.

Can you please explain in a bit more detail why not to get a Quadro card?

In a bit more detail: Quadros suck 😀

But really, they just cost a lot more to get unthrottled OpenGL performance. If your program isn't using OpenGL, or isn't flagged by the GeForce drivers for throttled performance, you will never know the difference between a Quadro and a GeForce card built on the same chip. Now, at the very high end, you're going to notice that Quadros carry a lot more RAM than GeForce cards. Without iray, all that RAM was wasted: even the most complex 3D scenes (billions of polygons, huge textures, etc.) will use up to 1 GB of RAM. If you need more than that (Hollywood? or, more likely, a very unoptimized scene), you're going to find that Quadros outrun GeForce cards on the sheer amount of memory: 4 GB vs. 1.5 GB.

As far as I can tell, iray is the first and only GPU-intensive application that uses lots of RAM but can't efficiently fall back on system RAM, so you basically need your regular system RAM amount on your GPU. To be clear, I'm saying the bus connecting system RAM to the GPU is slow, not that iray is poorly written.
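
To make that concrete, here's a minimal sketch (assuming pycuda and an Nvidia card are available; the 1 GB scene figure is just the estimate from above, not a measured value) of how you might check whether a scene fits in free VRAM before handing it to a GPU renderer:

```python
# Minimal sketch: check whether a scene's estimated footprint fits in free
# VRAM before committing to a GPU renderer such as iray. Assumes pycuda is
# installed; the 1 GB scene estimate is just the figure from the post above.
import pycuda.driver as cuda

cuda.init()
ctx = cuda.Device(0).make_context()  # first GPU in the system
try:
    free_bytes, total_bytes = cuda.mem_get_info()
finally:
    ctx.pop()  # always release the context

scene_estimate = 1 * 1024**3  # ~1 GB: polygons + textures, per the post
print(f"GPU memory: {free_bytes / 1024**3:.2f} GB free "
      f"of {total_bytes / 1024**3:.2f} GB total")
if free_bytes < scene_estimate:
    print("Scene likely won't fit in VRAM; a 4 GB Quadro would help here.")
```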

I don't know of any DirectX app that is throttled on GeForce (Max/AutoCAD).

And last but not least: if you're not doing 3D, you need a Quadro even less. You may not even need a discrete GPU then.

How many of those would benefit from GeForce cards, and not only from Quadros?

AutoCAD, Max, and Adobe CS5 (with a hack: http://www.studio1productions.com/Articles/PremiereCS5.htm). If you don't know about it: it's basically a .txt file that you open in Notepad, and on the last line you type in "GTX 460", for example, to have your unsupported card get supported.
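
For anyone who'd rather script that edit than do it by hand, here's a minimal sketch of the same hack. The install path below is an assumption for a default Windows setup, and the exact card string should match what your driver reports; the linked article has the details:

```python
# Minimal sketch of the CS5 hack described above: append your card's name to
# Premiere's whitelist so the Mercury Playback Engine accepts it. The path
# is an assumption for a default Windows install; check the linked article
# for the exact location and the exact string your driver reports.
from pathlib import Path

whitelist = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS5"
                 r"\cuda_supported_cards.txt")
card = "GeForce GTX 460"  # must match the name the Nvidia driver reports

lines = whitelist.read_text().splitlines()
if card not in lines:
    lines.append(card)
    whitelist.write_text("\n".join(lines) + "\n")
    print(f"Added {card!r} to {whitelist}")
else:
    print(f"{card!r} is already whitelisted")
```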

P.S. Last year we tried using ATI HD 4000-series cards instead of GeForce (for the price), and saw a similar lack of performance boost...

Personally, I'm still running an HD 4850. I'm a Max user, so I can only say that it's working flawlessly here (Max 2009/10/11). I will upgrade to a GTX 460 to get Premiere MPE support and iray capability.
 
Well, after reading the comments, I see a lot, and I mean a lot, of fanboys waving the flags again...

My opinion: nice card, nice performance, absurd price...

I'll take my money elsewhere, especially in that price range...
 
[citation][nom]cigarjohn[/nom]What I don't understand is that lately, all websites, including Tom's Hardware, are using the new Intel six-core processor in their tests, and they all seem to be using RAM frequencies over the 1333 MHz level, whereas the i7-980X will automatically downclock the frequency to 1066, and the user has to overclock to reach 1333. And that's the max the processor will allow; it will not allow anything over the 1333 MHz barrier. And I see companies using 2000 MHz RAM. Why? They should all have the RAM matched to the frequency the processor will allow. Tom's Hardware, I want a response to this comment.[/citation]

Here's your response; sorry it's late--I just saw it :)

The execution cores are overclocked to 3.73 GHz here, and the memory ratio is set to a higher-than-stock data rate as well (in this case 1333 MT/s, as opposed to the processor's highest native setting of 1066 MT/s). This is done to combat potential bottlenecks; see the rough numbers below. Thanks!

Chris
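
For a rough sense of what that buys, assuming the i7-980X's stock triple-channel, 64-bit (8-byte) DDR3 interface, peak theoretical memory bandwidth works out to:

```latex
% Peak theoretical bandwidth: channels x data rate x bytes per transfer
\[
B_{1066} = 3 \times 1066\,\mathrm{MT/s} \times 8\,\mathrm{B} \approx 25.6\ \mathrm{GB/s}
\]
\[
B_{1333} = 3 \times 1333\,\mathrm{MT/s} \times 8\,\mathrm{B} \approx 32.0\ \mathrm{GB/s}
\]
```

So the bumped memory ratio buys roughly 25% more headroom before memory becomes the bottleneck.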
 
[citation][nom]nate007[/nom]Where are the 6870/6850 CrossFire reviews? I'm thinking Tom's (Chris?) is not interested in this, as it would have been up by now. Oh well, no point doing the article anymore, as there is more than enough coverage of this on other sites. It's a shame; Tom's used to be the first place to come for the lowdown, not so anymore. Guru3D has the best review so far on where things stand in graphics land using SLI and CrossFire: http://www.guru3d.com/article/gefo [...] i-review/1 Just picked up my second 6870 today, and I'm very happy knowing I have a video card setup that lays the beatdown on all the other cards, including the 5970 and GTX 580, for a lot less money, which is going towards a new Samsung Galaxy Captivate S.[/citation]

Being worked on as we speak! =)
 
The GTX 580 hits the performance nail on the head for a few simple reasons. The 5970 can only be paired with one other board in CrossFire, because each board already carries two GPUs. A single 580 comes relatively close to the dual-GPU 5970, and in SLI it absolutely blows the 5970 out of the water. Now put two 5970s in CrossFire against four 580s in SLI: same number of GPUs, no comparison. Although the 580's price point is relatively high, when it comes to GPU-for-GPU performance, the 580 takes the cake.
 