
Nvidia GeForce GTX 970 And 980 Review: Maximum Maxwell



I normally would too, but my two 580s have served me well for a while now, and the noise & power
consumption are starting to bug me. 😀 Plus, I've managed to find a buyer for one of them for 175 UKP
which helps offset the cost of a 980.

Re 560 Ti SLI vs. 580; I had the former setup for a while. I upgraded precisely because the 560 Ti's 1GB
just wasn't enough to handle Crysis the way I wanted to play it. 1.5GB 580 SLI helped quite a bit, but then I
switched to 2x 3GB 580 (sold the 1.5GB cards for a small profit) and that fixed the VRAM issue completely. If
VRAM capacity isn't a factor then sure the value option is obvious, but as I say some of the games I like to
play stutter too much with multiple GPUs, Stalker in particular (maybe it's something to do with games that
use deferred rendering, who knows).

Ian.

 
My son's 560 Tis are 2 GB ...

http://www.asus.com/Graphics_Cards/ENGTX560_Ti_DC2_TOP2DI2GD5/

nothing has yet shown increased performance above 2 GB at 1920 res.

http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/

I played all the STALKERs just fine on twin 780s.... my son recommended them after playing on his twin 560 Tis. STALKER likes faster RAM and PCIe bandwidth, I noticed.
 
JackNaylorPE writes:
> My son's 560 Tis are 2 GB ...

Ah I see! In that case it'll be much better. Quite unusual to have 2GB 560 Tis; most were 1GB.


> nothing has yet shown increased performance above 2 GB at 1920 res.

Dunno where you get that from. Modded Skyrim easily blows past 2GB at HD, and I certainly
noticed a difference with Crysis, but then as I say that's because I've gone way beyond just
maxing out the normal in-game options. I've tweaked the sucker to shove the LOD distances
waaay out, etc., so for example enemies and objects can be seen much further away, ditto
vegetation, shadows, etc. Not that surprising to find most games' standard settings won't
gobble more than 2GB, but it's wrong to say it's not useful. As soon as one starts customising
a game, more can definitely help (and why anyone would play a game like Crysis and not
customise it is beyond me, the results are very much worth the effort).


> the STALKERs just fine on twin 780s....

I'm not surprised. 😀 I get good frame rates just with two 580s. Two 780s would be sweet...


> ... STALKER likes faster RAM and PCIE bandwidth I noticed.

Hmm, interesting, not something I've tested so far, though my RAM is at 2133 which should be ok.

Ian.


 
I agree with Ian about the Skyrim part. My older GTX 560 Ti 1GB struggled in modded Skyrim; however, in a normal run of the game without mods it seemed to do perfectly fine, with the occasional stutter and lag when I entered somewhere new.
 
Of course though there are plenty of games which don't gobble as much RAM as Skyrim does when it's been
modded. ;D But unless there's a nasty price difference, more is better IMO, as I do tend to mod all the games
I play if possible (the only game I've played so far that was lacking in that dept. has been Far Cry 2 which isn't
as customisable as Crysis). Pushing Crysis to the limit was a blast, and I was getting good results even before
deciding to crank the custom settings totally up the wazoo (I really wanted enemies, objects & vegetation to be
visible at a long range). IIRC I tried something related to long distance vegetation shadows which strained even
3GB VRAM, so I had to dial it back, something about a 2K vs. 4K map or somesuch, can't recall now.

Ian.

PS. TopLuca, do you follow the Skyrim OCN best-pics thread? Lots of talk there sometimes about system
specs for modded Skyrim.

 
@Ian, unfortunately I don't follow that thread, but I'll make sure to check it out as soon as possible. College and the office eat up most of my time, but if you say it's worthwhile then I'll definitely check it out, so thanks Ian 😀
 


I gave the link....here it is again:

http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/

You have to make a very clear distinction here.... never said that games wouldn't "claim or allocate" more RAM if it is sitting there doing nothing. What I am saying is there has never been a published test result where you put in a 4GB card, then yank it out and put in a 2GB card, where there was any significant drop in frame rates or any other observable impact on performance. Pay specific attention to the last 2 sentences.

Let’s only look at the games where there is more than a single FPS difference between the two GTX 770s [one 2 GB and one 4 GB].

Metro: 2033 is completely unplayable on either card at our highest resolution [of 5760 x 1080], and even GTX 770 4GB SLI wouldn’t be playable either. Sleeping Dogs has a problem actually displaying on the outer LCDs although the performance is cut, so this benchmark has to be discounted. This leaves five games out of 30 where a 4GB GTX 770 gives more than a 1 frame per second difference over the 2GB-equipped GTX 770. And one of them, Metro: Last Light still isn’t even quite a single frame difference. [Many games were also slower in 4GB]....

There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.


 


The whole idea of publishing comparative frame rates for this is stupid.
There isn't going to be some small increase in performance when comparing a card with enough memory to one that does not have enough.
When there is not enough available VRAM, performance deteriorates sharply to the point where it is unusable.
There is ample information available to show this occurs with modded Skyrim or Watch Dogs with the highest settings at 1920x1080 on a 2GB card.
 
Why you would test a reference R9 290/X is beyond me. To add insult to injury, the 970 is cherry-picked from EVGA. Another biased review from Tom's. I'm sorry I clicked to read.
 


The GTX 970 card was downclocked to reference clocks and benchmarked against reference cards from AMD. Seems like a balanced comparison to me.
 


Forget frame rates..... concentrate on what was underlined.

However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.

No performance impacts whatsoever





 


Can you provide frame variance and frame timings, please? That would help a lot in identifying the situation.
 


In the specific games with the specific settings they tested, they observed no performance impact. Obviously these games with these settings do not require more than 2GB of VRAM, hence adding extra VRAM will have no performance increase at all. In a game where more than 2GB is required the performance impact is severe.
 


Read it again.

1. The game in question DOES require more than 2 GB

There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.

2. They are using 5760 x 1080, not 1920 x 1080.... that's 3 times 1920 x 1080.... what game are you referring to that needs 3 times the 2750 MB of Max Payne?
 


2750MB is recommended in Max Payne for those settings. This is not to say it is necessarily required.
They found in the specific benchmark they were running that 2GB was enough.

Memory usage cannot be multiplied based on the number of pixels. The same textures are required no matter what resolution is being run. If you compare VRAM usage for 4K vs 1080p, you will find it is not 4x.
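
To put very rough numbers on that (purely illustrative figures, not measurements from any game): the resolution-dependent render targets grow with the pixel count, but the texture pool stays the same, so the total is nowhere near 4x going from 1080p to 4K. A quick back-of-the-envelope sketch:

    # Back-of-the-envelope VRAM sketch. All figures are illustrative assumptions,
    # not measurements: render targets scale with resolution, the texture pool does not.
    def render_target_mb(width, height, bytes_per_pixel=4, targets=3):
        # assume back buffer + depth buffer + one intermediate target, 32-bit each
        return width * height * bytes_per_pixel * targets / (1024 ** 2)

    TEXTURE_POOL_MB = 1500  # made-up stand-in for a game's loaded texture set

    for name, (w, h) in [("1920x1080", (1920, 1080)), ("3840x2160", (3840, 2160))]:
        rt = render_target_mb(w, h)
        print(f"{name}: targets ~{rt:.0f} MB + textures {TEXTURE_POOL_MB} MB = ~{rt + TEXTURE_POOL_MB:.0f} MB")

With those made-up numbers, 4x the pixels only adds roughly 70 MB of render-target memory on top of the same texture pool, which is the point: the texture set dominates, and it doesn't change with resolution.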

I had already mentioned two specific examples in my earlier post, Skyrim with mods and Watch Dogs with highest detail textures.
I haven't played Watch Dogs. This comes from a guide published by Nvidia:
http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide#textures
I've seen the issues in Skyrim myself on my GTX 770 2GB card and there are plenty of forums around the web with people trying to diagnose similar issues. The very high resolution textures included with some mods simply require more VRAM.
 


So when a manufacturer says 4 GB is required, it's used to support the argument for more memory and it's valid, but when someone tests the validity of that statement and proves it erroneous, it's not?

It is a well known and very basic fact that higher resolutions require larger memory amounts..... if we can't agree on this, it's going to be hard to have a relevant discussion. Try running your Skyrim mods at 640 x 480. The GFX card controls pixels, and those textures are made up of pixels; fewer pixels = less work.

Even in the old days when it was a bit simpler:

Memory req'd (bytes) = resolution (total pixels) x color depth (bits) / 8
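
To put rough numbers on that old formula (frame buffer only, textures and everything else excluded): a 32-bit frame at 1920 x 1080 works out to 1920 x 1080 x 32 / 8, or about 8.3 MB per frame buffer, and at 5760 x 1080 it's three times that, about 24.9 MB.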
 


The benchmarks you linked don't prove that 2750MB is not required to run the game at that resolution, simply that in the specific benchmark they ran they did not observe any performance loss. The developers suggest that 2750MB is required and I wouldn't bother trying to run the game at that resolution with less. You could easily encounter some part of the game that requires more memory than was needed in that specific benchmark.

The Nvidia article on the other hand shows performance degradation when less than 3GB of memory is used. If you could find some specific part of the game which was OK with less than 3GB, that would not mean that less than 3GB was suddenly ok.

I haven't said that higher resolutions do not require more memory, but that the memory usage is not a multiple of the number of pixels.

This was never meant to be an in depth discussion and it is not the topic of the thread. You made the statement "nothing has yet shown increased performance above 2 GB at 1920 res" and I have given you two specific examples where this is not true.
 
And not a word about the amperage requirements on the 12V rail..
While people are burning these cards because their PSUs don't suffice..
 

?

Who is burning these cards? They require far less power than even last-gen GPUs.

 


Yea, they use just over half the power of my HD 7970, while performing far better.
 