GTA V frame drops with a GTX 970

SkinheadRed
Dec 31, 2015
I recently got a new computer, and everything was working just fine. I was playing GTA V almost on max settings at 60fps. All of a sudden my frames started to drop to 30fps and then to 11fps for no apparent reason. I've tried other games to see if they were also affected, but The Witcher 3 still ran fine, and so did Middle-earth: Shadow of Mordor. Is there anything that could have caused the sudden frame drop?

P.S. Here is my GPU usage.
http://s1376.photobucket.com/user/falloutgd/media/GPU_zpseptrcyqp.gif.html?sort=3&o=0
 
What's your CPU? GTA V is notorious for hitting CPU bottlenecks. Another very possible problem, judging by your VRAM usage (3433 MB), is that you're occasionally going just over 3.5GB, and thus into the remaining 512MB of VRAM that is ~20x slower than the rest (the GTX 970 is effectively only a 3.5 GB card at full speed, despite the advertised 4 GB).
http://www.pcgamer.com/why-nvidias-gtx-970-slows-down-using-more-than-35gb-vram/
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970
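To see why even a small spill past 3.5 GB hurts, here's a back-of-the-envelope model. The segment split is real (3.5 GB fast + 0.5 GB slow), but the bandwidth figures below are rough illustrative assumptions, not measurements:

```python
# Toy model: effective memory bandwidth on a split-VRAM card like the GTX 970.
# Segment sizes are real (3.5 GB fast + 0.5 GB slow); the bandwidth numbers
# are rough illustrative assumptions, not measured values.

FAST_GB, FAST_BW = 3.5, 196.0   # GB, GB/s (assumed)
SLOW_GB, SLOW_BW = 0.5, 28.0    # GB, GB/s (assumed)

def effective_bandwidth(used_gb: float) -> float:
    """Weighted-average bandwidth if `used_gb` of VRAM is streamed uniformly."""
    if used_gb <= 0:
        return FAST_BW
    fast = min(used_gb, FAST_GB)
    slow = max(0.0, used_gb - FAST_GB)
    # Time to stream each portion once; effective BW = total data / total time.
    time = fast / FAST_BW + slow / SLOW_BW
    return used_gb / time

for gb in (3.0, 3.4, 3.6, 4.0):
    print(f"{gb} GB in use -> {effective_bandwidth(gb):.0f} GB/s effective")
```

In this toy model even a 100 MB spill into the slow segment drops effective bandwidth noticeably, which fits the "fine most of the time, stutters when usage creeps over the line" pattern.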
 
1. I have systems with twin 780s, twin 970s and twin 980 Tis .... every one of them has the same issues with GTA V. The problems remain whether SLI is on or off, and whether CPU and GPU are at stock or OC settings.

2. Pay no attention to the memory usage reported by GPU-Z. It's simply wrong. There's absolutely no issue with regard to insufficient VRAM at 1080p or even 1440p. It gets old retyping the explanation, so I'll just use this:

http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x

GPU-Z: An imperfect tool

GPU-Z claims to report how much VRAM the GPU actually uses, but there’s a significant caveat to this metric. GPU-Z doesn’t actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly, whether it’s GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”

As for GTA V:

[attached image: GTAV-RAM1.png, GTA V VRAM usage chart]


Now remember, that is the amount of RAM requested based upon what the game saw was available, not the amount actually used. Think of it like this .... you apply for a credit card and, based upon your financial info, the CC company gives you a limit of $5,000 to draw from. That doesn't mean you have to use it or ever will.
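The credit-card analogy can be put in code. This is purely a toy illustration (the class, names and numbers are all made up, real drivers are far more complex): a game reserves a budget based on what's available, but only commits what it actually touches, so tools reporting the "requested" number read high on large-VRAM cards.

```python
# Toy illustration of "requested" vs. "actually used" VRAM.
# All names and numbers here are hypothetical.

class ToyDriver:
    def __init__(self, total_mb: int):
        self.total_mb = total_mb
        self.requested_mb = 0   # what monitoring tools tend to report
        self.touched_mb = 0     # what the game actually writes to

    def request(self, mb: int) -> None:
        # Games often reserve generously simply because memory is available.
        self.requested_mb += min(mb, self.total_mb - self.requested_mb)

    def touch(self, mb: int) -> None:
        # Actual usage can never exceed what was reserved.
        self.touched_mb = min(self.requested_mb, self.touched_mb + mb)

card = ToyDriver(total_mb=4096)
card.request(3500)      # reserve a big pool because it's there...
card.touch(2100)        # ...but only stream in what the scene needs
print(card.requested_mb, card.touched_mb)  # 3500 2100
```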

The best test of this scenario was alienbabeltech's test of Max Payne on 2GB and 4GB 770s at 5760 x 1080. The game would not allow the 5760 x 1080 res to be selected with the 2GB card installed, so they took it out, put in the 4GB card and ran benchies.... Then they put the 2GB card back in and reran the same benchies.... with the game fooled into thinking 4 GB was there, it ran at the same fps, at the same quality, with no observable impact on performance. The site is down, but you can see the results of tests in about 40 games here:

https://www.youtube.com/watch?v=o_fBCvFXi0g

Other tests including the non-impact of the 3.5 here:

https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

We see in your image that the performance is being capped by Utilization, not temps, voltage or power.

vRel = Reliability. Indicating perf is limited by reliability voltage.
VOp = Operating. Indicating perf is limited by max operating voltage.
Pwr = Power. Indicating perf is limited by total power limit.
Thrm = Thermal. Indicating perf is limited by temperature limit.
Util = Utilization. Indicating perf is limited by GPU utilization.
 

Yeah, the take home message is that if you're not actually using more than 3.5GB of VRAM you won't see any performance issues related to VRAM usage (anyone could have told you that without doing any testing whatsoever!). When you do need more than 2 GB (or 3.5 GB, etc.), however... (especially see the 960 2GB vs 960 4GB performance):
http://www.gamersnexus.net/game-bench/2195-assassins-creed-syndicate-gpu-fps-benchmarks
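One way to keep an eye on VRAM pressure yourself is to poll nvidia-smi while playing and log the numbers. A sketch: the query flags are real nvidia-smi options, but the canned sample line stands in for a live call, and what the tool reports is still allocation rather than true usage, so treat it as an upper bound:

```python
# Parse output of:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
# into MB values. A live poll would use subprocess.check_output on that
# command; here a canned sample line stands in for it.

def parse_memory_line(line: str) -> tuple[int, int]:
    """Return (used_mb, total_mb) from one CSV line of nvidia-smi output."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total

sample = "3433, 4096"          # hypothetical reading near the 3.5 GB line
used, total = parse_memory_line(sample)
print(f"{used} MB of {total} MB ({used / total:.0%})")
```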

 
The problem is you don't use more than 3.5 GB in anything at 1080p and 1440p when doing anything "normal" .... by the time you up the settings to a point where VRAM becomes an issue, your fps is already at an unsatisfactory level. So what if you run out of VRAM at settings at which your GPU is delivering only 23 fps?

From the link below; the take-home message is quoted underneath.

http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

Thing is, the quantifying fact is that nobody really has massive issues; dozens and dozens of media outlets have tested the card with in-depth reviews like the ones here on my site. Replicating the stutters and stuff you see in some of the videos, well, to date I have not been able to reproduce them unless you do crazy stuff, and I've been on this all weekend.

At 2560x1440 I tried filling that graphics memory, but most games simply do not use more than 1.5 to 3 GB at that resolution combined with the very best image quality settings. This includes MSAA levels of up to 8x. At the best settings and WHQD we tried Alien Isolation, Alan Wake, BioShock Infinite, Hitman: Absolution, Metro Last Light, Thief, Tomb Raider and Assassin's Creed Black Flag.

Can you create a condition where this occurs? Of course you can ... like those youtube videos where people have Windows autostart 30 programs to show what a big wow their SSD is .... or opening their browser with 25 tabs ... it just doesn't affect what many of us like to call "real life". As the author goes on to say:

So if you start looking at that resolution and zoom in, then of course you are bound to run into performance issues, but so does the GTX 980. These cards are still too weak for such a resolution combined with proper image quality settings. Remember, Ultra HD = 4x 1080P. Let me quote myself from my GTX 970 conclusions: “it is a little beast for Full HD and WHQD gaming combined with the best image quality settings”, and within that context I really think it is valid to stick to a maximum of 2560x1440, as 1080P and 1440P are the real domain for these cards. Face it, if you planned to game at Ultra HD, you would not buy a GeForce GTX 970.

Well the "so does the 980" part sure blows the hell outta the 3.5 GB argument.
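The "Ultra HD = 4x 1080P" line is easy to verify with pixel counts; the per-pixel byte figure below is just an illustrative assumption for one 32-bit render target, not a claim about any particular game:

```python
# Pixel-count arithmetic behind "Ultra HD = 4x 1080P".
res_1080p = 1920 * 1080      # 2,073,600 pixels
res_1440p = 2560 * 1440      # 3,686,400 pixels
res_4k    = 3840 * 2160      # 8,294,400 pixels

print(res_4k // res_1080p)   # 4   (4K pushes 4x the pixels of 1080p)
print(res_4k / res_1440p)    # 2.25

# Rough size of a single 32-bit (4 bytes/pixel) render target at each res:
for name, px in [("1080p", res_1080p), ("1440p", res_1440p), ("4K", res_4k)]:
    print(f"{name}: {px * 4 / 2**20:.1f} MB per 32-bit buffer")
```

Individual buffers are small; the point is that every render target, G-buffer and post-processing pass scales with that 4x factor, which is why shader load, not VRAM capacity, gives out first on these cards.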

So the two titles that do pass 3.5 GB (without any tricks) are Call of Duty: Advanced Warfare and, of course, the one most reported to stutter, Middle Earth: Shadow of Mordor. We measured, played and fragged with COD, and there is just NOTHING to detect with the graphics memory fully loaded and in use.

As for GamersNexus, I'm not gonna click it, but guaranteed it refers to an Assassin's Creed title. Unity had so many problems and was so "broken" that they offered all users free DLC as a way of apology.

Read up on the implications behind the problems with the AC series. And I always suspect performance tests in games whose engine has a built-in performance cap.

http://www.pcinvasion.com/assassins-creed-unity-pc-port-impressions-minimum-specs-edition

Syndicate has gotten similar complaints. And, more importantly, while the 2GB / 4GB testing was applicable to the games being played when the 7xx series was out, no one has suggested that 2GB is enough for the 9xx series at ultra settings in the games that have come out since its release. Trying to use a 960 on a demanding game with ultra settings is asking a bit too much.

Again, as I said above ... well, ExtremeTech said it better in the link above, and they have 5 pages of data explaining why this question has been put to bed.

We began this article with a simple question: “Is 4GB of RAM enough for a high-end GPU?” The answer, after all, applies to more than just the Fury X — Nvidia’s GTX 970 and 980 both sell with 4GB of RAM, as do multiple AMD cards and the cheaper R9 Fury. Based our results, I would say that the answer is yes — but the situation is more complex than we first envisioned.

First, there’s the fact that out of the fifteen games we tested, only four of them could be forced to consume more than 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this.

While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU.

The most we can say of a specific 4GB issue at 4K is that gamers who want to play at 4K will have to do some fine-tuning to keep frame rates and resolutions balanced, but that’s not unique to any vendor. If you’re a gamer who wants 4K and ultra-high quality visual settings, none of the current GPUs on the market are going to suit you. HBM2 and 14/16nm GPUs may change that, but for now playing in 4K is intrinsically a balancing act. The Fury X may require a bit more fine-tuning than the 980 Ti or Titan X, but that’s not grounds for declaring 4GB an unsuitable amount of VRAM in today’s games.

The games used by the way were Assassin’s Creed Unity, Battlefield 4, BioShock Infinite, Civilization: Beyond Earth, Company of Heroes 2, Crysis 3, Dragon Age: Inquisition, The Evil Within, Far Cry 4, Grand Theft Auto V, Metro Last Light (original), Rome: Total War 2, Shadow of Mordor, Tomb Raider, and The Witcher 3: Wild Hunt.

If that isn't enough, as indicated above, we are experiencing the same problems with 3 GB 780s, 4 GB 970s and 6 GB 980 Tis. Obviously, VRAM is not the issue with this title.

 