Most of the cards sold at sites like Newegg are already overclocked when they leave the factory. Non-reference cards range from a reference PCB with just a better cooler (EVGA SC series) to custom PCBs with improved components (VRMs, chokes, etc.). Then there are the special cards, like the MSI Lightning, which take the non-reference mindset to the extreme.
Which card you get will determine just how fast it is. For the 9xx series, I prefer the MSI cards... Gigabyte is slightly faster but seems to have a high dissatisfaction rate among Newegg users. Review-wise, you can look at TPU (TechPowerUp):
Gigabyte - 9.5
http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/35.html
MSI - 9.6
http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Gaming/35.html
EVGA - 9.6
http://www.techpowerup.com/reviews/EVGA/GTX_980_Ti_SC_Plus/35.html
Asus - 9.6
https://www.techpowerup.com/reviews/ASUS/GTX_980_Ti_STRIX_Gaming/35.html
And yes, I have experienced similar situations of having to cut card speed in certain games. My son uses a special set of Afterburner settings for BF4, and I set Afterburner back to the factory-OC'd settings to play Witcher 3 (normally the cards run a 26% OC in all other games) on my water-cooled SLI Asus DCII 780s. The twin-970 build (MSI 970s, air-cooled) does much better but still has to be turned down for BF4 and W3, though just a bit; it stays well over a 20% OC.
As for Batman running out of RAM, I don't see it. No test site to date has found a VRAM deficiency at up to 1440p. Misinformed users have made the mistake of concluding that just because GPU-Z reports RAM usage at a certain level, this is somehow significant. It's not; it has no bearing on how much is actually needed to prevent deterioration in performance (fps), playability (stuttering), or image quality.
This argument has been put forth for years but has never been duplicated in actual testing. It first started to gain traction as "2 vs 4 GB" with the 7xx series, where testers found Max Payne would not allow the 5760 x 1080 setting with a 2 GB card. But when they tricked it by installing a 4 GB card, setting the resolution, and then swapping in the 2 GB card, it played at the same fps and with the same lack of issues.
http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
This is because a game's install program sees how much VRAM is available and says "Oh cool... it has 8 GB, let me set aside 5 GB for my usage." That has nothing to do with what the game actually needs.
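That "grab it because it's there" behavior can be sketched in a few lines. This is a toy illustration only, not any real engine's code, and the percentages and MB figures are made-up assumptions:

```python
def plan_vram_budget(available_mb):
    """Toy sketch: an engine reserves a fraction of whatever VRAM
    exists on the card, independent of what the scene needs."""
    return int(available_mb * 0.6)  # grab ~60% simply because it's available

def working_set_mb(textures_mb, buffers_mb):
    """What a frame actually touches -- the number that matters for
    fps and stutter, and the one monitoring tools can't see."""
    return textures_mb + buffers_mb

# Same game, two cards: the reservation scales with the card,
# but the real working set does not.
print(plan_vram_budget(8192))     # 8 GB card -> reserves 4915 MB
print(plan_vram_budget(4096))     # 4 GB card -> reserves 2457 MB
print(working_set_mb(2200, 600))  # actual need: 2800 MB on either card
```

The point of the sketch: GPU-Z would show ~4.9 GB "used" on the 8 GB card and ~2.4 GB on the 4 GB card for the exact same scene, even though both cards only need 2.8 GB to run it without stutter.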
More here:
http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
But Extremetech did one of the better investigations which you can find here:
http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x
GPU-Z claims to report how much VRAM the GPU actually uses, but there’s a significant caveat to this metric. GPU-Z doesn’t actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly, whether it’s GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”
The article had a hard time finding games that could break that 4 GB barrier, and BAA was not among them:
When we started this process, I assumed that a number of high-end titles could readily be provoked into using more than 4GB of VRAM. In reality, this proved a tough nut to crack. Plenty of titles top out around 4GB, but most don’t exceed it. Given the lack of precision in VRAM testing, we needed games that could unambiguously break the 4GB limit.
We tested Assassin’s Creed Unity, Battlefield 4, BioShock Infinite, Civilization: Beyond Earth, Company of Heroes 2, Crysis 3, Dragon Age: Inquisition, The Evil Within, Far Cry 4, Grand Theft Auto V, Metro Last Light (original), Rome: Total War 2, Shadow of Mordor, Tomb Raider, and The Witcher 3: Wild Hunt. Out of those 15 titles, just four of them could be coaxed into significantly exceeding the 4GB limit: Shadow of Mordor, Assassin’s Creed: Unity, Far Cry 4, and Grand Theft Auto V. Even in these games, we had to use extremely high detail settings to ensure that the GPUs would regularly report well over 4GB of RAM in use.
You can read the graphs in the article, but Far Cry 4 couldn't break 4 GB @ 1440p; even the disastrously coded ACU port couldn't break 4 GB, and neither did GTA V or SoM. They had to go to 4K @ high detail to break 4 GB, and at those settings the games were unplayable.
While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings [4K] that rendered the games unplayable on any current GPU...
The most we can say of a specific 4GB issue at 4K is that gamers who want to play at 4K will have to do some fine-tuning to keep frame rates and resolutions balanced... but that’s not grounds for declaring 4GB an unsuitable amount of VRAM in today’s games.
In short, at 1080p, I don't see additional VRAM addressing your issue.