Asus GTX 780 Ti vs GTX 970 G1 Gaming


Can the GTX 970 run Crysis 3 and Watch Dogs with 8x MSAA? I saw the GTX 780 Ti running that at 1080p. I have a Corsair VS650 for it. Thanks
 
The GTX 780 Ti is faster than the GTX 970.
The GTX 970 has more VRAM, but there is no game that currently requires more than this at 1920x1080.

While you will find people claiming Battlefield 4 requires more than 3GB, this is only with Mantle.
You can see the GTX 780 Ti with 3GB of VRAM outperforming the AMD 4GB cards with Mantle in this benchmark:
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-8.html

Watch Dogs also requires at least 3GB of VRAM for ultra textures:
http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide#textures

There are also two new games coming out that recommend more than this, but they haven't been released yet, so this information may not be accurate:
http://www.tomshardware.com/news/evil-within-gold-bethesda-horror,27765.html
http://www.overclock.net/t/1515461/ipon-shadow-of-mordor-6gb-of-vram-for-ultra-textures

Note that at 1080p, the value of these very high resolution textures is questionable.

I think the GTX 780 Ti is a great card and more than enough for 1080p with a 60 Hz monitor.
While the GTX 970 is about 5% slower at stock, it has 4GB of VRAM and support for some new features.
Even without overclocking yourself, most cards are factory overclocked anyway, and manufacturers can likely get more of an increase from the GTX 970 than the GTX 780 Ti because of the chip design.
Personally, I would choose the GTX 970.
The only older card I would consider in favour of this would be the GTX 780 6GB version as I think the extra VRAM makes it even more future proof.
 
Above poster makes great points. Good info. I totally agree: 970 or 6GB 780.

My opinion: VRAM is what makes cards obsolete the quickest. Higher resolutions become affordable quickly, games become more sophisticated, textures get higher, and all of this requires more VRAM. Every generation of cards people argue that the cards have enough memory, but just 3 years ago high-end cards only had 1-1.5GB on them.
 


Thanks a lot sir, such awesome and helpful points
 
I would vote for the GTX 780 Ti; VincentP has pretty much summed up everything else I would say.

NOTE: Don't fall into the extra-VRAM trap. There are no documented, rock-solid reviews showing the higher-VRAM cards actually being worth it over the base models (for SLI I can understand it, but not for just one card). I'm currently arguing the same thing over here, but some users don't get the point -

http://www.tomshardware.com/answers/id-2316461/game-1360-768-worth-upgrading-280x-970gtx.html#14276096
 


The problem with VRAM is that it is an abrupt limit and you can't upgrade it.
If a GPU isn't quite fast enough, performance drops a little.
If you don't have quite enough VRAM, performance degrades immediately. The intermittent drops to near 0 FPS are far worse than a constant 10 FPS drop.
Conversely, extra performance always helps but more memory than you need doesn't help performance at all.
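
A quick way to see why those intermittent stalls feel so much worse than a slightly slower GPU is to compare the average FPS with the worst single frame. Here is a minimal sketch with made-up frame times (the numbers are hypothetical, purely to show the arithmetic):

    # Hypothetical frame times in milliseconds over 100 frames.
    # Card A: GPU a bit too slow, but steady (every frame takes 20 ms).
    # Card B: faster GPU, but short on VRAM, with one 500 ms stall
    # while textures are swapped in.
    card_a = [20] * 100
    card_b = [14] * 99 + [500]

    for name, times in (("steady but slow", card_a), ("VRAM stall", card_b)):
        avg_fps = 1000 * len(times) / sum(times)
        worst_fps = 1000 / max(times)
        print(f"{name}: average {avg_fps:.0f} FPS, worst frame {worst_fps:.0f} FPS")
    # steady but slow: average 50 FPS, worst frame 50 FPS
    # VRAM stall: average 53 FPS, worst frame 2 FPS

The averages look almost identical; the stalls are what you actually notice.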

I learned this the hard way, buying a GTX 770 when they were first released. Anywhere you cared to look, 2GB of VRAM was enough for 1080p. I quickly discovered, though, that Skyrim with mods could exceed this. Now Watch Dogs is out and requires at least 3GB for ultra textures. You can always scale back the settings, but this is disappointing when the GPU is fast enough and all you need is more VRAM. To make matters worse, I bought a 2560x1440 monitor.

The same can be said of system memory, although at least you can upgrade that. 8GB is enough for any game available now. When building a system to last, say, 5 years though (which should be fine for CPU, motherboard and RAM), you have to think about what the requirements will be in 4 years.

The point is, when buying a graphics card don't look at how much VRAM you need today. Look at how much you will need in two years' time.


 


That is fine in those specific games.
As I wrote earlier, adding more VRAM when you have enough will not help performance at all.

Increasing the texture size a game uses puts no additional load on the GPU at all, but does require more VRAM.
Increasing the level of anti-aliasing has an effect on both.
Lighting effects and other computationally intensive tasks increase GPU load but do not require more VRAM.
This means that the required VRAM varies between programs and isn't directly related to how fast the GPU is.
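
As a rough illustration of the texture side of this: a texture's memory footprint follows directly from its dimensions, which is why texture quality drives VRAM use without adding GPU load. A minimal sketch, assuming uncompressed 32-bit RGBA textures with a full mipmap chain (real games mostly use compressed formats, so the absolute numbers are lower, but the scaling is the same):

    def texture_vram_mb(width, height, bytes_per_pixel=4):
        """Rough VRAM footprint of one texture in MB; a full
        mipmap chain adds about a third on top of the base image."""
        return width * height * bytes_per_pixel * 4 / 3 / 2**20

    print(texture_vram_mb(1024, 1024))  # ~5.3 MB
    print(texture_vram_mb(2048, 2048))  # ~21.3 MB (double the resolution, 4x the memory)
    print(texture_vram_mb(4096, 4096))  # ~85.3 MB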
 
There are no current reviews that support that at all. According to everyone on the side of the VRAM theory, the higher resolution in that review (5760x1080) is "supposedly" where the higher-VRAM model takes advantage, though it would appear not to. I have not come across a review which shows the higher-VRAM models really offering a benefit worth going for them. All the "recommended" games that list cards like the GTX 660 3GB, 7870 3GB and GTX 760 4GB are rubbish; looking at the game reviews, the difference against the stock standard base models is next to nothing. For example, Need for Speed: Rivals recommends either a 3GB GTX 660 or a 3GB 7870.
 


You are comparing current games which don't need the extra VRAM and pointing out it doesn't increase performance.
As I've written twice already, adding more VRAM when you have enough will not help performance at all.

What you are betting on though is that VRAM requirements will not increase at all in the next two years (or however long you plan to keep the card).
As I've tried to explain, VRAM requirements can increase independent of raw processing power.
You will see no benefit from the extra memory today, but there will almost certainly be a benefit in the near future.

Nvidia has replaced the GTX 770 with the GTX 970. These were at the same price point when they were released and the GTX 970 will sit in roughly the same place within the GTX 900 series line up as the GTX 770 sat in the GTX 700 series line up. They doubled the VRAM on this card and an 8 GB model is expected to replace the old GTX 770 4 GB model. This extra memory comes at a cost and they would not have done this if it were not necessary.
 
"almost certainly be a benefit in the near future."

First up, nobody can predict the "near future". I also like your use of the word 'almost'; it shows you're not entirely convinced yourself either.

"What you are betting on though is that VRAM requirements will not increase at all in the next two years"

Secondly, I have not stated this. What I have pointed out is that, as of the past year, this year, and the upcoming games, nothing has proven that the higher-VRAM models of the same card are beneficial. There is a very, very big difference between needing the 2GB of VRAM and merely buffering into the extra 2GB of VRAM.

Just because it's storing more data does not mean it needs to in order to run optimally.

Thirdly, link something that makes sense to all of us; words mean one thing, but actually backing them up is another. I will be the first to admit when I'm wrong, but with no solid backup the argument presented is a highway to nowhere. People like you and many others are basing their position on worded information, without hard proof that it is even necessary as of yet.
 
Let me re-phrase then. With no inside knowledge of what is currently under development, I am as confident as I can be that VRAM requirements will increase in the next two years, and I personally wouldn't buy a card today with less than 4GB of VRAM while expecting it to support the highest texture settings for games released in the next two years.

I'm not supporting the argument of whatever article you are referring to that claims their current game is benefiting from having 4GB of VRAM. I haven't read it and I don't care. There is no benefit that I am aware of in having more than 3 GB in any currently available game at 1920x1080 resolution.
 


This is really the crux of your issue. You are focussed on what is required today and not attempting to account for what you will need in the lifetime of a product.

We don't build a power station, a school, an airport, a bridge or a road based on what we need today. We build it based on what we expect will be required before it is due to be upgraded or replaced. Naturally you can't have definitive numbers for any of this, but you can make an educated guess.

When you write "without hardcore proof that it is even necessary as of yet", you illustrate the problem. It is not required yet, but it is reasonable to expect that it will be useful in the time you expect to keep the card.
 
Before that lifetime is even up, the card will be severely outdated. It has taken 8 years for VRAM usage to go from 512MB to just under 2GB (1.8GB it was). Users are now expecting VRAM usage to double in just a matter of 2 years?

The only reasoning by which people have come to believe more VRAM is beneficial is user reports, along with all the game recommendations that lead people to believe cards like the "GTX 660 3GB, 7870 3GB and GTX 760 4GB" are actually needed.

Without a doubt these 4GB cards will come in handy later on (not in the next 2 years), especially when 4K gaming becomes mainstream, but as things stand now there isn't much logical, documented fact to support going for them. By the time the moment you mention comes, when these cards will supposedly shine, they will already be outdated and not sufficient, and that won't be due to their VRAM amount.

With your lovely infrastructure examples, you missed one crucial thing, and it is this -
The near future isn't expected to bring double the capacity in a matter of 2 years, but a faster increase than normal over a longer period of time (and that period is longer than 2 years).
 


Those are your examples, not mine. I am in no way using them to support my argument. Users will make many unfounded claims, and you will see plenty of this on these forums.



I am writing purely of 1920 x 1080 resolution.

My personal experience is that with HD texture mods and anti-aliasing enabled, Skyrim will exceed the 2 GB of VRAM on my card and suffer severe frame rate drops entering areas where new textures must be loaded. I can see the peak in VRAM corresponding to the drop and reducing the size of textures or anti-aliasing resolves the issue. In this resolved state I can see VRAM usage approaching but not reaching the 2 GB ceiling.
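
For anyone wanting to reproduce that kind of observation: GPU-Z and MSI Afterburner will graph VRAM usage over time, or on an Nvidia card you can poll it from the command line (assuming a driver recent enough to ship nvidia-smi):

    nvidia-smi --query-gpu=memory.used,memory.total --format=csv -l 1

Watching a readout like that alongside the frame rate makes the correlation between VRAM peaks and the drops easy to spot.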

I base the Watch Dogs figure on a guide published by Nvidia for suggested game settings on their own cards:
http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide#textures

Based on these I state that 3 GB of VRAM is required today to run the highest available settings at 1920 x 1080 resolution in some games.



These examples show planning for the useful life of the product or infrastructure. The time scales and rate of increase vary of course.
The high end cards being released 8 years ago had 768 MB of VRAM, so a required amount of 512 MB doesn't seem unreasonable.
Let's say that the 2 GB requirement for highest settings was reached two years ago with the release of Far Cry 3.
That is a multiple of x4 in 6 years, which means doubling every three years.
If this trend continues you would need 6 GB of VRAM in three years from now for the highest settings. This seems pretty reasonable to me.
It would be fair to say that in three years' time these cards won't be fast enough to run the highest settings anyway, but the trend also suggests the VRAM requirement for the highest settings will reach 4 GB by the end of next year (3 years from the release of Far Cry 3). This to me makes 4 GB of VRAM a minimum for these high-end cards, and Nvidia seems to have picked the same number.
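
Putting that trend into a few lines, as a sketch (the data points are the ones assumed above: roughly 512 MB needed 8 years ago and 2 GB reached with Far Cry 3, i.e. doubling every three years; the calendar years are my reading of the thread's timeline):

    # 2 GB requirement reached with Far Cry 3 (late 2012), per the estimate above.
    BASE_MB, BASE_YEAR, DOUBLING_YEARS = 2048, 2012, 3

    def vram_needed_mb(year):
        # Extrapolate the doubling-every-three-years trend.
        return BASE_MB * 2 ** ((year - BASE_YEAR) / DOUBLING_YEARS)

    for year in (2015, 2017):
        print(year, round(vram_needed_mb(year) / 1024, 1), "GB")
    # 2015 -> 4.0 GB, 2017 -> 6.3 GB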

I'm glad to have a discussion with you about this. There are a lot of ill-informed statements on the forums, and if you can bring reasoning to some of those threads it may help someone. I'll be even happier if I convince you that buying a card with 3 GB of VRAM today and expecting it to run the highest game settings for the next two years is unrealistic, because that could save someone else disappointment.
 


You may need 6GB of VRAM in 3 years if the current trend continues, but the architecture will be old and the GPUs underpowered to even attempt the highest settings on games of that generation. It's like saying a 480 and a 580 are still relevant today. Yeah, they're pretty capable cards, but they struggle at the highest settings.
 
VincentP, I am quite impressed with the argument you have presented, though if I were to rebut what you have said I would only be repeating myself, which is not what I want to do. Your argument is solid, and I would consider mine to be as well, but we cannot both be one-sided, because both of our sides have flaws. The information we both need, neither of us has: hard proof that these cards actually outperform in modern-day games. I would like to go further, but without this information it would simply be pointless, as there hasn't been a review to date that I have personally seen which can support my side of the argument. If you wish to argue this topic further, feel free to do so in PM, though I am keen to see what the difference is myself when a review comes out in the near future.

All the best, Unknownofprob.
 


If you are going to respond, please read the post first.

 
 
 
Well, I don't know all the stuff you guys do, but I'm running a ROG Maximus Extreme with 3 Asus 780 Tis in SLI, and gaming like Crysis 3 and anything else I play is pure awesome. i7-3770K, nothing overclocked. So all I can say is, until I see solid evidence of a card straight up beating the Asus 780 Ti, it is a no-brainer. When I had one 780 Ti and bought a 980, there was no difference to make me like the 980, so I sent it back and bought 2 more Asus 780 Tis. My wife was mad, wanting to know how much I paid to set up 3D on my PC, lol, when she watched me playing on my Samsung 27" monitor. Just my 2 cents.
 
I know it's an old thread, but I'd like to say: comparing a Gigabyte Windforce 780 Ti OC 3x vs a Gigabyte Windforce G1 Gaming SOC 970, the 780 Ti comes out on top 99% of the time, even at 4K, and even if you OC the 970 to its limit, the 780 Ti is faster. At 4K, games use more VRAM if it's available, I've noticed, yet with the 780 Ti they use less VRAM - go figure. Crysis 3 has the 780 Ti streaking ahead of the 970 at 4K, the same with Metro: Last Light. In Project CARS they are similar, with the 780 Ti being 3-5fps ahead. Both push COD: AW over 60fps at 4K, with the Ti hitting over 10fps more than the 970. I have both, and also both in SLI. I've not benched 1080p, as it is a dead resolution in my eyes. I've also not benched using AA, as that is pointless at 4K pixel density, but all other settings are at their maximum. Also, the 780 Ti will output 60fps via HDMI, and at load their wattage use is very similar. TDP is thermal output, not power use. Both remain reasonably cool in SLI, with the 970 winning by 8C.

The only reason to buy 970s over 780 Tis is possibly better DirectX 12 support later on down the road; otherwise I'd easily go with the 780 Ti over the 970. Believe me, the 500MB of VRAM at present makes no difference in performance, and for the DirectX 12 games of the future we've been led to believe that SLI VRAM will stack, so no real VRAM issues there either. Second-hand, a 780 Ti is up to £100 cheaper than a 970!