Nvidia GeForce GTX 980 Ti 6GB Review

Status
Not open for further replies.

mapesdhs

Distinguished


I guess you're not familiar with the chip he has; all 3930Ks will do at least 4.6, assuming he has a C2.

HW is heat-crippled by Intel, just like IB. That's why I keep buying SB/SB-E: no crippling, 5GHz on a 2700K is easy, 4.7+ on a 3930K is easy. IB/HW have better IPC, but without delidding they're harder to oc, even with water. Not quite so bad with HW, but terrible with IB. I've dealt with more than half a dozen SB-E setups so far, and I'd be embarrassed if I sent one out at only 4.4 or lower. Atm I'm sorting out a further two 3930K systems and two 3970Xs, the former at 4.7/4.8, the latter both at 4.8 (all could go higher, but I err just a tad on the lighter side of vcore, etc.)

Looking back at his specs though, could be his mbd just can't hack it. Certainly not a board I'd use with a SB-E. I just bought a couple of ASUS refurb R4Es/P9X79-WS, and a barely used P9X79 Deluxe for a snip.

Ian.

PS. There's a UK eBay seller who has new 3930Ks listed atm for 219 UKP each (item 161718612276). Pretty good price IMO, though the last 3970X I bought cost less.

 

Madmaxneo

Reputable
Feb 25, 2014
335
2
4,810


Are you on air or liquid cooling? You will get a better OC on liquid. If you're using the auto OC settings, then that is what is holding you back; the auto settings tend to push the vcore too high. Try lowering your vcore some....
 

mapesdhs

Distinguished


Yup, very true (I use H80/H100/H110, but with better fans for low noise), though for a 5GHz 2700K one doesn't need water cooling; a simple TRUE and one good low-noise NDS PWM fan does the job just fine, though for most builds I use an H80 anyway to ensure more robust transport. Re vcore, there are some good OCN threads on all this, and Miahallen's oc guides are well worth reading. I've posted my settings on the OCN i7 4GHz+ owners club, a typical template for a working 5.0/2700K combo.

For unmodded IB though, even water isn't enough to get a high oc. Delidding solves this, but not everyone would want to try that (some good tutorials on youtube though).

Ian.


 

kyotokid

Distinguished
Jan 26, 2010
246
0
18,680
I'm writing this as seriously as I can, not being a fanboy: what is the purpose of the Titan X at this point? It lost the DP performance that made its predecessor a fantastic workstation/gaming hybrid. Also, it really sucks for people who bought a Titan X just a little over a month ago; that's ~$350 down the drain, pretty much. Yeah, the Titan X has all that extra VRAM, but for what? Three 4K displays, maybe, at which point 980 Ti SLI would probably only lose by ~5% due to slightly fewer CUDA cores.

Again though, for most customers the 980 Ti is the obvious choice. I just feel like Nvidia totally screwed over most of their Titan X customers now. And why? Well, I really think the 980 Ti will be the cheaper answer to AMD's Fury, or whatever Fiji will be called. Really interested to see how it will do. If Fiji beats the Titan X/980 Ti, its rumored $800 price point would make the 980 Ti a somewhat compelling offer, depending on how well it does.

In the end, I'm loving this competition!
 

kyotokid

Distinguished
Jan 26, 2010
246
0
18,680
...the Titan X still has a purpose in life: GPU rendering. For GPU rendering in Iray (or any other PBR rendering engine) to work, the entire scene, as well as all textures and effects, must fit entirely in video memory. If it doesn't, the process will crash or, in the case of Iray, default to the CPU/physical memory, which is much slower. For the level of scene complexity I create, I can see that in many cases 6 GB will not be enough.

So either I spend $650 for a 40-45% chance the scene will not fit and the process dumps to the CPU, or $1,000 for maybe a 2% chance this could happen. In this scenario, I see the extra $350 for the Titan X as a wise investment, particularly since it matches up pretty well in performance and specification to the Quadro M6000, which retails for nearly five times the price.
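As a back-of-envelope sketch of that trade-off: using the poster's own probabilities and a made-up $200 penalty for each render that dumps to the CPU (a hypothetical figure, not from the post), the $350 premium pays for itself after only a handful of renders.

```python
# Back-of-envelope check of the 6 GB vs 12 GB gamble described above. The
# fallback probabilities (40-45% vs ~2%) are the poster's own estimates;
# the $200 "cost" of a render dumping to the CPU is a hypothetical placeholder.

CARD_DIFF = 1000.0 - 650.0   # Titan X premium over the 980 Ti
P_6GB, P_12GB = 0.45, 0.02   # chance a scene spills past VRAM on each card
FALLBACK_COST = 200.0        # assumed cost in lost time per CPU fallback

# Expected penalty avoided per render by the bigger card:
saved_per_render = (P_6GB - P_12GB) * FALLBACK_COST

breakeven = CARD_DIFF / saved_per_render
print(f"The premium pays for itself after ~{breakeven:.1f} renders")
```

With these assumed numbers the break-even lands around four renders; the real answer obviously depends on how costly a CPU fallback actually is for a given workflow.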
 

uglyduckling81

Distinguished
Feb 24, 2011
719
0
19,060

No one is going to argue against the Titan X having a professional purpose. When you can write it off against your income it makes sense.
It has never been good value from a gaming point of view though. A 295X2 beats it at 4K by a significant margin in every title, for half the price.
Nvidia's marketing is clever, though, as others have commented: release the extremely overpriced card (for gaming) first, then release a slightly lower-performing card afterwards for what is still an incredibly high price, but which seems good value by comparison to the initial release.
 

uglyduckling81

Distinguished
Feb 24, 2011
719
0
19,060


The $500 you save buying a higher-performing 295X2 will pay for a lifetime of electricity to run it.
 

Vlad Rose

Reputable
Apr 7, 2014
732
0
5,160


I am using liquid cooling. I have already tried adjusting both the vcore and the ring bus in both directions (as well as about a dozen other settings from guides I've read). The CPU does not have a temperature issue at 4.3GHz or higher in the BIOS, but it instantly freezes as soon as the OS tries to boot. I even took the chip to a professional hobby shop to see if they could get it higher, since I had been trying for over a month; same result, even with a different motherboard, RAM, PSU, and cooling setup on a test bench.

Intel did state that their k series does not guarantee a good overclock. In this case they are completely right.

PCPartPicker part list

CPU: Intel Core i5-4670K 3.4GHz Quad-Core Processor
CPU Cooler: Cooler Master Seidon 120M 86.2 CFM Liquid CPU Cooler
Motherboard: Gigabyte GA-Z87N-WIFI Mini ITX LGA1150 Motherboard
Memory: AMD Entertainment Edition 16GB (2 x 8GB) DDR3-1333 Memory
Power Supply: Silverstone Strider Gold 450W 80+ Gold Certified Fully-Modular SFX Power Supply
Case Fan: Corsair Air Series SP120 Quiet Edition (2-Pack) 37.9 CFM 120mm Fans

*** excerpt from https://pcpartpicker.com/b/DFzypg ***
 

Almighty Peanut

Reputable
Jun 4, 2015
1
0
4,510
I own 2 Titan X cards and feel like what Nvidia did was really messed up. I don't have to pay for my cards, so I'm not complaining about getting ripped off, but still. There was a huge gap between the 980 and the Titan X, so most assumed the 980 Ti would fall between the 980 and the X, maybe ~12% slower than the X, which would have been fine. Nvidia probably got scared by the R9 390X: if the 390X had come in for less than the 980 Ti but with the performance of the Titan X, they would have shot themselves in the foot.

They did the same crap with the original Titan, with the 780 coming in at half the price and being only ~6% slower, and the Ti being FASTER for ~$250 less. I don't think Nvidia will sell very many Titan cards if they release another one with their next series of cards. People are going to KNOW that Nvidia will release an x80 Ti with nearly the same performance for 2/3 the price.

With Nvidia taking away the FP64, it's not even much of a workstation card. It's generally marketed as an ultra-high-end gaming card. Like I said, I don't pay for my hardware, but I'm still a bit outraged about what they did. I went from dual Titans to dual 980s to dual Titan Xs. I give my "old" cards to friends when I upgrade and/or put them in my guest machines/wife's computer.
 

smilemoar88

Reputable
Aug 22, 2014
12
0
4,510


Hey thanks for the reply. Yeah I haven't bothered with 3dMark runs yet but I have been watching the GPU stats in MSI.

I am finding that I am pushing close to the 4GB VRAM limit - typically 3.5GB-3.9GB for the settings I enjoy - which is why I was wondering whether it would be worth switching to a 980 Ti. I'd ideally look at one to begin with and then, if DX12 lives up to some of the rumours, upgrade to 980 Tis in SLI, especially after reading MaximumPC's 980 Ti SLI review, which showed significant benefits from two 980 Tis (http://www.maximumpc.com/nvidia-gtx-980-ti-2-way-sli-crushing-performance/). How good, I guess, is subject to change, since other hardware would affect the outcome, but still.
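For anyone wanting to log that VRAM usage outside MSI Afterburner, one option is to poll nvidia-smi (which ships with the NVIDIA driver). A minimal sketch; the canned line at the bottom stands in for a live query so the parsing can be shown on its own:

```python
# Poll nvidia-smi for VRAM usage and parse its CSV output.
import subprocess

QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_usage(line):
    """Turn a line like '3891, 4096' into (used_mib, total_mib, fraction)."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total, used / total

def poll():
    """Query the first GPU; requires an NVIDIA driver to be installed."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return parse_usage(out.stdout.splitlines()[0])

# Example with a canned line (a 780 Ti sitting near its 4 GB ceiling):
print(parse_usage("3891, 4096"))  # → (3891, 4096, 0.949951171875)
```

Run poll() in a loop while playing and you get a simple timeline of how close the card sits to the limit at your preferred settings.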

The real hold back for me at present is that local stores here in Australia want to sell me the card for (no joke) $1000+ at which point I might as well get a Titan X for that money. I suspect the price will drop soon enough.
 

mapesdhs

Distinguished

3DMark can be useful, one just has to be careful about being fooled by the overall scores which are in many cases heavily skewed by main CPU power. Sadly, the vast majority of those who submit results do not bother to include info about their setup, and one can't rely on the auto-detected data to discern CPU clocks, etc. Thus, when comparing results, try to focus on the Graphics/Combined results, while understanding how they differ, and look for submissions which do have full details. In that regard they can be really useful for comparisons.

The further back you go, though, the more CPU-skewed a test will be on newer hardware, so these days at the 980 level Firestrike is the relevant minimum, more especially Extreme & Ultra. 3DMark11 can be fun for numbers, but the default test is skewable. Further back than that and it just gets silly, down through Vantage and especially 3DMark06, which gains virtually nothing at all from anything better than a high-end card from three generations back (CPU speed dominates the test completely, mainly absolute clock as opposed to multiple cores).




In that case you may well be exactly the sort of user for whom such an upgrade is worthwhile, because though the on-paper performance of one 980 Ti is less than 780 Ti SLI, the RAM difference should mean much less stuttering and other issues by preventing games from reaching the RAM limit, given you are indeed reaching the 4GB limit.

These days it's good to see site reviews emphasising that for enjoyable & immersive gameplay, a lower sustained frame rate with higher minimums and less stuttering is definitely better than a higher average with lower minimums and more stuttering. Nothing worse than getting smooth motion inside a corridor or cave, only to see choppiness ensue as soon as one emerges into the open air. I call it the Helicopter Effect, after seeing part of a spectacularly bad B-movie on the SyFy channel during which a helicopter shown crashing by turning end over end was animated with barely half a dozen separate frames for the entire sequence; it looked... indescribably terrible. Lovecraft would be needed to supply a suitable mental image! :|

Have you done any tests to establish the typical min/avg/max spread of the games you're playing, and the degree to which any stuttering occurs? My guess is you'd likely see the avg rate go down a chunk with one 980 Ti, but the minimums would go up and any stuttering should vanish almost completely.
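One cheap way to put numbers on that min/avg/max spread is to log per-frame times (FRAPS and MSI Afterburner can both dump them) and summarise. A sketch, not any particular tool's method; the sample frame times below are invented, and "stutter" is defined here simply as a frame taking more than twice the average time:

```python
# Summarise a frame-time capture into min/avg/max fps plus a stutter count.

def summarise(frame_ms):
    """Return (min_fps, avg_fps, max_fps, stutter_count) from times in ms."""
    fps = [1000.0 / t for t in frame_ms]
    avg_ms = sum(frame_ms) / len(frame_ms)
    # Count a "stutter" as any frame taking more than 2x the average time.
    stutters = sum(1 for t in frame_ms if t > 2 * avg_ms)
    return min(fps), sum(fps) / len(fps), max(fps), stutters

# Invented capture: mostly ~60 fps with two long frames mid-run.
sample = [16.7, 16.9, 17.1, 16.8, 45.0, 16.6, 17.0, 52.3, 16.7, 16.9]
mn, avg, mx, st = summarise(sample)
print(f"min {mn:.0f} fps, avg {avg:.0f} fps, max {mx:.0f} fps, {st} stutters")
```

Comparing these four numbers before and after a card swap shows exactly the trade the post describes: a lower average can still be the better result if the minimum rises and the stutter count drops.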




Sounds like a sensible approach.




Thanks for the ref! That was interesting. One positive, though, of course: the speedups from SLI for each game will get better with future driver updates.




In the UK the 980 Ti is at least 10% more than that, typically between 550 and 600 UKP (the equivalent of $1100 to $1200 OZ using market rates; real rates would be about 4% higher). The Titan X here, at 840+ UKP, equates to $1680 OZ @ market rates.
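For what it's worth, the conversion above implies a GBP→AUD market rate of roughly 2.0. A throwaway sketch reproducing those figures (the rate and the ~4% "real rate" uplift are the poster's assumptions, not live data):

```python
# Reproduce the UK -> AU price conversions quoted above.
MARKET_RATE = 2.0   # implied GBP -> AUD snapshot rate, per the post
REAL_UPLIFT = 1.04  # "real rates would be about 4% higher"

for gbp in (550, 600, 840):
    print(f"{gbp} UKP ≈ ${gbp * MARKET_RATE:.0f} AUD (market), "
          f"${gbp * MARKET_RATE * REAL_UPLIFT:.0f} AUD (real)")
```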

Ian.

 


http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/

Using more and performing differently with less are two different things. Recognizing that today's games are more demanding to an extent... there was essentially no significant difference at all not so long ago, even at 3 x 1080p.



 


VCCIN is too low... boost that up to 1.9 volts and then start playing with vcore. After reaching stability, you can lower VCCIN again until you hit a failure point.
 


Yeah, not too long ago 4GB made no difference. Of course it does depend on the game, and we also have to consider that developers are starting to use things such as tessellation, which uses more VRAM, and higher-resolution textures, which even if properly coded still take more VRAM.

GTA V is very demanding. I can run it mostly at 1080p with everything set to Very High and FXAA on an HD 7970 GHz, but anything beyond those settings gives me some lag in some areas.
 

mapesdhs

Distinguished

a) Testing RAM sizes with just one card isn't the issue here. What matters is whether minimums & stuttering, potentially caused by a mix of both SLI issues and draining RAM capacity, mean that switching to a single newer card with a lower avg would result in smoother, more immersive play overall. This is a very likely scenario and it has been covered by Tom's articles in recent times. Don't take my word for it; site reviews have been focusing on it for a while now.

b) Making a judgement based on older tech and older games is hardly valid; what matters is the games/tasks the OP is using, not what's in a review, and this is where the use of things like modded setups makes a huge difference. I often find review tests are useless to me because I like to play with max detail, but a lot of reviews have an annoying habit of deliberately dialing down settings in order to reach 'playable' avgs so the card spread doesn't look so painful at higher resolutions. That just ruins the appeal of higher-res displays in the first place, which is why that MaximumPC article is so interesting. The guy already said he's seeing almost 4GB used, so linking to that article (which didn't work, btw) seems a bit odd. I don't know what games he plays, but I'm happy to take his word for it rather than a 3rd-party article about an older single card and older tests.

Note that I haven't stated he would see what I've described, but from what he says it's very likely, though I wouldn't suggest anyone make a purchasing decision until they've done more tests & research to confirm. A 980 is costly after all, though as I said before he could cover most of the cost if he sold off the 780 Tis (and if he sold them to AE/CUDA users he could make a profit).

Ian.

 
a) That's just it, it hasn't.

b) The judgement was made a long time ago... well before that article was written, which was the exact reason they did that article. It all turned out to be much ado about nothing back when the big clamor was for getting 4GB instead of 2GB 770s; the clamor was baseless, as the article showed not only at 1080p but at 3 x 1080p. At 5760 x 1080 you could **use** more than 2 GB, but when using a 2GB card there were no slowdowns, no artifacts, no lost frames, no loss of detail, no lowered settings required.

We saw "deja vu all over again" with the 970 and the 3.5 GB thing... again, "much ado about nothing":
http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

A lot of links posted to THG lately don't seem to work... I dunno if it's a technical problem or perhaps someone doesn't like links from other sites being posted. If you have this problem, try quoting the message and copy-pasting the URL into your browser. I just tried that one, though, and though it worked when posted, it's not working now; the whole site is gone. Try this and look at the cached version. Again, THG will likely trim the link, so you will have to quote the post and then copy-paste:

http://98.139.236.92/search/srpcache?p=gtx+770+4GB+vs+2gb+showdown&ei=UTF-8&hsimp=yhs-001&hspart=mozilla&fr=yhs-mozilla-001&u=http://cc.bingj.com/cache.aspx?q=gtx+770+4GB+vs+2gb+showdown&d=4527700127712613&mkt=en-US&setlang=en-US&w=71Dtu1w-alYK9fIFbHw6nYgnoM7XmeF7&icp=1&.intl=us&sig=FPwFBwTlDFF20WMx2OZjkA--

On page 3, they couldn't install Max Payne with a 2 GB card, and the game **said** you needed 2750 MB. So they installed the 4 GB card and it installed and played fine. Then they swapped in the 2 GB card again, and what happened? ... The game played just fine... same speed, same detail, same EVERYTHING. Which, again, illustrates the point: being able to use more than X GB is very, very, very different from suffering a negative effect from having less.

"The guy already said he's seeing almost 4GB used, so linking to that article (which didn't work btw) seems a bit odd"

Not odd at all. As the article so plainly stated... being able to use 4GB is a very, very different thing than experiencing any negative impact from using less.

This clamor for more and more RAM has been going on for quite a while and didn't begin with the 9xx series. And while newer and more demanding games will certainly require more RAM, the subject was overblown then as it is now.
 

mapesdhs

Distinguished
In the end it's up to the OP to decide based on his own setup, the games he's playing, etc. I've seen plenty of posts from people who say having a larger-RAM card did help (lots of them on the Skyrim pics thread), but of course it's not going to benefit everyone. Saying it won't benefit him, though, is wrong; you don't know that, so quoting links to imply that's the case makes no sense. It's trying to justify a position which hasn't yet been established. SLI setups by definition are not as smooth as a single card, and never can be. You mentioned the article talks about Max Payne, but unless the OP is playing MP, who cares? What matters is what he's using the system for. Now you might indeed be right, but atm neither you nor I can tell; we don't have his system. Only he can do relevant tests & decide.

The 980 3.5 thing is another matter, indeed just a vast flood of FUD, as you say much ado about nothing (so +1 for saying so! :D), but that was more about NV haters and an unfortunate inter-dept. comms/PR screwup than any real issue. Oodles of people deliberately choose to believe it was all intentional (an illogical opinion IMO, but there ya go), purely because they want that to be the case (1st Rule broken; Terry Goodkind rules!).

But it's wrong to say more RAM doesn't help, period, because it damn well can. E.g., I could not get Crysis remotely playable the way I wanted it set up with 1.5GB 580 SLI; switching to 3GB 580 SLI made a huge difference (I like to crank everything up). Later, pushing it even more as I learned about the advanced customisation options, I hit a performance wall rather than a VRAM limit, so finally I moved to a 980 - that was more than twice the speed of 580 SLI and gave a bit more VRAM headroom for pushing out some of the customised shadow features even further. Meanwhile, for Elite Dangerous, I was able to max everything out on the 980 except for one AA-related feature which did indeed go beyond 4GB usage; the slowdown was not a performance hit, it was a VRAM hit, and it was very significant (i.e., speedwise, two 980s would not have helped at all). Backing the setting off just one notch pulled the RAM usage down to functional levels and voila, totally smooth, 60Hz minimum vsynced (I'm using all in-game settings maxed except for one, while in NCP it's using 8x AA, 8x Transparency (SS), 16x AF, MFAA, High Quality & Clamp). These are my examples, that's what I found, and from what the OP says he may well benefit from the switch, but I clearly said he should investigate further before deciding. I don't know what degree of stuttering, etc. he currently gets with his config, but if it's annoying him then chances are very good that using one 980 Ti instead will fix it, even if the overall avg goes down, and that absolutely can make the overall gaming experience feel better.

What is true is that some game engines handle going over the available max VRAM a lot better than others; it depends how the engine has been written. Techniques for data paging to avoid glitching have been around for more than 20 years, but I expect some coders use them and others don't (FSX, for example, is very badly coded; compare that to a Performer-based vis-sim which can handle massive terrain databases with only 64MB TRAM because it's very smart about how it accesses data). So the fact that one particular game (e.g. MP) doesn't barf when the VRAM is used up doesn't really mean anything, except that for that particular game the issue is not a problem; it's wrong to extrapolate MP's behaviour to a generalisation about VRAM limits for any game & card. It's only overblown if, for a specific game, the issues aren't relevant, but for lots of games & scenarios they absolutely are; I've seen it personally.
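As a toy illustration of that paging idea (all names and sizes are invented, and no real engine is this simple), an engine can keep an LRU cache of textures and evict the least-recently-used one when a VRAM budget would be exceeded, instead of failing outright:

```python
# Toy LRU texture cache: evict the least-recently-used texture when the
# VRAM budget would be exceeded, rather than refusing to load.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget, self.used = budget_mb, 0
        self.cache = OrderedDict()   # name -> size_mb, in LRU order

    def request(self, name, size_mb):
        if name in self.cache:                # hit: mark most-recently-used
            self.cache.move_to_end(name)
            return "hit"
        while self.used + size_mb > self.budget and self.cache:
            _, evicted = self.cache.popitem(last=False)   # evict LRU entry
            self.used -= evicted
        self.cache[name] = size_mb
        self.used += size_mb
        return "miss"

vram = TextureCache(budget_mb=100)
print(vram.request("terrain", 60))  # miss
print(vram.request("rocks", 30))    # miss
print(vram.request("sky", 40))      # miss (evicts terrain to fit)
print(vram.request("rocks", 30))    # hit
```

A game that pages like this degrades gracefully when it overshoots the limit (some texture re-uploads, maybe a brief hitch); one that doesn't just stalls or crashes, which is exactly the difference between engines described above.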

Whether or not the OP's setup is giving him sufficient grief to warrant a change based on his tolerance levels is up to him.

Ian.

PS. TH didn't trim the link btw; instead, trying to access the page just gives, "Error establishing a database connection".

 


It is very overblown TBH, but it also has been underrated as well. I remember when GTA IV came out: people with Core 2 Duo E8400s overclocked to 4GHz, 1GB of system RAM, and an 8800 GTX 768MB had issues running it on medium settings, but my overclocked 3GHz Q6600, 4GB of system RAM, and HD 2900 Pro 1GB could run the game maxed out, minus AA, happily. Back during those days things were sort of overlooked; we sat between 256MB and 1GB of VRAM for quite a while.

Now we have a very different world again. Consoles now have better hardware, and game developers can utilize more features that need more VRAM. Plus it is damn cheap: an R9 290X 8GB is $350, and most R9 290X 4GB cards are $340 unless you find a rebate.

I guess time will tell but I would say that 4-8GB of VRAM will be the sweet spot for most games for the next 5 years.
 

Madmaxneo

Reputable
Feb 25, 2014
335
2
4,810


After reading that explanation I think I may have a clue that might help you get this figured out. But first it depends on what OS you're running; if it is Windows 8.1 or 10, then I know someone who has had the same problem. Instead of distracting this thread further, please IM me (or start a new thread and IM me the thread link)...
 

uglyduckling81

Distinguished
Feb 24, 2011
719
0
19,060

You mean the 970 3.5GB thing, not the 980, right?

Am I missing something with your SLI comment? Have you figured out a way of enabling 3GB of VRAM through SLI that others have not? Created your own DX12?
 

corndog1836

Distinguished


https://www.google.com/search?q=980ti+vs+780ti&biw=1707&bih=835&site=webhp&source=lnms&tbm=isch&sa=X&ei=9KRzVaLaI4vXsAWh_oGYAg&ved=0CAkQ_AUoBA#imgrc=tLevGIl60bZCYM%253A%3BgKxXRDpROwzXtM%3Bhttp%253A%252F%252Fmedia.gamersnexus.net%252Fimages%252Fmedia%252F2015%252Fnvidia%252F980-ti-benchmark-gta-4k.png%3Bhttp%253A%252F%252Fwww.gamersnexus.net%252Fhwreviews%252F1964-nvidia-gtx-980-ti-benchmark-vs-780ti-980-titanx-290x%252FPage-2%3B954%3B471

980ti and 780ti are in this link
 