Nvidia GeForce GTX 980 Ti 6GB Review

Status
Not open for further replies.

corndog1836

Distinguished


https://www.google.com/search?q=980ti+vs+780ti&biw=1707&bih=835&site=webhp&source=lnms&tbm=isch&sa=X&ei=9KRzVaLaI4vXsAWh_oGYAg&ved=0CAkQ_AUoBA#imgrc=tLevGIl60bZCYM%253A%3BgKxXRDpROwzXtM%3Bhttp%253A%252F%252Fmedia.gamersnexus.net%252Fimages%252Fmedia%252F2015%252Fnvidia%252F980-ti-benchmark-gta-4k.png%3Bhttp%253A%252F%252Fwww.gamersnexus.net%252Fhwreviews%252F1964-nvidia-gtx-980-ti-benchmark-vs-780ti-980-titanx-290x%252FPage-2%3B954%3B471

980ti and 780ti are in this link
 

corndog1836

Distinguished


yes siirrrrrrrr....
 

corndog1836

Distinguished


BOOM!!! you da man.
 

boju

Titan
Ambassador


What you say is what I would want to hear, but I have a feeling a lot of devs will focus more of their time on the console side of things rather than worry about PC capabilities, apart from a limited number of devs who would really showcase PC hardware. There are a few out there, but not enough to really tell the difference between a highly detailed console exclusive and a PC exclusive.
 

Arabian Knight

Reputable
Feb 26, 2015
114
0
4,680


lol, sadly the oil is in the pockets of the kings... we work hard to earn, by the way... don't think the people get anything from that oil.

The only country that split the oil among the people was Libya, by the way, and you know what happened to that country...

Anyway, no need to go off topic... let's stay on the subject.
 

uglyduckling81

Distinguished
Feb 24, 2011
719
0
19,060


You should probably ask some Libyans how much of that money they actually received. Libya had a policy to share it, but there wasn't much follow-through on that particular policy as far as I can find.
 

SreckoM

Reputable
Jun 8, 2015
2
0
4,510
I'm writing this as seriously as I can, not being a fanboy: what is the purpose of the Titan X at this point? It lost the DP performance that made it a fantastic workstation-gaming hybrid. It also really sucks for people who bought a Titan X just a little over a month ago; that's ~$350 down the drain, pretty much. Yeah, the Titan X has all that extra VRAM, but for what? Three 4K displays, maybe, at which point a 980 Ti SLI would probably lose by about ~5% due to slightly fewer CUDA cores.

Again though, for most customers the 980 Ti is the obvious choice. I just feel like Nvidia totally screwed over most of their Titan X customers now. And why? Well, I really think the 980 Ti will be the cheaper answer to AMD's Fury, or whatever Fiji will be called. Really interested to see how it will do. If Fiji beats the Titan X/980 Ti, its rumored $800 price point would make the 980 Ti a somewhat compelling offer, depending on how well it does.

In the end, I'm loving this competition!

Well, as a 3D artist I must say that my work rarely fits in 6GB of memory. As a matter of fact I have 32GB on my machine, and when using CPU rendering most of my work uses between 16GB and 24GB of RAM while rendering.
So for me, for example, the Titan is basically the only choice. Speed is almost the same, but RAM is REALLY important here, since you cannot combine memory across cards the way you can combine cores.
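The point about memory not combining across cards can be made concrete. Here is a minimal Python sketch (the card figures are illustrative: 6GB/2816-core cards standing in for 980 Tis, a 12GB/3072-core card for a Titan X), assuming the common case where each GPU must hold a full copy of the scene:

```python
# Sketch: in typical multi-GPU CUDA rendering, every card holds a full
# copy of the scene, so usable memory is the per-card MINIMUM, while
# compute throughput scales with the total number of cores.
def multi_gpu_budget(cards):
    """cards: list of (vram_gb, cores) tuples, one per GPU."""
    usable_vram = min(v for v, _ in cards)   # scene must fit on every card
    total_cores = sum(c for _, c in cards)   # cores aggregate across cards
    return usable_vram, total_cores

# Hypothetical comparison: two 6GB cards vs one 12GB card
two_6gb = multi_gpu_budget([(6, 2816), (6, 2816)])
one_12gb = multi_gpu_budget([(12, 3072)])
print(two_6gb)   # (6, 5632)  -> more cores, but an 8GB scene won't fit
print(one_12gb)  # (12, 3072) -> fewer cores, but fits a bigger scene
```

This is why, for GPU rendering, a scene that exceeds a single card's memory can't simply be rescued by adding a second identical card.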
 

Vlad Rose

Reputable
Apr 7, 2014
732
0
5,160


Wouldn't a Quadro suit you better for rendering, then, due to the better precision performance? I know there is a large price difference, which is probably the reason for the Titan. At a professional large-scale business, though, it'd be Quadros only (for Nvidia at least).
 

SreckoM

Reputable
Jun 8, 2015
2
0
4,510
Well, that is the only con, but it does not affect the rendering process at all, AFAIK. The Quadro equivalent to the Titan in terms of core count is the Quadro 6000, which costs around 6,000 euros :D
 

Vlad Rose

Reputable
Apr 7, 2014
732
0
5,160


Only once the compatibility/optimization issues are worked out on the hardware side first. As it currently stands, those two games are only optimized for the 900 series of Nvidia cards, nerfing all AMD and earlier-generation Nvidia hardware.
 


It depends on the software and the type of rendering. Even if it supports CUDA it may still need RAM to store data in as the GPU processes it.



I honestly don't think it was nerfing, at least on the AMD side. I have played plenty of games without any issues that people think were nerfed by Nvidia. I honestly think AMD is just slacking in the game department. In the past few years there have only been a handful of major AMD Gaming Evolved titles.

I can't really blame them. I mean, it isn't easy to fund Gaming Evolved when the company is losing executives and money like no other.
 

Vlad Rose

Reputable
Apr 7, 2014
732
0
5,160
I just meant nerfing as in the performance isn't there yet on other hardware because the designers hadn't tested with it.

The entire situation seems like a lose/lose for everyone with AMD anyway, as they were not on the ball about having the problems fixed before release; yet it's hard to blame them for not wanting to support Nvidia-licensed tech.

The biggest surprise is that Nvidia's previous-generation high-end cards (780, etc.) lost to a midrange card (the 960) from the current generation, on only those particular titles.
 


It is surprising, yet it could also come down to utilization of newer GPU features. It used to be that way: when AMD/Nvidia introduced a new line, newer games that used the new GPUs' features would leave the previous generation behind, even against the new mid-to-high end (a 960 is mid-high, not mid). Now all but the two top-end GPUs are just rebrands most of the time.
 

mapesdhs

Distinguished


Yes.




Heavens no! :D I just meant that I replaced the two 1.5GB 580s with two 3GB 580s, then later switched to a single 980.

Ian.


 

mapesdhs

Distinguished


Funny, my Quadro 6000 only cost 500 UKP, but there ya go. :D

Btw, never judge based on the number of cores; that simply does not work at all. GPUs vary in their design, the best example being the 580. A Titan is slower than two 580s for CUDA, yet two 580s these days cost diddly squat. A K5000 is slightly slower than a 580 for CUDA, and a Quadro 6000 slower still; a Titan is way beyond them both (number of cores is not a measure of performance when comparing across different designs).
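The core-count point can be illustrated with rough peak-throughput arithmetic (a sketch using approximate public clock figures, not measured data). Notably, even this peak math ranks a single Titan above two 580s, while the measured CUDA results cited above rank them the other way, which is exactly why neither core counts nor paper FLOPS settle cross-architecture comparisons:

```python
# Rough single-precision peak: cores x clock x 2 FLOPs/cycle (one FMA per cycle).
# Clocks are approximate reference values; real CUDA application performance
# also depends on architecture, memory system, and the workload itself.
def peak_gflops(cores, clock_ghz):
    return cores * clock_ghz * 2  # fused multiply-add counts as 2 FLOPs

# GTX 580 (Fermi): 512 cores, shaders hot-clocked at ~1.544 GHz
# GTX Titan (Kepler): 2688 cores at ~0.837 GHz base
print(peak_gflops(512, 1.544))   # ~1581 GFLOPS per 580
print(peak_gflops(2688, 0.837))  # ~4500 GFLOPS for a Titan
```

On paper a Titan beats two 580s (~4500 vs ~3160 GFLOPS), yet per-application CUDA throughput can invert that ordering, so benchmarks of the actual workload are the only reliable guide.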

Ian.

 

uglyduckling81

Distinguished
Feb 24, 2011
719
0
19,060


Ah, I was unaware a 3GB 580 existed.
 

mapesdhs

Distinguished

Heh, no prob. Feel free to bask in the past gloriousness of four of them. :D All MSI LXs, which are enormous cards; stock is 832MHz, I run them at 900, though they'll do much more (faster than two Titan Blacks for CUDA; my system is at position 18. NB: entry 15 is listed wrong, it should read 3x 780).

 

randomizer

Champion
Moderator


Assuming you meant double-precision FP performance: not all renderers use double-precision floats, so in those instances the decision between a GeForce and a Quadro would come down to VRAM.
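The single- vs double-precision distinction is easy to demonstrate. A small Python sketch using only the standard library to round-trip a value through IEEE-754 single precision:

```python
import struct

def to_float32(x):
    """Round-trip a Python float (IEEE-754 double) through single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

x = 0.1
# float32 carries ~7 significant decimal digits, float64 ~15-16. The error
# below (~1.5e-9) is harmless for colours and geometry in many renderers,
# which is why they get by on single precision.
print(abs(to_float32(x) - x))
```

For workloads that accumulate millions of tiny contributions (scientific simulation, some global-illumination solvers), that per-value error compounds, and that's where a card's DP rate starts to matter.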
 

mapesdhs

Distinguished


True, though Quadro cards do support a broader range of colour depths, features such as ECC (on some of them, anyway), etc., and there are reliability differences. However, when buying a Quadro I'd just hunt the used market; there's some good value there, with companies dumping the older high-end models, which are still very potent. I bagged a K2000D recently for only 78.

Another option is to combine the two: a Quadro card for the primary display (to get the better OGL support, colour range, features, etc.), plus one or more gamer cards for CUDA (3x 580 is cost-optimal; 3x Titan Black or 780 Ti have the best combined speed + app support; 3x Titan X is the most powerful if one has the moolah; 3x 980 Ti is a cost compromise, assuming MW CUDA V2 is supported). If the cards are evenly matched for raw CUDA performance, then all four can be part of the CUDA pool (e.g. K5000 + 3x 580). The only caveat is to use identical driver releases to ensure maximum compatibility.

The other issue is finding software that uses CUDA, or in the case of video editing software, finding an app that's reasonably reliable at all, whether or not it can use CUDA. I've been looking into editors today; just about all of them get panned in review comments to some extent for being buggy. Looked at AVC Ultimate, Avidemux, Freemake Video Converter, Free Video Editor, MPEG Streamclip, etc. (not impressed with Pinnacle Studio, Sony Vegas, etc. after reading so many user moans about stability problems; such buggy software is not acceptable when it costs rather a lot). Anyway, I digress... a minor rant after 2 hours of app hunting... :}

Ian.

 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290


Not quite... It's well known that most driver optimizations done by both AMD and Nvidia are based on binary builds, not on source code. It's also well established that, regardless of the game or workload, GCN simply doesn't have the tessellation performance of Kepler, much less of Maxwell-based cards. This honestly has much less to do with driver optimizations and far more to do with substantially lower tessellation performance on AMD cards, and that's exactly why Nvidia jacked the tessellation levels so high by default. The claim that Nvidia sabotaged performance by denying AMD the ability to properly optimize their drivers is just AMD PR/marketing BS; its only purpose was to get uninformed gamers riled up against Nvidia for the wrong reasons.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290

Oh gosh, there are still people saying this? Have you missed every article covering the Maxwell architecture for the past 9 months? There is no memory bottleneck, and certainly nothing so substantial that it would warrant the tradeoffs of a significantly wider bus like you're suggesting.
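For context on the bus-width tradeoff being dismissed here, GDDR5 bandwidth arithmetic is straightforward: bytes per transfer (bus width / 8) times the effective data rate. A quick sketch with the 980 Ti's published figures and a hypothetical wider bus:

```python
# Peak GDDR5 bandwidth in GB/s = (bus width in bits / 8) x effective data rate.
def mem_bandwidth_gb_s(bus_bits, data_rate_gtps):
    return bus_bits / 8 * data_rate_gtps

print(mem_bandwidth_gb_s(384, 7.0))  # 980 Ti: 384-bit @ 7 GT/s -> 336 GB/s
print(mem_bandwidth_gb_s(512, 7.0))  # hypothetical 512-bit bus -> 448 GB/s
# Maxwell instead raises EFFECTIVE bandwidth via delta color compression,
# avoiding the die-area, power, and board-routing cost of a wider bus.
```

That tradeoff is the reason reviewers found no memory bottleneck on Maxwell: the compression gains recover much of what a wider, costlier bus would have provided.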
 