Nvidia GeForce GTX 980 Ti 6GB Review

Status
Not open for further replies.


Many modern GPUs are designed so that defective areas can easily be disabled, or in some cases working areas can be disabled to meet a quota (though disabling working silicon is probably not common).

So they make the Titan X chips with the full 3072 CUDA core count, but over time they accumulate a pile of chips with defects, since the manufacturing process is not perfect.

These are resold in a cheaper card like the GTX 980 Ti.

This is a perfectly sensible way to do it. The GPUs are tested, and even if you do get issues later, that's what warranties are for. Far better than throwing them in the garbage, which would force them to charge more money to offset the losses.
 
Except for the 980 Ti, the 980 (and arguably the 960), there is no point in buying other Nvidia cards today.

AMD has better priced cards with faster performance at every price point.

The 290X is faster than the 780 Ti and 970.
The 290 is faster than the 780.
The 280X is faster than the 770/960.
The 270X is faster than the 760.
The 260X is faster than the 750 Ti.
The 250 is faster than the 740.

The 295X2 is faster than the Titan X/980 Ti, whatever, and costs less than either of those cards. Hell, it occasionally costs less than a 980. And CrossFire drivers come eventually.

If I'm not paying half a grand, I can wait some time for a proper driver. I can deal.

Most of what was said is incorrect. The 780 Ti is better than the R9 290X. The 290X and 970 are pretty much identical in performance, but the 970 manages it with a TDP of just 170 W. The 280X and 770 perform about the same; the 960 performs slightly worse than both, but is more power efficient and costs less. The GTX 760 is faster than the 270X, and the 750 Ti outperforms the 260X. The 295X2 is faster than the 980 Ti, but it simply doesn't work right in many games, with high frame variance and in some cases no support at all. I can't see anybody taking a 295X2 over a 980 Ti if they are priced similarly, considering the 295X2 draws twice as much power and is a dual-GPU card.
 

Arabian Knight

Reputable
Feb 26, 2015
114
0
4,680


Why rude? They DO work 10 times harder. They work 14 hours daily, even on Saturdays and Sundays, just to earn enough to buy a cheap car and pay off home debts. And all that for just $100 more per month.

Wake up and smell the coffee...

No, it is not rude; it is fact.
 

Arabian Knight

Reputable
Feb 26, 2015
114
0
4,680


Who cares about your statistics? I am talking about the hours per day they have to work to earn a living: 12 to 14 hours DAILY instead of 6 or 8 as in rich countries!

And all that for just $100 more in salary...

They even work every weekend...


 

Vlad Rose

Reputable
Apr 7, 2014
732
0
5,160
How important is gaming to you if you spend $500 on a GPU and $500 on a monitor?

I guess I am the only one paying rent out there.

I don't see how you got thumbed down so much. Most people don't have a mommy and daddy able to pay for everything for them. And those with decent jobs usually have college loans they're required to pay (I'm an electrical engineer; half my paycheck is instantly taken for my loans). If I had an extra $1000 lying around, I can guarantee it'd go into my car so I can reliably get to work, not into something strictly for recreation.
 
The card is over $800 here in Canada!

That is what I was expecting for that card in the US. Even if it were $800 here in the US, it doesn't have any real competition at this price point yet. Can't wait to see how AMD's Fiji-based card performs. I guess those rumors of it being priced at $800 USD are dead, considering the price of the 980 Ti.
 

cknobman

Distinguished
May 2, 2006
1,167
318
19,660
Nvidia must have seen some undisclosed AMD benchmarks, gone into panic mode, and rushed the 980 Ti release to grab customers before the AMD launch.

While it's a great card, the problem is that Nvidia screwed some of their own customers.

I take this as a sign that whatever AMD is coming out with must be pretty good. :)
 


Doesn't change the fact that I am averaging 45 FPS at 2160p with 290Xs in CrossFire. Their benchmarks for The Witcher 3 are irrelevant.

Lowest: 34 FPS
Average: 45 FPS
Highest: 54 FPS

I don't think I am the one with the fanboy glasses. Tom's screwed it up, and since The Witcher 3 is now a new benchmark to hit, I am questioning the input on it at all. Right now AMD CrossFire is beating Nvidia SLI with the proper patch. The numbers provided in the bench clearly demonstrate that CrossFire is not activated.
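For anyone wondering how lowest/average/highest FPS figures like these relate to frame times (and why the frame-time variance people mention with multi-GPU setups matters), here is a minimal sketch using hypothetical frame-time numbers, not the actual benchmark log:

```python
# Hypothetical per-frame render times in milliseconds (illustrative only)
frame_times_ms = [22.2, 18.5, 29.4, 20.0, 23.8, 19.2, 26.3, 21.1]

# Instantaneous FPS for each frame: 1000 ms / frame time
fps = [1000.0 / t for t in frame_times_ms]

lowest = min(fps)
highest = max(fps)
# True average FPS is total frames / total seconds, NOT the mean of per-frame FPS
average = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# A wide spread between lowest and highest reflects high frame-time variance,
# which is felt as stutter even when the average looks fine
print(f"Lowest: {lowest:.0f} FPS, Average: {average:.0f} FPS, Highest: {highest:.0f} FPS")
# Lowest: 34 FPS, Average: 44 FPS, Highest: 54 FPS
```

Note the design choice: averaging per-frame FPS directly would overweight fast frames, which is why frame counts over elapsed time is the standard way these averages are computed.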
 


You can get two 290Xs for that price, which is a big dilemma.

The HBM-based 390X is what I am most curious to see in a long time. Right now the biggest obstacle at 2160p is the memory. It might solve everything or screw up everything, who knows, but that's a proprietary hardware design, and we are not seeing those often anymore.

 
I came around a hill, looking for hidden treasure, expecting to find some bandits or maybe some other humanoid creature. Oh no! Goosebumps! It's a fiend! Not just any fiend, either; one with bright red hair swaying in the breeze. Its corpse was quite the prize.

The battle gave me excitement, nvidia's hairworks gave me the goose bumps.

Intentionally cheesy lol but 100% true!
 

Eggz

Distinguished


I found HairWorks only slightly better looking than normal, and it's really slow! So I just turned it off, and I'm not the kind of person who turns things off very often. This particular tech just isn't ready.
 
You can get two 290Xs for that price, which is a big dilemma.

The HBM-based 390X is what I am most curious to see in a long time. Right now the biggest obstacle at 2160p is the memory. It might solve everything or screw up everything, who knows, but that's a proprietary hardware design, and we are not seeing those often anymore.

Agreed. The 390X's HBM has me very curious. I am interested to see if they shrank the die to 20 nm; I am guessing they did, because if it were 28 nm it would have been available a while back. Nvidia already put its best foot forward with the Titan X. I wonder what Nvidia would do if the 390X turned out significantly better than the Titan X. Wait for Pascal?
 


This is true, but not as true as you think. Just being in the USA does not guarantee a great salary. Hell, I just finally got a decent-paying job, and that is after looking for 10 years.

But on the flip side, a hobby is a hobby. Even in the US there are more important things to spend money on. I personally would never spend $1K on a GPU and monitor alone, because I have rent/mortgage, bills, and car payments.

The GTX 980 Ti did just what the 780 Ti did: it made the Titan-grade GPU pointless to buy.

Of course, we still have to wait for AMD's offering; hopefully we will know soon, as I am getting antsy to upgrade my HD 7970 GHz (this has me half tempted to go for an R9 295X2) to something that can push some of the newer games to 60 FPS at 1080p.
 

Aspiring techie

Reputable
Mar 24, 2015
823
9
5,365

Don't go believing that Americans are a bunch of spoiled bums who work a little for big pay. If that were the case, America wouldn't have the most powerful economy in the history of the world.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
Your Witcher 3 benches are off. You didn't install the right 15.5 patch; you used the leaked one.

My CrossFire setup with two 290Xs averages 45 FPS at 2160p with AA and HairWorks deactivated. Of course, if you activate "HairScam," then the AMD cards crumble...

The worst part is that it doesn't even look good.

The drivers came straight from AMD, and HairWorks was deactivated for the testing. AA, however, was used, which AMD's release notes clearly mention affects performance. AMD was notified prior to today's story, and we're waiting on comment as to the technical explanation for its performance.
 

xenol

Distinguished
Jun 18, 2008
216
0
18,680
I kind of don't understand the stigma against 4K at smaller screen sizes. It makes having a 4K monitor pointless, because ergonomically speaking, anything above 32 inches isn't very good for most desk usage; I'd rather not have to move my head around to see something.

Or maybe I should ask: what's with the attachment to 96 PPI monitors? We praise high-density displays and how they look on our phones, but apparently that isn't allowed on desktop computers.
 

wolverine96

Reputable
Mar 26, 2014
1,237
0
5,660
Thank you for the Blender rendering results! This is the only thing I read, LOL. :D
And thank you for setting the tile size to 256x256 this time. That makes a big difference!
 

MonsterCookie

Distinguished
Jan 30, 2009
56
0
18,630


Well, they will most probably thumb you down too, just as they thumbed me down when I complained that, as an IT engineer with a master's degree, I could never ever afford some of the NEC monitors on my EU salary.

I am in the same shoes as you are, and I simply do NOT understand what I am doing wrong in my life.
I studied like an idiot and got a degree and Linux certifications instead of partying hard like the cool kids. I speak 4 languages too. I am not even young anymore, dangit; already middle-aged, and still not printing money here...

Actually, I got so angry that I put a CV together, just as when I was hunting for my first job 8 years ago. I figured that if things are going so well in the US, then I am more than willing to move there for a job. Especially if they put me in the money shower, like they apparently do for these other guys.

To comment on the GTX 980 Ti as well: the performance has jumped quite a lot compared to the vanilla GTX 980.
Those who love AMD so much, please explain to me why AMD is unable to release a half-decent driver for Linux. If they could release a good driver, I would honestly consider AMD, but with the current support they can forget it.
I hope AMD has something good cooking in their dungeon ;)
 
not all people live in USA and have good salaries.

and those work 10 times harder than any one in USA ... but live in a Poor country.

you are selfish and rude people. very rude !



Here...let me fix this for you:

- not all (of the) people (that) live in USA have good salaries.

- and (we've never worked in America, but we believe) those work 10 times harder than any one in USA ( because this makes us feel better to hate people we don't know ) ... but live in a Poor country

- you are selfish and rude people (not that we know any of you, but this hate makes us feel better about ourselves ). very rude ! ( we've been told )

There you go. Me putting words in your ignorant mouth might be rude...but since you already don't like Americans...
 

MonsterCookie

Distinguished
Jan 30, 2009
56
0
18,630


I am not quite sure I got what you mean, but frankly, I have to say that having small (30-32 inch IS small) monitors with 4K resolution is utterly pointless. If you try to do any production-oriented work (laying out a printed circuit board, opening a complex network diagram, an engineering tool, or several Linux shells next to each other), you will not see ANYTHING at all. You would rather move your head/eyes to the area where you have to concentrate and forget the rest, just glancing over it to make sure things are right.
Or, if you watch a movie, you can sit back on your sofa and still make out the action.

Am I the only one who has to put in eye drops every 2 hours during the working day because my eyes are red from small screens?
 
I kind of don't understand the stigma against 4K at smaller screen sizes. It makes having a 4K monitor pointless, because ergonomically speaking, anything above 32 inches isn't very good for most desk usage; I'd rather not have to move my head around to see something.

Or maybe I should ask: what's with the attachment to 96 PPI monitors? We praise high-density displays and how they look on our phones, but apparently that isn't allowed on desktop computers.

I guess people are starting to warm up to it. Pretty much every manufacturer has a 4K 28-inch monitor. I think most consumers don't want 4K screens because of how badly Windows scales at that resolution.
 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
I won't forget you ;)

 


Gotta take off the rose-colored glasses...

perfrel_1920.gif


And while the 290X is faster "out of the box" than the 780, that's because the R9s are so aggressively clocked from the factory; once installed and overclocked, the 780 is clearly faster (8:40 mark).

https://www.youtube.com/watch?v=djvZaHHU4I8

The slogan when it was released was that the cooler was holding it back and that the 290X would show its stuff when water-cooled. However, with both cards water-cooled, the green lead gets wider (4:40 mark).

https://www.youtube.com/watch?v=RqaHh-y51us

And before we start talking about "But at 4K...", let's remember that 4K represents just 0.6% of the market, while 1920x1080 has 34.62%. 4K won't be relevant as a market force until at least Christmas 2016, when two cards in SLI/CF will handle all the latest AAA games at 60+ FPS.

If what you are saying were true, AMD's financial position would be a lot better than it is, and Nvidia wouldn't own 76% of the discrete card market.

No point in buying? And yet the 970, all by itself and in about half the time on the market, has captured 53% more market share than all 16 R7 and R9 cards combined. It would seem most people see a different point.

http://store.steampowered.com/hwsurvey/videocard/

NVIDIA GeForce GTX 970 = 2.81% of cards hitting steam servers

AMD Radeon R9 200 Series = 0.94%
AMD Radeon R7 200 Series = 0.89%
Total All AMD R7/ R9 Cards = 1.83%

Over the 3rd and 4th quarters of 2014, AMD's share of the discrete card market dropped 37%, from 37.9% to 24%.
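To be clear about that figure, the 37% is a relative decline, measured against the prior 37.9% share, not a drop in percentage points. A quick check of the arithmetic:

```python
# Market-share figures quoted above: Q3 2014 -> Q4 2014 discrete-card share
before, after = 37.9, 24.0

points_drop = before - after                       # drop in percentage points
relative_drop = (before - after) / before * 100.0  # drop relative to prior share

print(f"{points_drop:.1f} points, {relative_drop:.1f}% relative")
# 13.9 points, 36.7% relative (roughly the 37% cited)
```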

bNqJYgA.png


On the good-news side, one has to wonder if the 980 Ti releasing before the 390X tells us anything. I half expected Nvidia to hold back like last generation, wait for AMD to release their product, and then release the faster product to take the air out of AMD's sales. Now that Nvidia has put theirs out there first, one has to wonder if their thinking is that they won't be able to topple AMD, so they decided to get the buzz going on the Ti while they can. That would be good news, as it should help move AMD to a more competitive position. Nvidia spent an incredible 31.3% of sales on R&D in 2014. If AMD can fix FreeSync and "take the title," they should be able to recover some of that ground.
 

Vlad Rose

Reputable
Apr 7, 2014
732
0
5,160


I feel your pain. The USA isn't the answer though; at least not Michigan, where I live. Two degrees, EE and CS, and I'm stuck as a contract engineer for an automotive company, with my college loans taking out enough that my take-home pay is just above minimum wage. What's really sad is that my wife is Chinese, and I'm actually considering moving there just to find a better job, because there just isn't one here.
 