Gigabyte GeForce GTX 980 Ti Xtreme Gaming Windforce Review

Status
Not open for further replies.

ael00

Honorable
Feb 12, 2013
230
0
10,710
Biased, yes.

But it is a fact that AMD cards run a lot hotter and are thus generally louder, and that the whole 900 series was a lot more power efficient.
 

akula2

Distinguished
Jan 2, 2009
408
0
18,790
@Games are tech!

Really? It should be corrected as:

Games are NOT the only tech.

In my case, I use only three gaming GPUs, compared to the 100+ I use for whatever I'm doing in the STEMM domain.

My primary concern is the disappointing TDP of the 980 Ti compared to the 980. No wonder I chose 980s over 980 Tis to upgrade two dozen aging workstations; I look at the TDP saving from each GPU without much trade-off in performance. That being said, the 980 Ti has its own pros too, so I bought a few of them. However, I'm not at all happy with the Asus ROG 980 Ti Matrix card from a pricing and aesthetics point of view.
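For what it's worth, the fleet-level math behind a choice like that is easy to sketch. The reference TDPs below are from Nvidia's published specs (GTX 980 = 165 W, GTX 980 Ti = 250 W), but the hours-under-load figure is my own assumption, not anything the poster measured:

```python
# Fleet-level TDP comparison. Reference TDPs from Nvidia's published specs
# (GTX 980 = 165 W, GTX 980 Ti = 250 W); hours under load are an assumption.
TDP_980 = 165      # watts
TDP_980_TI = 250   # watts
CARDS = 24         # "two dozen aging workstations"

watts_saved = (TDP_980_TI - TDP_980) * CARDS     # peak-load saving across the fleet
kwh_per_year = watts_saved / 1000 * 8 * 365      # assuming 8 h/day at full load
print(watts_saved, "W saved;", round(kwh_per_year), "kWh/year")
# → 2040 W saved; 5957 kWh/year
```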

PS: the same disappointing feeling (TDP-wise) about the high-end Skylake CPUs compared to Devil's Canyon.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


No, Tom's is the odd one out; everyone else hit the lottery... LOL. Or, to be more accurate, Tom's rolled craps while everyone else is in the norm, with some lucky samples hitting really high speeds (1400MHz+ core, 8.2GHz memory, etc.). Tom's sample was terrible across the board, which is not representative as far as I'm concerned. They are the only ones hitting results this low, which merely proves that not every chip is normal or awesome; some are duds. You should expect better than Tom's result, judging by every other site that has reviewed these cards from multiple vendors. I challenge you to find another site hitting three lows at once (core, boost, and god-awful RAM).

At this stage in the game, yields should have improved, which makes this case even stranger IMHO: processes improve, they don't get worse 6-12 months later. They binned chips for months before the initial launch, but I doubt they have to bin at all now that the process has matured, here at basically the end of the 980 Ti's lifespan. I'm not really talking about cards bought off the shelf (which go on selling for months beyond the last chip made, maybe longer), but about how much longer they'll keep making them, since both sides will shortly start production for the next generation of cards and stop all old-series production, stockpiling for new launches. Nobody wants to write off millions in chips like AMD recently had to do because of a massive overstock of old silicon.
 

kcarbotte

Contributing Writer
Editor
Mar 24, 2015
1,995
2
11,785


I will have to amend the overclocking section; I failed to list the observed peak clock speed. Though the listed boost clock is much lower, according to the GPU-Z logs the card peaked at 1502MHz.

While the memory speed we achieved left much to be desired, I have no qualms with the GPU overclock. I stand by my award, and I stand by my recommendation.
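For anyone who wants to verify a peak-clock claim like this from their own logs, here's a rough Python sketch that pulls the highest core clock out of a GPU-Z sensor log. GPU-Z writes comma-separated logs, but the exact column header varies by version, so treat "GPU Core Clock [MHz]" as an assumption:

```python
import csv
import io

def peak_core_clock(log_text, column="GPU Core Clock [MHz]"):
    """Return the highest core clock recorded in a GPU-Z sensor log.

    The column header ("GPU Core Clock [MHz]") is an assumption and
    may differ between GPU-Z versions.
    """
    reader = csv.DictReader(io.StringIO(log_text))
    clocks = []
    for row in reader:
        # GPU-Z headers and values often carry stray padding spaces.
        row = {k.strip(): v.strip() for k, v in row.items() if k}
        try:
            clocks.append(float(row[column]))
        except (KeyError, ValueError):
            continue
    return max(clocks) if clocks else None

# Tiny hand-made sample in the same comma-separated shape.
sample = (
    "Date, GPU Core Clock [MHz], Memory Clock [MHz]\n"
    "2016-01-26 10:00:00, 1202.0, 1753.0\n"
    "2016-01-26 10:00:01, 1502.0, 1753.0\n"
)
print(peak_core_clock(sample))  # → 1502.0
```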
 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
It was somewhat surprising to see just how much power the card uses at idle, though. Drawing 26.2W puts it in the same range as PowerColor’s overclocked R9 390X, a card well-known for posting big consumption figures.

It is very simple: you are using the wrong driver. All newer drivers have an idle issue on X99 boards. I figured this out together with Nvidia a week ago, and the write-up (and Nvidia's statement) can be read here:
http://www.tomshardware.de/nvidia-treiber-idle-leistungsaufnahme-bug,testberichte-242017.html

The correct idle power consumption is 13 watts, not 26 watts. :)
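Back-of-the-envelope, the cost of that driver bug for a machine left on around the clock looks like this. The 26.2 W and 13 W figures come from the posts above; the electricity rate is an assumed placeholder:

```python
# Yearly cost of the idle-power driver bug for a machine left on 24/7.
# 26.2 W (buggy) and 13 W (fixed) come from the thread; $0.12/kWh is assumed.
delta_w = 26.2 - 13.0                      # extra idle draw, watts
kwh_per_year = delta_w / 1000 * 24 * 365   # ~115.6 kWh of wasted energy
cost = kwh_per_year * 0.12                 # assumed electricity rate, USD/kWh
print(round(kwh_per_year, 1), "kWh,", round(cost, 2), "USD per year")
# → 115.6 kWh, 13.88 USD per year
```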

I have a card with the same PCB; it can run stable above 1500 MHz if you increase the voltage in MSI Afterburner 4.2 by +86 mV and reduce the PLL voltage (called Aux) by -100 mV. Use the LN2 BIOS (button at the end of the card) to get better OC results. :)
 

kcarbotte

Contributing Writer
Editor
Mar 24, 2015
1,995
2
11,785


Good to know.
I will have to retest the card when I get a chance.
 
What's with the "Wait for Pascal" train?!

Seriously, it's a nine-month wait, especially with The Division coming around the corner. If someone is on Kepler they can wait, but if we're talking about upgrading from Fermi, they need the upgrade ASAP! A Fermi GTX 590 isn't going to do well in newer games, and you guys are suggesting they wait? That card is four years old. Or perhaps they are building new? Why make them hang around waiting to play new games?

What I mean is: why should one wait? There is so much good stuff out there, like the non-X Fury. The next generation is a long way off, and you're telling me that Pascal is more important than being able to play newer games for nine months?
 

kcarbotte

Contributing Writer
Editor
Mar 24, 2015
1,995
2
11,785
What's with the "Wait for Pascal" train?!

Seriously, it's a nine-month wait, especially with The Division coming around the corner. If someone is on Kepler they can wait, but if we're talking about upgrading from Fermi, they need the upgrade ASAP! A Fermi GTX 590 isn't going to do well in newer games, and you guys are suggesting they wait? That card is four years old. Or perhaps they are building new? Why make them hang around waiting to play new games?

What I mean is: why should one wait? There is so much good stuff out there, like the non-X Fury. The next generation is a long way off, and you're telling me that Pascal is more important than being able to play newer games for nine months?


Exactly why this review, and card in general, is relevant.
 

AgentOrange82

Reputable
Jan 26, 2016
3
0
4,510
Just adding my two cents on the extremely (or should I say "xtremely") low overclock produced in this review. As has been stated in the comments, I too am boosting comfortably over 1500MHz on the core, and the memory overclocks massively too (~8GHz effective).

Coming from two R9 290s to this has been a godsend. I've tried to support AMD for as long as possible, but it was getting too darn frustrating. I get similar, or in some cases (GameWorks) better, performance at 4K than with the two 290s, plus this card is far cooler and quieter. It's a shame I've had to move away from AMD for now, but hopefully they can kick some goals in the next generation.

It was totally worth upgrading now. Sure, Pascal will be here this year, but as one comment said, in what form? Will it eclipse this performance right off the bat? If so, awesome! I'll consider selling this beast and getting a new Pascal card. But until then, I am absolutely loving the 4K performance, and games have never looked this good.
 

AgentOrange82

Reputable
Jan 26, 2016
3
0
4,510
Oh, for god's sake. I was looking at the forum on my phone and it got stuck and I was just pressing buttons. I'm not that neurotic. And as someone else said, it shouldn't even be possible.
 


You get the redemption award for this month, heh.
 

AgentOrange82

Reputable
Jan 26, 2016
3
0
4,510


Thank you, haha. For the record: I was trying to edit the post on my phone, and the buttons are not that clear on mobile. So I was pressing buttons and, of course, pressed the upvote. I then tried to downvote it because, as has been shown here, it's not cool at all to upvote yourself. My phone then seemed to stop responding and I couldn't press anything. It turns out I don't think I can edit posts from my phone.

Now that I've tried to remove my upvote here on PC, it won't let me vote on my own posts, yet it somehow let me on mobile.

Anyway, bloody hell! All I wanted to say was that I think it's a great graphics card!
 

iancuio

Reputable
Jul 30, 2015
17
0
4,510
Lol at the R9 295X2 winning most of the benchmarks handily.

A fair comparison would be to compare the 295X2 to two Gigabyte GeForce GTX 980 Ti Xtreme Gaming cards in SLI at 4K.

No, it wouldn't be, because two 980 Tis would cost three times as much as an R9 295X2. Only a blind man or the son of Bill Gates would dismiss that argument. Case closed.
 
You're a fanboy. If you weren't, you would've said it the other way around; you would've said that two GTX 980 Tis wreck the R9 295X2.
Welp, I guess that makes me an Nvidia fanboy... xD

Scratch all that. The argument isn't over.
 

iancuio

Reputable
Jul 30, 2015
17
0
4,510


Why am I a fanboy? You have no idea what you are saying. Why would I even weigh two GTX 980 Tis against one R9 295X2? Let's take it from Newegg.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127889&cm_re=980ti-_-14-127-889-_-Product - $659. One single 980 Ti is $150 more expensive than the R9 295X2.

So why would I consider $1,318 vs. $500? Just because of performance? Yeah, <removed by mod> that. Nvidia SLI isn't even better than AMD CrossFire.
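The price/performance argument being made here can be put in concrete terms. In this sketch the prices come from the post above, but the FPS figures are made-up placeholders, not benchmark results:

```python
# Price/performance sketch. Prices are from the post; the average-FPS
# figures are hypothetical placeholders, NOT real benchmark numbers.
cards = {
    "R9 295X2":      {"price": 500,  "fps": 60.0},
    "2x GTX 980 Ti": {"price": 1318, "fps": 75.0},
}
for name, c in cards.items():
    ratio = c["fps"] / c["price"] * 100    # frames per second per $100 spent
    print(f"{name}: {ratio:.2f} fps per $100")
```

Whether the raw-performance lead justifies the price gap is exactly what the thread is arguing about; the ratio just makes the trade-off explicit.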



 


No, he knows exactly what he's saying. As do the rest of us. You are comparing a dual-GPU card to a single-GPU card. That's NOT the same comparison no matter the price difference. GOT IT???
 

iancuio

Reputable
Jul 30, 2015
17
0
4,510


No. Such arguments hurt me :(

The thing is, you guys only have that attitude because Nvidia's toughest card is beaten by AMD at a cheaper price, by a card that is three years old.

Again, as I said: for me, and for 99 percent of the gamers out there, price/performance governs, not pure performance. Not brand, not who runs hotter, not who consumes less (because that is also insignificant), and not "OMG, how can you compare a dual-GPU card to a single-GPU card, I mean, yeah, it's three times cheaper, but Nvidia master race, lololol?!"

Usually, the people you call "AMD fanboys", those who side with AMD, are informed and (usually) bring solid arguments for what they say.

Nvidia fanboys are people who only side with the brand, ignoring every fact and every fault just because it's Nvidia. Read some threads ^^ Usually it goes like this:
"Yeah, the R9 390 performs better than the 970, it is cheaper, and it is more future-proof because of memory bandwidth and DirectX 12."
Nvidia fanboy: "HAHAHA OMG GO F**K YOURSELF AMD FANBOY THE 970 IS OBVIOUSLY BETTER IT RUNS COOLER AND IT CONSUMES LESS." And they dismiss all arguments.

Enjoy your 900 series while you still can, because they have already started degrading performance in games to widen the gap between the 900 series and the new cards coming now (the same as they did with Kepler).

They already started with Fallout 4, btw:

http://a.disquscdn.com/uploads/mediaembed/images/3136/6292/original.jpg
 


God, you ooze AMD fanboy. First of all, since you want to attack and get personal: my SLI 970s are still extremely powerful for my 1440p rig (overclocked to 980 performance).

Second, I actually have a job, make a decent living, and can afford to upgrade every two years; I will be selling my 970s and upgrading to Pascal. This summer it will be two years since I got them, so they will have served me well by the time the 10xx cards come out. Your point about my cards soon being outdated is moot (not that they won't last several more years in the latest games).

Third, I've owned both AMD and Nvidia. Nvidia rules the high end, but AMD rules the mid-tier; I am on record here recommending an R9 390 over a 970 at the same price point. But then there are things like AMD cards not running well in Project CARS and Microsoft FSX.

Fourth and finally, I'm sorry... what has your contribution been around here, sport? Come back when you grow up, kid. Until then, you can talk to the hand, little petulant potty mouth.

∩∩∩∩╮(︶︿︶)
 