Nvidia GeForce GTX 970 And 980 Review: Maximum Maxwell


Ra_V_en



They seem to have optimized GPU usage in gaming workloads... but what about that torture run?
When I see actual results of over 240 W from the GTX 970, and even more from the 980, in the torture run, the reference card looks very optimistically designed with its 2x6-pin connectors.
Nvidia learned the lesson this time with energy-efficient models... once in a while even a goliath gets a good idea, though the way they did it is not especially sophisticated. Time for AMD to adapt quickly and slap back.
Let's wait and see the opposition's response...
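
To put numbers on that, here's a back-of-the-envelope sketch using the official PCI Express connector ratings (spec limits only; real boards can and do pull more than spec):

```python
# PCIe power budget for a 2x6-pin card (official spec ratings):
PCIE_SLOT_W = 75  # max draw through the x16 slot itself
SIX_PIN_W = 75    # rating per 6-pin PEG connector

budget = PCIE_SLOT_W + 2 * SIX_PIN_W
print(f"2x6-pin board power budget: {budget} W")  # 225 W
# A ~240 W torture-run draw already exceeds that on-spec budget,
# which is the sense in which the reference design looks "optimistic".
```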
 
Obviously Nvidia has done an outstanding job with these cards; kudos to them.

For me, at 1920x1080 there's no need to upgrade to one of these cards. I'm keeping my R9 290.
 

MasterMace

Distinguished
Oct 12, 2010
1,151
0
19,460


That's not what the benchmarks are showing. The 980 and 970 perform very well at 1080, but slow down heavily in the 2160 benchmark compared to the 780 Ti and company.
 

I'm seeing a consistent 7% performance difference between the GTX 980 and GTX 780 Ti at both 1080 and 2160, so there's no relative difference; they both incur the same hit. The 290X gives a strong showing, going from -19% to -10% when you bump up the resolution.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/26.html
[Charts: relative performance summaries at 1920x1080 and 3840x2160]
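
To make the comparison concrete, here's a quick sketch with hypothetical FPS numbers (not taken from the charts; only the ratios matter): if two cards lose the same fraction of performance going from 1080p to 4K, the relative gap between them is unchanged.

```python
# Hypothetical FPS values for illustration only.
gtx_980   = {"1080p": 100.0, "4k": 50.0}
gtx_780ti = {"1080p": 93.0,  "4k": 46.5}  # same fraction lost at both resolutions

for res in ("1080p", "4k"):
    gap = (gtx_780ti[res] - gtx_980[res]) / gtx_980[res] * 100
    print(res, f"{gap:+.1f}%")  # prints -7.0% at both resolutions
```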

 


The GTX 980 is further ahead of the GTX 780 Ti at 3840x2160 than at 1920x1080.
I don't know why you'd look at a performance difference and attribute it to one piece of the pipeline anyway.

Here is a quote from the review:
The GeForce GTX 980’s raster processing appears particularly well-endowed, with 64 full-color ROPs per clock and 72 Gpixels/second of fill rate. Compare this to the GeForce GTX 780 Ti's 48 ROPs per clock and 44 Gpixels/s. That should give GM204 an edge when it comes to high resolutions (like 4K) and anti-aliasing.

On top of this, the maximum resolution of the GTX 780 Ti is 3840x2160 @ 60 Hz while it is 5120x3200 @ 60 Hz for the GTX 980.
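
Those fill-rate figures check out with simple peak math, by the way (a rough sketch using the published clocks; this is a theoretical peak, not a measurement):

```python
# Peak pixel fill rate ~= ROP count x core clock.
def fill_rate_gpix(rops, clock_mhz):
    return rops * clock_mhz / 1000.0  # Gpixels/s

print(f"GTX 980:    {fill_rate_gpix(64, 1126):.1f} Gpix/s")  # ~72.1 at the 1126 MHz base clock
print(f"GTX 780 Ti: {fill_rate_gpix(48, 928):.1f} Gpix/s")   # ~44.5 at the 928 MHz boost clock
```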
 


There are benchmarks at 4K. The graph images are a slideshow you click through: the first series covers 1080p gaming, since that's the most common, and the second set covers 4K, which is where we're heading. They don't usually bother with the resolutions in between, like 1440p and 1600p, unless they have a lot of time; those matter, just not as much as 1080p and 4K.
 

j0ndafr3ak



Epic fail I guess lol. Thank you for pointing it out. That's why I don't like Web browsing on my smartphone lol
 


The recommended prices come directly from Nvidia, which sets the price the GPU should sell for. If it costs more than that, it's either a severe lack of supply as cards move out the door, or, more likely, retailers raising prices to cash in.
 


Lol, no problem; smartphones can be a pain that way. Long story short, though, it doesn't change much: the cards don't move up or down much relative to each other as resolution changes, though the 780 Ti does better at higher resolutions. If you're considering a 4K display, make sure it has DisplayPort 1.2 (or better) or HDMI 2.0; otherwise you'll be capped at 30 Hz at 4K and the display will bottleneck you.
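
For the curious, here's roughly why the 30 Hz cap exists (a sketch; the ~10% blanking overhead is my assumption, and real display timings vary):

```python
# Required pixel clock vs. HDMI 1.4's ~340 MHz TMDS limit.
def pixel_clock_mhz(w, h, hz, blanking=1.10):  # assume ~10% blanking overhead
    return w * h * hz * blanking / 1e6

print(f"4K @ 60 Hz: {pixel_clock_mhz(3840, 2160, 60):.0f} MHz")  # ~547 -- beyond HDMI 1.4
print(f"4K @ 30 Hz: {pixel_clock_mhz(3840, 2160, 30):.0f} MHz")  # ~274 -- fits under 340 MHz
# DisplayPort 1.2 and HDMI 2.0 both have the bandwidth headroom for 4K at 60 Hz.
```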
 

ThomV

The photo in the noise test suggests that you're using a side-address large-diaphragm condenser (LDC) microphone but then pointing the top of it toward the graphics card...
 
 
You might want to clean that up; with all those quotes it's a mess to know who said what.

Whoever was talking about resolution bragging wars: that seems like a pretty small and rarely discussed topic. Intel certainly isn't trying to position itself as a big gaming GPU maker; their goal is to show they can provide video and application support with excellent performance at high resolutions. And they succeed: they can do 4K with apps and videos just fine.

However, they openly acknowledge that if you then turn on a game, it won't run well at all at that resolution; they only target the entry level of gaming. Nvidia and AMD are both far more capable of gaming at 4K, and everyone openly acknowledges and accepts this, so I'm not sure why you'd look at Intel as competing here.
 

gigga101

What sucks the most is waiting for your GPU to come in the mail... it's brutal, but I can't wait to test it out when it arrives. With these kinds of benchmarks, it's probably the best GPU purchase right now if you don't feel like shelling out more than $400.
 

Wayne Anderson

For VR, what is really interesting to me is when you correlate the lower latency of DirectX 12-based rendering with the pre-render potential of the VR pipeline, layered with the new anti-aliasing technology. As long as DSR doesn't give back the latency gains that the improved pipeline offers, it should be a pretty compelling combination for getting closer to the actual motion perception of our eyes. There is a practical limit here: the human eyeball. The more we close the gap between the interface and the eyeball, the better the VR experience becomes (minimizing dizziness, nausea, etc.).
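
As a rough sketch of the frame-time arithmetic involved (the ~20 ms motion-to-photon comfort target is a commonly cited rule of thumb, not a figure from the article):

```python
# Frame time alone consumes a big chunk of any motion-to-photon budget.
def frame_time_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 75, 90):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")
# At 75 Hz a single frame already eats ~13 of a ~20 ms budget, so every
# millisecond a lower-overhead API saves (or extra work like DSR adds) counts.
```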
 

tuinboontje

How do you figure that? With the 780 and the first Titan, AMD responded with the 290(X), which offered Titan-like performance. Nvidia had to respond with the Ti models.
 

mapesdhs

Not really; the 290/290X were too loud, too hot, and used too much power, DX9/CF was and still is broken, plus the stutter issues. Custom coolers helped later on, but I'd still prefer the 780/780 Ti any day (or rather, now, the 970/980).

Titan is more specialised, better for pseudo-pro work.

Ian.

 
Ian, I think not a lot of people think that way, myself included (heat and power consumption become less of a problem if you're upgrading from a good older system with plenty of wattage at hand). When I was shopping for a graphics card I went straight for the 290, as it delivered great performance for as little as $360. That might seem exaggerated now that the 970 performs slightly better for less, but at the time the 290 was a ground-breaker.

Again, after the release of aftermarket coolers for the 290 series, they were really great cards. The driver issues still arose, but from a price/performance standpoint they were always a strong competitor, and plenty of people preferred them for their price and their superiority in Eyefinity (surround gaming) at a lower cost than their counterparts.

I understand that personal preference plays a role when choosing a GPU, but you can't really say the 290 series didn't deal a blow to Nvidia's cards for a while, because they did; that's why Nvidia came out with the 780 Ti to dominate the high end of the GPU market, and AMD never really responded.
 

mapesdhs

Sorry, but for me the noise/heat/power/DX9-CF issues ruined the 290s, and testing CF today with 7970s I still find far more irritating driver issues with AMD cards that put me off. The only advantage AMD had back then was price, but the choice of what to buy for a lot of people goes beyond mere cost. I'd rather pay more for a much quieter card, and oh how quickly people seem to have forgotten the awful throttling/fan issues AMD had when the 290s first launched. Add in the stuttering problem and no thanks. Is NVIDIA often more expensive than they need to be or should be? Yes, I think so, but the higher cost is worth the benefits IMO. AMD's 290s had an initial speed and price advantage, but in all other respects I cannot agree they were ground-breaking (definitely not).

I have no interest in multi-screen gaming. Most gamers only have one monitor.

All this really shows, though, is that people have varying priorities. If one did not care about heat, power, noise, stuttering, or driver issues, then sure, the 290s were a boon when they launched, but I want something better than that.

Of course the price advantage was quickly wiped out by the coin mining craze.
By the time that all settled down, the 780 Ti made much more sense IMO.

Ian.

 
The stock 290s were bad; they should have launched with a cooler that had a hope or prayer of keeping boost clocks. Aftermarket 290s are fine. The only real issue with them, and even then it's a stretch, is the DX9/CF issue. I suspect AMD will never give us a driver that enables it; all they can really do is keep fixing things going forward. Most new games will be DX10+, and a single 290 doesn't have a problem with a single 1080p monitor in most DX9 games. It's something that matters to about as few gamers as multi-monitor gaming does. (Don't label me a fanboy; my last Nvidia card was the GTX 460.)

I just hope AMD can launch something in response and keep the competition going. They did great with Eyefinity and with keeping the price/performance curve moving. I just hope their answer doesn't take forever.
 

mapesdhs

4745454b writes:
> The stock 290s were bad. They should have launched with a cooler that had a
> hope/prayer of keeping boost clocks. ...

Indeed, I couldn't understand why they used such a bad cooler; it ruined what would otherwise have been, as you say, a real game changer for overall competition.


> the DX9/CF issue. ...

I play several older games which are affected by this, so it does still matter to me.


> I suspect AMD will never give us a driver that enables it. ...

Well, they say they will, and I'm happy to take them at their word. Just seems to be taking a long time for it to happen, though.


> ... a single 290 doesn't have a problem with a single 1080 monitor for most DX9 games. ...

Very true. However, I like playing older games with the details cranked to the max. Even with two 580s I had to dial back my custom settings for Crysis because it was too slow. It sits at about 45 fps atm (1920x1200).


> It's something that matters to as many gamers as your multi monitor gamer. ...

Hard to know; they may be a significant but not particularly vocal group.


> ... (Don't label me a fanboy, my last Nvidia card was the GTX460.)

I wouldn't dream of such a thing, I have more than a dozen 460s. :D


> I just hope AMD can launch something back and keep it going. ...

Me too, we really need the competition. Today it feels like the whole pricing structure of GPUs has moved upwards too much, given how the 460 was regarded when it launched as compared to the 8800GT, 3870 and others it typically replaced. My first two 460s were 174 UKP each (top-end EVGA FTW 850 MHz models), and that was regarded as expensive at the time (more typical models were about 150); faster in SLI than a 580, which was the intent. Until the price drops of recent months, 347 UKP wasn't enough for even one mid-range card (I see the 780 Ti has finally dropped by quite a large amount; about time).


> They did great with Eyefinity and keeping the price/performance curve going.
> I just hope their answer doesn't take forever.

Yup, and I hope their response avoids the noise/heat/throttling issues which happened with the 290s. One could infer they'd probably do something in time for this year's holiday season.

Ian.

 