Nvidia GeForce GTX 970 And 980 Review: Maximum Maxwell

Can you guys stop cutting out the 4K ultra benchmarks? OK, OK, unplayable, yadda yadda. But give us the frikkin data! Sub-30 FPS is far from unplayable; 20 FPS is perfectly playable. Not ideal, of course, but playable. 10 FPS is still playable, not great, but it's still playable. Unplayable is sub-10 FPS (and even then I've played games at 8 FPS average over the years, and it never made me stop playing).

Really show us what the cards can do rather than just slapping a sub-30 FPS "unplayable" label on it and not even showing us the data you collected.

Ultra is the only graphics setting I ever use in a game. And I'm tired of benchmarking sites not showing what new cards can do at high resolutions with ultra presets.

The only two graphics settings I care about in benchmarks are 1920x1080 with all options set to ultra and 3840x2160 with all options set to ultra. Normal settings are a waste of time, for me. I want to know whether it's a viable option (for me) to go to 4K, which is kinda hard when no one shows you the benchmarks...

I'm sure I'm not alone in wanting 4K ultra benchmarks for all the cards.
 
I don't even care about the 980 or any Nvidia cards; they're too expensive. I can't understand why the 780 hasn't had a price drop. They're crazy if they think I'm going to be a sucker for this.
 


These cards are JUST BARELY coming out. The prices on the 7xx cards will come down, but expecting them to do so immediately is a tad unrealistic. Newegg has already pulled several from its store, including the MSI 780 Lightning I had previously considered buying.

 
I'm amused at some of the AMD bashing. Firstly, as regards price, NVIDIA were forced to lower prices when AMD came out with the R9 290 and 290X at significantly lower price points (let's ignore the crypto-related price increases, as those were down to US retailers and not AMD gouging its customers); at this point they don't need a large cut in prices to stay competitive, and besides which they do tend to have better incentives.

Secondly, everybody wrote AMD off following the HD 2000 series, and look what the HD 4000 series did for them. It'd be foolish to think they aren't developing something to compete. As the power consumption results for Maxwell show, AMD's cards just don't react as quickly to load changes, and fixing that - combined with a few optimisations here and there - could close the efficiency gap again.

That said, NVIDIA have done very well here and deserve the praise.
 
Sub-30 FPS is garbage and unplayable in today's games. The games are choppy and slow, and if you are playing online at 30 FPS you are probably the scrub I'm tea-bagging 40 times a round in BF4. I can't understand how anyone says 30 FPS or lower is playable. Maybe I'm just spoiled by always having a 55+ FPS minimum, but either way 30 FPS is trash.
 


30 FPS is playable. Not pleasant, but playable.

If it has stutters or frame drops, though, that pretty much spoils it.
 
I have the 280X. Paid $300 and got Battlefield 4 free, so $250 was a good price, but it wants a 750W PSU and two 8-pins when the specs said 550W and one 6-pin plus one 8-pin. Now I can really enjoy my 650W PSU with this EVGA. Wow, $350... three weeks ago AMD was bragging about their cards and NOT lowering prices. Eat crow.
 
The whole point of making games with awesome graphics and having these high-end cards is to give the best gaming experience possible, and that cannot be done at 30 FPS and below. The game just isn't enjoyable. Sure, 5 FPS is "playable"... is it something someone wants to do or would do? No, of course not. Sub-50 FPS these days isn't even accepted by most gamers, which is why most tests don't show charts with 30 FPS and below: because it isn't acceptable anymore. The original Duke Nukem or Doom or Wolfenstein was OK at sub-30 FPS, though, so why don't we go ahead and show those charts at 4K too. Lol, Duke Nukem at 4K would be kinda cool.
 
Arguably 5 FPS isn't playable when it takes 200-400 ms to even acknowledge the input.

Playability is very subjective. Keep in mind many people cannot afford top-of-the-line hardware.

To me the threshold is around 26-30 FPS.
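
For what it's worth, the lag figure above is just frame-time arithmetic: at 5 FPS a single frame takes 200 ms, so one to two frames between sampling your input and showing the result lands right at 200-400 ms. A quick sketch (assuming input is sampled once per frame and displayed the following frame; that pipeline is an assumption, not a measurement):

```python
# Frame time at a given frame rate, plus an assumed one-to-two-frame
# input-to-display delay (a common simplification, not a measurement).
for fps in (60, 30, 20, 10, 5):
    frame_ms = 1000 / fps
    print(f"{fps:>2} FPS: {frame_ms:6.1f} ms/frame, "
          f"input lag ~{frame_ms:.0f}-{2 * frame_ms:.0f} ms")
```

At 60 FPS that worst case is ~33 ms; at 5 FPS it's ~400 ms, which is why low frame rates feel unresponsive well before they look like a slideshow.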
 
I made the move from two 780s in SLI to a single 980 running on the PG278Q 1440p G-Sync monitor. Once I went to the G-Sync monitor, I no longer needed to chase super-high frame rates to get good performance. With the GTX 980, I don't need two cards to stay at the G-Sync performance sweet spot (usually the frame rates are well above it) with any title. That's pretty substantial, and it means a lot for the mini-ITX and small-build crowd too. There is just a lot of performance in a single 980. It's really impressive.
 
There is a lot of talk in the comments about a GTX 980 Ti, "big Maxwell" or "real Maxwell", but I'm not sure we'll be seeing it any time soon.
The GTX 770 (GK104) chip has 3.54 billion transistors and a die size of 294mm2.
The GTX 970 and 980 chip (GM204) has 5.2 billion transistors and a die size of 398mm2.
Obviously there is a massive performance improvement, but because they are still on a 28nm node the die has had to grow to make room for the extra transistors required.

The chip used in the GTX 780 and GTX 780 Ti (GK110) has 7.1 billion transistors and a die size of 533mm2.
I think this is the biggest chip ever put in a gaming card.
It seems to me that to make a Maxwell chip with a significant performance increase over the GTX 980, Nvidia will have to use a smaller node.
The rumors suggest the 20nm node has had problems and will likely be skipped altogether.
If Nvidia goes with the TSMC 16nm node, these won't be shipping until at least this time next year:
http://www.tomshardware.com/news/tsmc-apple-nvidia-denver-finfet,27538.html

On a new node, I could easily see this being the next generation of GPUs (e.g. GTX 10XX).
The most likely way we might see a GTX 980 Ti sooner is if the GM204 is not fully unlocked in the GTX 980.
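
As a sanity check on the die-size argument, here's a quick sketch computing transistor density from the figures quoted above (all three chips are on TSMC's 28nm node):

```python
# Transistor count and die area (mm^2) as quoted above.
chips = {
    "GK104 (GTX 770)":        (3.54e9, 294),
    "GM204 (GTX 970/980)":    (5.2e9,  398),
    "GK110 (GTX 780/780 Ti)": (7.1e9,  533),
}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6  # millions per mm^2
    print(f"{name}: {density:.1f}M transistors/mm^2")
```

Density comes out at roughly 12-13M transistors/mm^2 for all three, so on the same node more transistors really do mean a proportionally bigger die, and a significantly bigger Maxwell would want a smaller process.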
 
I am very tempted by the GTX 970, but I would love to see some Skyrim benchmarks with an ENB (I use Phinix ENB) and a grass mod like Dat Grass or Unbelievable Grass (which I have). I only have the low-res (1K) grass, but with an Intel 4570K and a 1GB AMD 6870 I get about 10-12 FPS in some outdoor areas. I would imagine this card might double that to 20 FPS (1920x1080).
This seems like a good benchmark test, and I would like my Skyrim to run a bit faster than 10 FPS! Has anyone tried Skyrim with a good ENB and a grass mod (plus a few other mods) outdoors? What FPS are you getting with your card?
 


You can use the Skyrim Flora Overhaul Basic and Realistic Lighting Overhaul mods without crippling your performance like this. Make sure you disable MSAA if running HD texture mods on a 2GB card.
 


This gets revised every month.
Pricing on the different cards wouldn't be meaningful right now anyway, as prices will change over the next two weeks.
 


Honestly this is probably one of the dumbest comparisons I have ever heard of.

-The 680 was only slightly stronger, and only for a few months; once the 7970's drivers matured, the 7970 firmly beat the 680.
-Sure, the 680 cost less, but the 7970 also had 50% more VRAM.
-The 680 only consumed about 10W less. Why does everyone act like it was so much more efficient? A slightly weaker card with far less memory uses slightly less power. Want a cookie?

Meanwhile:
-The 980 is ~20% stronger than the 290X.
-The 290X costs less (as it should).
-The 980 consumes 100W less. THIS IS MASSIVE.
-The 290X is a year old. Good lord, I'd hope a card that is a year newer is this much better.

The two cases could not be more different.
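
To put a number on the efficiency point, here's a rough sketch using spec-sheet TDPs (165W for the GTX 980, 290W for the R9 290X) as stand-ins for measured draw, together with the ~20% performance edge cited above:

```python
# Rough perf-per-watt comparison. Performance is normalized to the
# 290X; power uses board TDPs as stand-ins for real gaming draw.
perf_980, perf_290x = 1.20, 1.00
tdp_980, tdp_290x = 165, 290

ratio = (perf_980 / tdp_980) / (perf_290x / tdp_290x)
print(f"GTX 980 perf/W advantage: ~{ratio:.1f}x")  # ~2.1x
```

Even if real-world draw narrows that, it's the kind of gap the 680/7970 situation never had.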
 
Good job, Nvidia. But Nvidia fanboys, relax: these new cards make obsolete whatever Nvidia and AMD have had until this moment. Really, Titan Z? LMAO, it was launched a few months ago... such a joke card, and there were still people who bought those cards (and please stop the BS about "it's a professional card").
 
What is causing this "platform bottleneck" you speak of in so many of these benchmarks?

Is it because 7GHz GDDR5 is at its limit, or is it the CPU, X97 chipset, and DDR3 that are the limit?
 


If the limit were in the GDDR5, that would be a GPU limitation, not a platform bottleneck.
What they are saying is that no matter how fast the GPU is, you won't get a higher frame rate because it is being limited by something else in the platform. This can be observed from the benchmark results without knowing exactly what the bottleneck is.
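
Here's one way to see it in benchmark data (a sketch with made-up illustrative numbers, not figures from the review): if a much faster GPU barely gains anything at a given setting, something other than the GPU is the limit.

```python
# Made-up average-FPS numbers for illustration only.
results_1080p = {"GTX 770": 138, "GTX 970": 144, "GTX 980": 146}
results_4k    = {"GTX 770": 24,  "GTX 970": 38,  "GTX 980": 44}

def print_scaling(label, results):
    print(label)
    names = list(results)
    for slower, faster in zip(names, names[1:]):
        gain = results[faster] / results[slower] - 1
        print(f"  {slower} -> {faster}: {gain:+.0%}")

print_scaling("1080p (near-flat at the top: platform-bound):", results_1080p)
print_scaling("4K (gains track GPU speed: GPU-bound):", results_4k)
```

When the curve flattens like the 1080p case, a faster GPU can't help; only a faster CPU, memory, or other platform change would.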
 
Good review, but I must say I'm getting more and more disappointed by Tom's decision to stop including Crysis 3 benchmarks. Hopefully we'll see an updated 4K gaming review with two or three of these in SLI vs. two or three 780 Tis, etc., and PLEASE PLEASE PLEASE include Crysis 3 in that write-up.
 

That's funny, because Anandtech used the exact same analogy in their conclusion.

Quote:
"In many ways it feels like this latest launch has returned us to the PC video card industry of 2012. NVIDIA’s flagship consumer card is once again powered by a smaller and more potent consumer-class x04 GPU, and once again NVIDIA is swinging the one-two punch of performance and power efficiency. When GTX 680 was launched it set a new high mark for the video card industry, and now we see GTX 980 do more of the same. The GTX 980 is faster, less power hungry, and quieter than the Radeon R9 290X, so once again NVIDIA has landed the technical trifecta. Even if we’re just looking at performance and pricing the GTX 980 is the undisputed holder of the single-GPU performance crown, besting everything else AMD and NVIDIA have to offer, and offering it at a price that while no means a steal is more than reasonable given NVIDIA’s technical and performance advantage. As such GTX 980 comes very, very close to doing to Radeon R9 290X what GTX 680 did to Radeon HD 7970 over 2 years ago."
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/23
 