Nvidia GeForce GTX 970 And 980 Review: Maximum Maxwell

Status
Not open for further replies.


Agreed. I don't get why NVidia sometimes launches at some insane price (Titan/Titan Z)
and other times prices things reasonably.

I would guess the low price of the 290/290X put some fire under NVidia's butt to keep the price low
 
NVidia has really put a challenge to AMD with the value of the GTX970.

It's getting very difficult to compare video cards now. It used to be mainly just frame rates, but now we have not only older features like PhysX but new ones like the real-time lighting plugin, VR support, and a new anti-aliasing mode (MFAA).

AMD has Mantle (arguably not a big deal) and TrueAudio (again, not sure it will matter) but in my opinion they've got a tough road ahead of them.

NVidia's optimizations for efficiency (in part due to MOBILE concerns) are paying off for the desktop too:
a) reduce noise
b) allow higher overclocks
c) reduce or eliminate thermal throttling performance loss

I wish AMD well as competition is great, but long-term I'm concerned that AMD can't compete. The main problem may be a lack of money to support the large investment needed for the driver staff and GPU designs.

They've done really well, but when I look at the history and their financial status, it's starting to look like what happened with Intel and their CPUs: AMD was arguably neck-and-neck for a while but couldn't keep up.

The GTX970, such as the Asus Strix 0dB version, is looking like the best value in a graphics card in a long time (perhaps ever).

(not a "fanboy" but great job NVidia!)
 
Why the hell isn't there an 8GB version of the 980? I mean, a 4K monitor will surely lag in the future with 4GB of VRAM! It's expected! Even a 6GB version could still make this card an ace!
 


By the time 8GB is needed, an even stronger GPU core will be out.

It is likely possible to make a 6GB version, so I wouldn't be surprised if one comes out later.
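For rough perspective on the 4GB-vs-8GB debate above, the full-screen render targets themselves are only a tiny slice of VRAM; it's texture and geometry data that actually fills the card. A back-of-the-envelope sketch (a hypothetical helper, assuming 32-bit colour and three full-screen buffers; real engines vary a lot):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough VRAM footprint of a few full-screen render targets, in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

print(framebuffer_mb(1920, 1080))  # ~23.7 MiB at 1080p
print(framebuffer_mb(3840, 2160))  # ~94.9 MiB at 4K
```

So even at 4K, a handful of full-screen buffers accounts for well under 100 MiB; the reason games push past 4GB is high-resolution texture assets, not the resolution of the framebuffer itself.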
 

HD 7970 is rated at maximum power 250W
GTX 680 is rated at maximum power 195W
That's a 55W difference in power rating.
From the benchmarks at the launch of the GTX 680 it was much stronger on minimum frame rates and less so on average frame rates. Minimum frame rates are much more important to how smooth a game feels.
The HD 7970 was launched months before the GTX 680 so I don't know why the HD 7970 drivers would not be mature by the time the GTX 680 was launched.
 
Further to my previous post, and kind of in reply to VincentP:
I like taking screenshots, kind of like photos in the fantasy world of Skyrim. So outdoors I want all the eye candy turned on to take a photo, which is when it slows down to 11 fps in some spots.
I have an i5 4570K, AMD 6870 1GB, 8GB of memory, and a Samsung 840 SSD. I currently only have a 1GB card, though with ENBoost it doesn't crash, if I can actually load the particular save game to start with!
I have ELFX rather than Realistic Lighting Overhaul, and I don't think Flora Overhaul would replace a good grass mod, just complement it. The ENB I use does tend to look better indoors than outdoors, but outdoors is where it slows down the most; I do use the ENB depth of field too. So I would like to see some benchmarks with Skyrim, an ENB, and a high-density grass mod. Heck, I don't even have any high-def textures, apart from my character, and it still brings it to a crawl! I know it's 3 years old, but a DX9 game is the most demanding game that you could possibly throw at this card! Anyone have any outdoor fps with Skyrim plus a demanding ENB plus grass with this card?
 


You are starved for VRAM. With a 3/4GB card you would no doubt be getting much better performance.

And a GTX970 has a much faster core too. I would be surprised if you got less than 30 FPS.
 


I suspect you may be correct, as the card does appear to be limited by some kind of memory-to-graphics-card bandwidth bottleneck, i.e. I don't think the card is running at 100 percent. Though I'm not 100 percent sure, and I would like to see some kind of real-world experience. To put it in perspective, the only benchmark I can find with Skyrim and ENB (http://www.computerbase.de/2012-09/test-nvidia-geforce-gtx-650/20/) gives a 770-equivalent card (a 7970) a 2x boost compared to a 6870.
 
MajorTom6000,

You might be interested in the Skyrim pics thread on overclock.net, there's often a fair amount of discussion
about performance issues, ENBs, VRAM, etc. See:

http://www.overclock.net/t/1165090/your-best-skyrim-awesome-pictures/11600_100

That points to more recent posts, but jump back a page or two and you'll find VRAM-related
comments about 7970s, GTX 780s, Titans, older cards too, all sorts of issues. And of course the
submitted pics are worth checking out (gionight is my favourite contributor).

Ian.

 
mapesdhs, thanks for that link. With all the searching I have done, I hadn't come across that thread before.
Whilst it hasn't 100 percent immediately answered my question, at 1170 pages there is definitely a lot of it to look through! And some very nice shots in there too!
 
It is a huge thread indeed. 😀 Naturally, the very early posts refer to somewhat older tech (and presumably
people will be referring to Skyrim topics which have changed since then, new versions of sw, mods, etc.), but
skip forward to maybe half way through, or two thirds, and from then on it's well worth reading, especially as
the pics get better & better.

Or if you're incredibly lazy (like me!), I've been grabbing my personal favourites; I could upload them to my
site as one big zip (PM me), though the down side to that is they wouldn't be in any kind of order, or show
who made what. I keep them more as a way of showing people what games can be like these days, those
new to gaming, family, etc.

Btw, because updates from the thread come frequently, unless I'm posting about something & expecting
replies, I usually wait until there's 100 new posts pending, then I read the thread.

Ian.

PS. I pinched one of gionight's pics for my OCN PC avatar (he does amazing stuff):

http://cdn.overclock.net/a/a5/a5bc6779_1387697.jpeg

PPS. One of the people posting pics on that thread is using a quad-Titan setup...


 


I've already saved a few pics myself :) Though it will take me a while to look through. I've been addicted to the Nexus Skyrim images lately (http://www.nexusmods.com/skyrim/images/view/?)
Anyway, thanks again for that, and I'm off to bed as it's 2 in the morning here!

PS. Quad Titan, ouch! Hope he gets better than 10fps!
 
The 980 certainly is interesting, but paying that much again for new cards is something I can't afford. I recently upgraded to 780 SLI, and I really hope these will suffice for a while now (and for The Witcher 3).
What tempts me most about the new cards is the lower power consumption.

I have no intention of playing at 4K resolution; 1080p is enough for me, though I use downsampling in some games.

Afaik Nvidia said it would bring some of the new features to older cards too. I really hope the 780 isn't too old for that.
Also, I am afraid that Microsoft and Nvidia will team up and make DX12 reliant on W9 and the 9xx series...
 

It isn't a matter of teaming up. Hardware support is required for new versions of DirectX. The GTX 780 was made to support DirectX 11.2 and it will never support anything newer.
As for Microsoft, they have decided not to support DirectX 12 in Windows 7 but later releases will support it.
DirectX 11.3 will be released for Windows 7 supporting many of the same features, but again not supported by the GTX 780.
 
Except you still need 2x 970s to play at any decent resolution above 60fps... They are affordable, but I don't think I will upgrade from my 2x 770 SC ACX yet. Maybe next year!
 
I think DirectX 12 will work on the 700 series cards just like it will on the 900 series. I don't see them making it 900-series-only, since the 700 series is still so new. I mean, the 780 Ti/Kingpins/Titan Black and Z are still so recent that cutting them out to only support the stuff that just dropped a few days ago would cause some huge customer issues. I just don't see it happening, and saying they WON'T, as if someone knows for sure, is not good gossip to be spreading. Unless you work for Nvidia or Microsoft, you do not know for sure, and you will just have to wait and see like the rest of us.
 


Like anyone else on these forums I can only speak from experience.
Typically new DirectX version features like new shader models require hardware implementation.
You can't just add this to existing cards, so if these were made before the standard was finalised you couldn't blame Nvidia for not supporting them.

I can see from your link that they are going to support it, so I guess the hardware support is there going back to Fermi. This means the majority of GeForce 400 series cards onwards will support it. That is impressive.
 
Like I said before, when making such bold statements, always have something to back them up with. No harm, no foul; there is just a lot said here on Tom's that has no merit, so the more evidence you have to back up larger statements like that, the better.
 
Wouldn't mind a pair of GTX 970s in SLI. Don't need it, but it would only be 40W more than what my system currently draws with a single HD 7970. :lol: I am looking forward to the midrange cards as upgrades for the HD 5850s in my other systems. Blizzard is saying that WoW will be more GPU-dependent for Warlords than it traditionally has been. My HD 5850s are probably going to struggle at the same settings used currently once Warlords comes out. If Nvidia keeps up the price/performance/watt trend for Maxwell, I will be making the move to Nvidia in my spare rigs.
 