Nvidia GeForce GTX 970 And 980 Review: Maximum Maxwell

Status
Not open for further replies.
Well, Maxwell looks nice and I may be after two 980s, lol. Already got my current cards on eBay, haha. The price difference between the 970s and 980s is rather large, but I still think that once overclocked, the 980s will be worth their price tag.
 

mapesdhs

Distinguished


Some reviews have a few numbers, but beyond that it's hard to find decent info.

I might get a 980 soon, in which case I'll run some AE tests before putting it into my gaming PC.

Ian.

 


Please do, I found your reviews very handy. It would also help if there were some videos.
 

Isaiah4110

Distinguished
Jan 12, 2012


I'm guessing these cards (or at least the 980) have the potential to out-"CUDA" the Titan. I'm looking for a card specifically to handle Adobe Premiere tasks (all CUDA and no OpenCL option, from what I can tell) and zero gaming, so these cards could be just what the doctor ordered. I just want to see some numbers. So if you have the ability to do that, I'd obviously much appreciate it.
 

mapesdhs

Distinguished


Reviews?... Are you perhaps confusing me with someone else? I don't think I've done any reviews yet.
Oh, what would the videos be of?


Isaiah4110, I think I'll buy a 980 early next week and get some tests done. I use the following test scene
for benching CUDA:

http://www.sgidepot.co.uk/misc/cuda.101.zip

i.e. rendering just Frame 96. Don't try this on an average system; even a Titan will take about 25 mins
to crunch one frame. The full 4-second sequence is for stress testing. My 3930K/quad-580-3GB setup does it
in about 15 mins. The finished frame looks like this (change the suffix to bmp/tga/rgb for original full quality):

http://www.sgidepot.co.uk/misc/cuda.101_Frame96.jpg

Ian.

 

Isaiah4110

Distinguished
Jan 12, 2012


Awesome. Having a rough idea of the Titan's (or any other higher-end card's) performance on that bench really helps a lot. Otherwise there's no point of reference, right?

Thanks mapesdhs
 

Casecutter

Distinguished
Jan 15, 2010
Once again I like the detail you put into drilling down on actual power usage with this "New Power Consumption Test Setup". Let me also say that what the OEM lists as "rated maximum power" is not indicative of power efficiency.

It's interesting that this time (unlike in the R9 285 review) Nvidia/Gigabyte came right back saying, "oh no, you can't replicate our reference claims because of the card's power target". I'm also more interested in this "ability to adjust to changing loads" during gaming. Such power modulation might be better captured by the mean power collected over more gaming titles (I'm not sure whether you benchmark one title or every title). By statistically plotting more data points, the average looks better or worse, as you said in the review: "The more variance there is, the better Maxwell fares." A mean calculation provides a broader swath, normally more indicative of total power consumed.

I really would like to see this new test setup acquire data from more than one game, run either on all-reference designs or, if the manufacturer never delivered a reference design, on similarly overclocked custom versions with a like percentage of overclock. This would even the playing field and give a more open picture of what folks really end up purchasing in the real world.

While I accept that Maxwell saves power, is it all because of micro-architecture improvements within the layout, or more the ability to granularly utilize groups of shaders to their full potential only for the millisecond they are needed? Or to put it another way: is Maxwell more an evolution of dynamic overclocking, having less to do with rudimentary clock frequency changes and more with delivering power based on the specific load per clock cycle?
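The "mean over more titles" idea above is just basic sampling statistics. A toy sketch of how per-title means and an overall mean would differ (the wattage numbers are made up purely for illustration, not measured data):

```python
import statistics

# Hypothetical per-sample power draws (watts) from three game runs.
samples = {
    "game_a": [165, 180, 172, 150, 190, 168],
    "game_b": [140, 210, 130, 220, 135, 205],  # highly variable load
    "game_c": [175, 176, 174, 177, 175, 173],  # near-constant load
}

for title, watts in samples.items():
    mean_w = statistics.mean(watts)
    stdev_w = statistics.stdev(watts)
    print(f"{title}: mean {mean_w:.1f} W, stdev {stdev_w:.1f} W")

# Overall mean across every sample from every title -- the "broader swath"
# that tracks total energy consumed better than any single game's figure.
all_samples = [w for run in samples.values() for w in run]
print(f"overall mean: {statistics.mean(all_samples):.1f} W")
```

A single title with an unusual load profile can skew the picture either way; pooling samples from several titles is what makes the mean representative of real-world consumption.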
 


By reviews, I meant your tests (the i5 7xx with the 7950 and 7970).

I'm sure it's you, Ian.

The video would explain what you did in the process and what hinders performance with older CPUs, and in that case include some reviews of the new card.

 

mapesdhs

Distinguished
Isaiah4110,

I hasten to add that my estimate is based on a guy posting, on a different thread, some results for his own
Titan-based system. I tried to find the thread just now but couldn't locate it. Anyway, point being, I've not
tested a Titan myself yet.

Oh, forgot to mention, the linked screenshot is also there as a targa file (change .jpg to .tga).

Now then, question is, which 980 to get... :)

Btw, I've sold one of my reference 3GB 580s to a movie company for 175 UKP. As I've posted so often,
580s retain significant value due to their comparatively good CUDA performance (the company will be
using it for general CUDA development).


TopLuca, oh I see! :D Yes indeed that's me, though not exactly a review, hehe. Oh, I've done the
3DMark tests, will be adding the data to that thread later. Some surprising results, especially in CF...

Re the video, it'll probably be easier just to include that sort of info as text. Funny you should mention
CPU performance as regards CUDA; one of the AE tests I want to do is on an old P55 system with
PCIe switches (ASUS P7P55 WS Supercomputer), an i7 870 (old by today's standards), 16GB RAM,
various GPU combinations, see how AE/CUDA tests compare to X79 monster systems, etc. The board
can use 4-way all at x8. I can try the tests with four 8800GTs, four 9800GTs, three GTX 280s, four
GTX 580s, and of course a 980 when I get one. Should be a giggle. Finding the time is always an issue
though. Never enough hours in the day. I'll finish the i5/7970 stuff first.

Ian.

 

MasterMace

Distinguished
Oct 12, 2010


I remember that launch too. The 580 was released in November 2010, going up against the Radeon HD 6970, released in December 2010.

The 7970 wasn't launched until December 2011. It went up against the GTX 680, which was launched in March 2012. Almost like you're comparing things from different generations.


As for the original quote, the 7970 GE was famous for drawing 40-70W more than the GTX 680 while performing worse in many titles, sounding like a vacuum cleaner in your case, and running hot enough to fry an egg, hence the need for a non-reference cooler. The 7970 could be overclocked well, but at that cost. This heat-and-noise issue has dogged AMD for several generations.
 

meat_loaf

Distinguished
Oct 20, 2011
I don't know why people here are bitching about and comparing the R9 290X vs the 980, since that's what I'm seeing in this forum and a lot of forums. There is absolutely no point comparing two cards from different generations, one being newer and the other older. AMD and Nvidia always trade blows: the R9 290 came first, then the 780 to beat it, then the R9 290X, and then the 780 Ti. Then Nvidia gives its consumers the junk Titan cards, to which AMD answers with the R9 295X2.

AMD will be releasing its R9 300 series, and we will have to wait until the beginning of next year to compare the newer-gen cards, since that's what AMD is aiming to defeat: the newer Maxwell. And then Nvidia will come back with a 980 Ti. So at this moment, all these comparisons need to stop. It's making people look silly and retarded at the same time.


 


The titan was released before the GTX 780.
The R9 290 and 290X were not released for another 5 months after this.
Nvidia released the GTX 780 Ti about 2 weeks after the R9 290 and 290X were released.
The dual GPU cards (R9 295X2 and Titan Z) were not released until early this year.
It would be fair to say that Nvidia beat AMD to market by five months on this generation of cards, and already had a response ready for AMD's top-performing card when it finally came to market.

It would make more sense to look at the release of a generation of cards rather than individual cards.
HD 4000 series Jun 2008
GeForce 200 series Jun 2008
HD 5000 series Sep 2009
GeForce 400 series Apr 2010
HD 6000 series Oct 2010
GeForce 500 series Nov 2010
HD 7000 series Dec 2011
GeForce 600 series Mar 2012
GeForce 700 series May 2013
Rx 200 series Oct 2013

Cards are always compared to the most recent cards available from the competition.
Back in the days of the HD 5000 and GeForce 400 series cards, ATI/AMD had a big advantage over Nvidia for efficiency. The HD 5850 was faster than the GTX 470 while using less power and generating less heat.
The GeForce 600 series took that advantage away from AMD, and they have yet to claim it back.
 
The titan was released before the GTX 780.
Keep in mind that it was so far overpriced that there was little adoption.
Basically, what Nvidia did with the 780 Ti was bring prices down to reasonable levels once there was some competition.

Anyway, price means a lot more to me than power consumption, although this time around Nvidia got the price/performance crown too.
They KNEW they needed to price it reasonably.
 


The Titan was never really a gaming card. On gaming cards they cripple double-precision arithmetic so as not to compete with workstation cards. The Titan was a fully enabled card suitable for other applications, and it had the additional VRAM to support this too. The very high price was also due to having no competition at this level. The GTX 780 was a cut-down version of this card meant for gaming, and it sold at a top-tier gaming-card price, like the GTX 980 that replaces it and the GTX 680 that came before it. The GTX 780 Ti was a more capable card that they released as required when AMD was able to offer a product that could compete with the GTX 780. They have priced their top-tier gaming card at pretty much the same level for a number of years (GTX 680, GTX 780, GTX 780 Ti, GTX 980). I'm not sure on GTX 580 prices and earlier, but probably in the same ballpark on release.
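The double-precision point is easy to see in numbers. A toy sketch using approximate published peak-throughput figures (treat the GFLOPS values as ballpark, not gospel; the FP64:FP32 ratios are the documented architectural caps):

```python
# Approximate peak figures. The fully enabled Titan runs FP64 at 1/3 of
# its FP32 rate, while the gaming-oriented cards are capped at 1/24 or 1/32.
cards = {
    # name: (approx. FP32 GFLOPS, FP64:FP32 ratio)
    "GTX Titan": (4500, 1 / 3),
    "GTX 780":   (4000, 1 / 24),
    "GTX 980":   (4600, 1 / 32),
}

for name, (fp32, ratio) in cards.items():
    fp64 = fp32 * ratio
    print(f"{name}: ~{fp32} GFLOPS FP32, ~{fp64:.0f} GFLOPS FP64")
```

Despite similar FP32 throughput across all three, the Titan ends up roughly an order of magnitude ahead in FP64, which is why it sold into compute work rather than gaming.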
 

mapesdhs

Distinguished
meat_loaf writes:
> ... There is absolutely no point in comparing that when two cards in different generation one
> being newer and the other being older. ...

On the contrary, comparing whatever one is able to buy at the present time is perfectly logical and sensible.


> Then nvidia gives its consumer the junk Titan cards which AMD gives R9 295x2.

Wow, so you're not biased then. :D


> AMD will be releasing its R9 300 series and we will have to wait until beginning of next year
> to compare the newer gen cards since thats what AMD is aiming to defeat the newer Maxwell. ...

That's a contradiction, as by then the time scales will already be months apart. The 980 is
available now. Suggesting one not compare it to AMD options available now is just stupid.


> ... Its making people look silly and retarded at the same time.

Nope, that's how you make yourself appear by saying people shouldn't compare products
available to buy from either side at any one time. Naturally one can advise and say:
hey, hold off if you can, AMD's next thing is around the corner; but that means nothing
to anyone who wants to buy a card right now.

And here was me thinking AMD fans generally tried to take the moral high ground.
I guess not. :}

And yes I'm being provocative, as what you've suggested people do is just daft IMO.

Ian.

 
Moderator hat on....
Healthy debate is great, but let's please keep the name-calling out of it, as it often escalates. If you have a point to make, support it with facts or reasoning rather than simply saying that everyone who doesn't share your point of view has no argument.

Consumer hat on...
Anything that could potentially advance graphics capabilities on my PC has my vote and is good for competition. As of right now, from a gaming standpoint, the 980 is a great card, and paired with the PG278Q G-Sync monitor it's what I've been waiting for for years. As dynamic sync technologies become increasingly mainstream, they are likely to change each of our requirements in a GPU. Numbers aside, G-Sync has had more of an impact on my gaming experience than the jump from the 700 to the 900 series.

I can't wait to see what FreeSync monitors hitting the market will do for competition. Around that time, a lot more G-Sync monitors will be available as well. I really do hope the industry settles on a single standard, as it would be nice not to have to pair GPUs with a particular monitor's sync tech. Why can't a GPU just send frames and the monitor just display them? We're no longer dealing with tubes and a need for static refresh rates, but when everything switched from analog to digital, static refresh rates stuck around.
 


I didn't. I responded to a previous post, which was quoted in my post.
I would have no reason to mention the release dates of any of these individual cards other than to correct the misinformation in that post.
 

mapesdhs

Distinguished


It's a pity, though, that outside the US the 980's price is being hiked rather a lot. Compared to the US
RRP mentioned in the review, in the UK it's typically about a third more. Ouch...

Now I'm thinking I might wait for the 980 Ti, or at least for prices to settle somewhat.

Ian.

 

Cinerir

Honorable
May 21, 2014


And what about that: http://www.tomshardware.com/news/microsoft-dx12-directx-12,26360.html ?

Nvidia says it will support the DX12 API on all the DX11-class GPUs shipped so far, including those belonging to the Fermi, Kepler and Maxwell architectural families. Though DX12 will compete with AMD's own Mantle API, AMD was at Microsoft's session, and Raja Koduri did confirm DirectX 12 support for AMD's GCN hardware.
 