Nvidia GeForce GTX 960: Maxwell In The Middle

The GTX 960 would be a much more interesting card if there were a model with a 256-bit memory interface. I suspect that single change would translate to a huge improvement in performance.
 


The GTX 960 Ti will probably come with a 256-bit memory interface.
 
Fantastic work on the Noise Level and Frequency analysis page. As a mainstream 1080p user, this is the best page on the whole internet for making an informed purchase decision.
 

Because this is not mainstream pricing - at least not at the time of the GTX 960's launch.

If you go on NewEgg.com, search for the R9-280 and sort by price, only four of the 17 entries are below $200 (one of them has two sellers), three more are at $200, and the rest are over $215.

You have to compare regular street prices with regular street prices. Not pricing anomalies and discounts. If you start comparing whatever price with whatever other price, that just creates endless pointless arguments. Right now, $210 is a more typical price for the R9-280 - the price point where you can get almost any model you want instead of shopping solely on what is cheapest after discounts that change on a daily basis.
 
For $249 we can have an overclocked R9-290. The same price as an overclocked GTX 960, but 40% faster. And we can max out game settings up to 1440p.
 
Hmm, seems more like an x50-type card than an x60, but maybe Nvidia is just realigning their product spectrum. For those complaining, this MSRPs at $200 USD, which is pretty low for an x60-series card and about right for its performance numbers. It has a lower power window and lower cost, and thus also lower performance, which puts it solidly in the midrange product bracket. I'm expecting a 960 Ti eventually with a 192-bit memory interface priced at $225~250 MSRP.
 
I always find it annoying that GPU reviews only compare CURRENT generation cards. I'm sure I'm not alone in this, but how many people on CURRENT generation GPUs are even looking to upgrade? Wouldn't it make sense to stack these things against at least some previous-gen boards (e.g. HD7950 etc.) to give those on older GPUs an idea of where they fit in on the performance curve?
 
Please add a Quadro or two to the CAD benchmarks, just to show whether those cards make any sense in a DX or even an OGL work environment.

With that said, thank you for including CAD benchmarks at all! If anything, it clearly shows that AMD is still slacking for no good reason and that the 960 is hardly a noticeable improvement over the 760/660.
 
The calls for a 4GB version... This card is a wee bit under-powered, much more so than the GTX 760 4GB is/was. They may produce a 4GB variant, but it doesn't make much sense given the hobbled specs of the 960.
 
Disappointing. I bought a GTX 670 almost two years ago for around the same price the GTX 960 is selling for now. Seems like they perform about the same. The new GPU only needs 120W vs 170W for the GTX 670, but that's hardly enough to justify a side-grade.
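Quick back-of-envelope math on that 50W gap (the gaming hours and electricity price below are my own assumed numbers, just to illustrate how little it adds up to):

```python
# Rough yearly savings from the 170W (GTX 670) vs. 120W (GTX 960) gap.
# 10 hours/week of gaming and $0.12/kWh are assumptions, not review data.
watts_saved = 170 - 120
hours_per_year = 10 * 52
kwh_saved = watts_saved * hours_per_year / 1000   # ~26 kWh
cost_per_kwh = 0.12
print(f"~{kwh_saved:.0f} kWh/year, roughly ${kwh_saved * cost_per_kwh:.2f} saved")
```

A few dollars a year in electricity is nowhere near enough on its own to justify swapping the 670 for a card that performs about the same.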
 
Have you contacted ASUS about the power draw from the motherboard's PCIe slot? Have you measured other Strix 9xx cards, and did they behave like this as well?
I wonder if it's a faulty specimen or a faulty PCB design, and whether it's fixable with a BIOS update.
Stressing the motherboard like this seems pretty bad (especially compared to the graphs for the other 960s).
 


I sent all raw data to Asus one week ago but have got no answer yet. I was not able to measure other Strix cards because I got only one sample, and only in rotation (a typical problem in Germany), so I had to send it on to the next outlet after measuring, without permission to disassemble the card. Funny? One of my close friends in Asia also measured the Strix on a similar system and got lower performance and lower power consumption. His logged boost clock rate under the same load conditions was lower than here. Golden sample, or simply luck with the chip (over 80% ASIC)? And: Asus is using the reference PCB. The spikes might be the result of a damaged cap or simply a modified voltage regulator. I will verify this. 😉

 


I have some limited data on my site, still doing tests when I can. Just added some 980 numbers. Problem now is finding newer tests I can run (companies don't seem to release standalone benchmarks anymore).




Check my Viewperf data. In general, don't use consumer cards for pro apps unless visual quality doesn't matter and/or the app you're using doesn't use OGL. Both AMD and NVIDIA recently changed their drivers (at the same time as app vendors changed their codes) to make consumer GPUs a lot more friendly to pro users wrt performance, but it's at the expense of visual quality, precision, etc. Bit misleading IMO, especially for those using pro apps that are not in the current Viewperf 12 suite. Compare my data for the same GPUs tested for VP 11 vs. 12.

Ian.

PS. The Quick Edit function in the forums isn't working properly. Submitting a correction can show a blank post, while in reality nothing has been changed.



 
PS. The Quick Edit function in the forums isn't working properly. Submitting a correction can show a blank post, while in reality nothing has been changed.
Thx for reporting. I saw it yesterday and requested a hotfix 😀
 
So... during gaming, the GTX 980's power consumption is between a 570 and a 580, and above a 680 and 770? And the 970 is above a 560 Ti?

lol Looks like it is not really low power usage at all. The 960 has a real power rating... while the 970/980 have fake power ratings. Nvidia is cheating here...
 


Nvidia isn't cheating. Performance/watt is better than ever. Using less power than a 580, and stomping it into the ground, is a good thing. It stomps my HD 7970 using far less wattage. The same goes for the 970. The performance/watt of the 960 is impressive against the older-gen cards as well. It is about 10% faster than a GTX 760, but uses roughly 3/4 of the power to do so. You are taking nitpicking to the next level, or are just an AMD fanboy.
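Rough math on that last claim, taking the ~10% faster and ~3/4 power figures above at face value:

```python
# Relative perf/watt of the GTX 960 vs. the GTX 760, using the rough figures above.
relative_performance = 1.10   # ~10% faster than a GTX 760
relative_power = 0.75         # ~3/4 of the power
gain = relative_performance / relative_power - 1
print(f"~{gain:.0%} better performance per watt")   # ~47%
```

Call it roughly a 45-50% perf/watt improvement in a single generation, which is the opposite of cheating.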
 


Thank you for the answer - so we wait for Asus' reply. As mentioned in the review, it probably won't cause (immediate) hardware failure, but I think it could cause stability problems, especially in overclocked systems. Also, over time it would more likely damage the motherboard rather than the card itself.
Anyway - when asking about other Strixes, I meant measuring other versions of Maxwell: the 970/980 Strix. In the review ( http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html ) there is such data only for the Gigabyte versions, and they are below 75W. I wonder if the "bigger" Strix cards have the same problem, but since they are not included in that review, I assume they were not measured?
 