Nvidia GeForce GTX 960: Maxwell In The Middle

Depends on what other settings are enabled. The big VRAM eater is AA. Going from 1080p to 2160p (4K) quadruples the pixel count, but will only consume something like 30~40% more VRAM, with ~50% more being the worst-case scenario. Going from no AA to 8x MSAA, on the other hand, can easily double VRAM utilization if not more, because every render target has to be stored with a large amount of additional per-pixel sample data to facilitate rendering. FXAA/MLAA don't increase VRAM requirements by much because they are post-processing techniques that work without all that extra data being stored.

Texture size also determines how much absolute VRAM is needed. Remember that games will often try to fill all available VRAM with textures, even ones that aren't immediately needed, on the off chance they ~might~ be needed sometime soon, so be careful with reported "VRAM utilization" numbers.
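For a rough sense of scale, here's a back-of-the-envelope sketch in Python (my own numbers, not from the article) of how MSAA multiplies render-target memory. It assumes a single 32bpp color target plus a 32bpp depth/stencil target, each storing one sample per pixel per MSAA sample; real drivers use compression and auxiliary buffers, so treat it strictly as a ballpark:

```python
# Ballpark render-target VRAM for a given resolution and MSAA level.
# Assumes one 32bpp (4-byte) color target and one 32bpp depth/stencil
# target, each storing `samples` samples per pixel. Drivers compress
# and add extra buffers, so real numbers will differ.
def render_target_bytes(width, height, samples=1, color_bpp=4, depth_bpp=4):
    return width * height * (color_bpp + depth_bpp) * samples

for samples in (1, 2, 4, 8):
    mb = render_target_bytes(1920, 1080, samples) / (1024 ** 2)
    print(f"1080p at {samples}x MSAA: ~{mb:.0f}MB of render targets")
```

Even at 8x MSAA that's only ~127MB of raw sample storage at 1080p; the bigger cost in practice comes from games keeping several such buffers (G-buffers, HDR targets, etc.), each of which grows the same way.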

http://www.tomshardware.com/reviews/graphics-card-myths,3694-5.html

Finally, resolution primarily determines framebuffer size, and your system typically keeps two of these. To give an idea of how small they are relative to total graphics memory:

1920x1080x32bpp = 8,294,400 bytes or ~7.9MB
2560x1440x32bpp = 14,745,600 bytes or ~14.1MB
3840x2160x32bpp = 33,177,600 bytes or ~31.6MB

So a 4K resolution would only require about 32MB for each framebuffer, of which there are two (or three with OpenGL-style triple buffering).
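To double-check the arithmetic above, a quick Python sketch, assuming 32bpp = 4 bytes per pixel and "MB" meaning mebibytes:

```python
# Framebuffer size = width * height * 4 bytes at 32bpp.
# "MB" here means mebibytes (1024 * 1024 bytes).
for w, h in ((1920, 1080), (2560, 1440), (3840, 2160)):
    size = w * h * 4
    print(f"{w}x{h}x32bpp = {size:,} bytes or ~{size / 1024**2:.1f}MB")
```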
 
Well, I didn't really have anything else to spend the extra $100 on, unless you want to spring for an 8GB model. I mean, you might get lucky and find a 780 for about $400, but at 1440p and $400 you're most likely looking at a 970 or a 290X. Personally, I'd rather have the 512-bit bus for such high resolutions.
 


Well, I'm already having to run the fans on my 7970 off a motherboard header, since they won't engage unless the card is near overheating. I tried multiple PSUs, and even a totally different system, and had the same issue. The GTS 450 is pretty old, and while it's still OK for WoW, it struggles at times in D3. The system it's in now is mostly used for D3 lately.



I would just get a single GTX 970. The only way I would run 2x 960s is if I bought one and then bought another later because I needed a little extra performance.
 
Still on a 760 with an i7-4790K, and I get more FPS on ultra in Arma III than this article says the 760 gets; same with BF4, I get great FPS. The 760 still holds up really well in the latest games; no need to upgrade from one to a 960, IMO.
 
760 is still a beast. Saw a 960 and 970 in Best Buy today, came very close to buying the 970 but figured I'd hold on to my 760 a while longer. Kepler all the way!

EDIT: Gave in to the pressure for my birthday. D:
 
Maybe I'm the only one who spotted this... The article on page 3, Test Systems and Benchmarks, states the following:

"Despite that massive number of pixels, 4K displays are becoming more popular every day thanks to sub-$600 options like Asus' PB298Q:"

The picture shown next to it isn't a PB298Q either, and more importantly, the PB298Q isn't 4K: it's a 21:9 monitor at 2560x1080. That's not 4K at sub-$600; it's 2K for a lot less.

This is a PB298Q, see below:

http://www.amazon.co.uk/PB298Q-Widescreen-Multimedia-Monitor-2560x1080/dp/B00EKJHG54/ref=sr_1_1?ie=UTF8&qid=1425495779&sr=8-1&keywords=PB298Q

:)
 
What I don't get is things like this H.265 support. So I spend $550 on their so-called top-end card that now looks to be shortchanged in this area, but it's covered by their lower-end $200 card? And all I have to look forward to is some voodoo hybrid software workaround?

You would think that, as of today, the 980 at $550 should be fully functional and not have these shortcomings.
So what other hidden, unsupported, or kinda-supported features will I find the mack daddy card losing out on to a lesser card?

With all these little tidbits coming out about these 900-series cards, it seems like buyer beware, or you get what you pay for [and sometimes a lot less].
 
You don't want hardware H.265 encoding on the GPU. No regular program where you'd want to use it supports it, and the software that does support GPU video encoding produces a lot of visual artifacts that, even when minimized, aren't worth the trouble compared to the speed you gain.

Anyway, you're not buying mid/high-end GPUs to encode video. Not yet, anyway.
 
I teach about video encoding. It's a thing that interests me a lot.

As it is, any and all forms of GPU video encoding are for people who have iPads and want to encode video quickly before a trip. Since those people are usually not tech-savvy, nor do they have the appropriate hardware for this, GPU encoding remains a largely unused feature.

Professional video editing software like Premiere or After Effects cares about video quality far too much to use it. Both Premiere and AE have been using CUDA for years; that should tell you that GPU acceleration can be, and is, great and heavily used, but GPU video encoding isn't. It's just not ready for serious use, and it hasn't improved noticeably in nearly a decade.
 
Thing is, I didn't care until I started seeing folks find this out after the fact, like this guy and others like him:
https://forums.geforce.com/default/topic/809510/what-is-going-on-with-the-980s-hevc-h-265-encoder-and-hardware-acceleration-decoder-for-hevc-/


Let's face it: for the money, on the top-end card you're getting shortchanged, and if it's there on the 960, how hard could it have been for the 980/970?

Do you feel he got what he paid for?
 
That is actually fairly common practice. x60x cards usually come out after x80x cards, and while weaker in performance, they usually pack new features not available on the more powerful but older x80x cards.

This tech is nothing to write home about, just something engineers have been toying with. If useful, it will be included in (x+1)80x cards later on.
 
Take the GeForce 6600, a 2004 GPU that has a hardware H.264 decoder, while the clearly superior, yet older, 6800 does not:

http://en.wikipedia.org/wiki/Nvidia_PureVideo#The_first_generation_PureVideo_HD

"Starting with the release of the GeForce 6600, PureVideo added hardware acceleration for VC-1 and H.264 video"

"The GeForce 6600 (NV43) was officially launched on August 12, 2004, several months after the launch of the 6800 Ultra."

The most recent example is the biggest: the GTX 750 Ti has GTX 980 tech (Maxwell), but was released into the GTX 7xx family despite clearly having a newer architecture than the GTX 780. At the same time, the 750 is also clearly inferior in performance.
 
Having hardware H.264 decode may seem like a big deal, but when you had the money to get a 6800-series GPU, you were not worried about your computer being capable of decoding H.264.

Having that tech on a 6600-series GPU made a lot more sense, as that card was usually paired with much weaker computers. Even so, the vast majority of computers at the time had no problem decoding H.264, hence:

"This tech is nothing to write home about, just something engineers have been toying with. If useful, it will be included in (x+1)80x cards later on. "
 


Only if you pair it with an FX-9590. Then it is a true space heater. :lol:
 


Do that with an FX-9590, add dual R9 290Xs in CrossFireX, and break out the s'mores!

 