Nvidia GeForce GTX 1000 Series (Pascal) MegaThread: FAQ and Resources

Page 47


Yes, it is a GPU, but more than likely it is not as good as the i5's iGPU. It is an old AGP card and probably performs very poorly compared to today's cards, and even to integrated graphics. I can't see any part numbers or anything from the pic, but it appears to be an old ATI card of some kind.
 


Even if it is better, the AGP bus is not compatible (in shape or technology) with PCIe x16, so it wouldn't fit in any i5 motherboard. Plus, that looks like a Pentium 4, so even if the GPU were better, the CPU is not and would drag down the GPU anyway.

Cheers!
 


No problems maintaining boost clocks with my card on stock settings... I'm getting more than the advertised 1734 MHz when playing graphically intense games such as The Witcher 3, Crysis 3, Far Cry 4, and many other titles...

Perhaps the new drivers helped...
 


If your ambient temp is relatively low and there's plenty of airflow around that GTX 1080, then holding boost clocks should be fine, albeit with the fan ramped up to more than audible levels.
 


My fan is quiet. The settings are stock; however, it's a cool day today.

I'm using a good mini-ITX case, so my airflow is not bad, but not the best either.

A 4K monitor may stress this card more; however, 1440p at over 100 fps is also quite punishing...
 


If you need to run 4K or 6K (yeah, 6K, not kidding :)), just go into the Nvidia Control Panel and enable the 4.00x DSR factor.
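As a side note on how the factor maps to resolution: DSR's multiplier applies to the total pixel count, so each axis scales by the square root of the factor. A minimal sketch (the function name is just for illustration):

```python
import math

def dsr_resolution(width, height, factor):
    """Render resolution for a given DSR factor (e.g. 4.00x).

    DSR multiplies the total pixel count by `factor`, then downsamples
    back to native, so each axis scales by sqrt(factor).
    """
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

# 4.00x doubles width and height:
print(dsr_resolution(1920, 1080, 4.0))  # 1080p -> (3840, 2160), i.e. 4K
print(dsr_resolution(2560, 1440, 4.0))  # 1440p -> (5120, 2880)
```

So 4.00x on a 1080p panel renders at 4K internally; on a 1440p panel it's 5120x2880, which is roughly the "6K-ish" territory the joke refers to.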
 


I rarely use DSR, for several reasons; however, now I can probably get a solid 60 fps when I decide to finish Metal Gear Solid with DSR simulating 4K.

I also play Hearthstone with DSR sometimes; however, I always need to revert the setting in the Nvidia Control Panel afterwards, which is annoying.

I will be very busy over the next couple of weeks, so I will not have time to test my rig...
 


If DX12 delivers, then you will not need a single GPU to get the maximum VRAM (explicit multi-adapter can pool memory across cards). But that's still way in the future.

 


Hopefully...
 


This is a question that is totally dependent on you.

How much is the wait worth to you? How soon do you want to be able to enjoy your setup, and what's your budget? Once you buy something, it is already old technology in the sense that there will always be a newer standard; later rather than sooner, hopefully.
 
I was really set on getting a 4K monitor at one point and found that 1440p was a perfect middle-ground solution, and I've never looked back. Just an idea.
 


All you can really do is regularly check your nearby retailer or the online merchant of your choice. They should be available any time now, and some have already been spotted at Newegg and such. Be quick, though, because the first batch will disperse quickly.
 
What's come as a shock to me is how quickly 4GB of VRAM has become outdated.

I bought my first GTX 970 in October 2014, when 4GB was colossal. Then Shadow of Mordor came out, needing 6GB to max the HD settings. (You could run it on 4GB cards, though; maybe you'd suffer frame rates at the higher settings.)

Doom now has the same: 6GB needed to max out the settings. I tried it last night with HD textures and shadows (4GB 980 and the launch options to allow HD settings). I had to turn off shadows to maintain 60 fps, and I hadn't even reached the places where the frame rate suffers worst yet.

(N.B. Using FXAA(1TX) seriously tidied up the picture.)

So now we need either the 980 Ti or the 1080 to max even titles that are a year old. It has always been the case that we needed the top card to play the latest games maxed out. But it also means that within a year, maybe, there will be a title that one 1080 will not max, especially for people wanting to play at 1440p (e.g. The Witcher 3).

Notably, Doom on my 22" monitor was screaming to be played on a 24" at least. And if you're going to go 24", you might as well go 1440p, right?

Lucky for me, though, I won't pay full price for most new titles; good prices on games can be found. I bought Shadow of Mordor GOTY for about £10. Therefore, I want the best hardware to play it.
 
So I SLI'd my two 1080s yesterday using two regular SLI ribbon cables. Not sure what to make of the results. In Far Cry 4 it was almost flawless, apart from one weird hiccup where it appeared to freeze completely for a solid five seconds. The more concerning part to me was how much stutter (and what almost appeared to be frame drops or frame rubber-banding) was happening during the Heaven and Rise of the Tomb Raider benchmarks. The fps was extremely impressive, but the stutters/drops made it painful to play.

Not sure if I should attribute these symptoms to immature 1080 drivers, using the wrong SLI bridge, or these games and benchmarks not playing as nicely with SLI as I thought they did.
 
I know FC4 didn't like SLI at all when it launched; interesting to hear they worked it out.

24" for 1440p actually sounds kinda small; I'd say you'd want between 27" and 30" for that.

The other problem is that most games (with the exception of Overwatch) are released in a very unoptimized state, so it doesn't necessarily matter how good your GPU is.
 


At least only 2% out of the 10% of multi-GPU users will be mad at this, so I'm good with that. 😀
 
Only using one bridge has been the case forever, that's true. Until now. The HB bridge for 1080s, it seems, is nothing much more complicated than two bridges being plugged in at the same time.