Nvidia GeForce GTX 1000 Series (Pascal) MegaThread: FAQ and Resources



They did get 2,182 MHz, though. That's "close enough" for me.
 


That's the problem: for some people, a $200 video card should be able to max out the latest triple-A title while maintaining 60 fps at 1080p. Anything less is unacceptable.
 


Well that's just too bad for them 😀
 
Here is a welcome product from EVGA for Pascal cards, called the PowerLink: http://www.gamersnexus.net/news-pc/2591-new-evga-power-link-at-pax-and-dg-87-case

It lets you connect the card's power cables at the bottom of the card, by the motherboard, instead of having them poke out the top the way most connectors are situated. It looks nice as well. The write-up says it will support most cards, even non-EVGA ones, as long as the power connectors are in the standard place.





Also, another 1060 3 GB review, of the EVGA SC card: http://hexus.net/tech/reviews/graphics/96412-evga-geforce-gtx-1060-sc-gaming-3gb/ It loses about 5 fps to the 6 GB SC model, but runs rather warm due to the single-fan setup: about 77°C at load, which is rather high for such a low-powered card.
 
The write-up suggests the internals have taken care of that, but the reviewers also weren't allowed to open it, so I can't say for certain. It's kind of like riser cards: a great idea if done right, but done wrong they're nothing but a mess. EVGA tends to do things right, so I'm hopeful we may see more of these niche products.
 


There's no way to take care of that perfectly. Anything that plugs into anything else adds resistance and causes voltage drops. But I'm overreacting anyway; it won't really matter. Only in a rare case would it actually affect overclocking.
 
I miss the rear power connectors on GPUs. With smaller cases a rear connector can be a problem, intruding on the space around the HDD cage, but today's higher-end GPUs are shorter than older ones, so it's not as big a deal unless you are doing an mATX build.

I've never been happy with the side power connectors on the last three generations of my GPUs (275, 680, 970). They not only look ugly through the side window, they also make cable management more difficult, especially when trying to maximize a good airflow solution.

I wish GPU makers would reconsider rear power connectors. It makes one wonder whether they use side connectors for design convenience, or purposely to avoid inhibiting potential sales over depth concerns from those with smaller cases.
 


I know it's about performance and not looks, but sometimes, when I see a card like this one, I think to myself: "who approved this? were drugs involved?"

I also wonder what the point of hybrid Pascal cards is. If I remember correctly, Gamers Nexus did a DIY hybrid which performed much better than the reference card, but pretty much on par with higher-tier, air-cooled partner cards.
 
I'd expect it to perform just like all the other 1080 cards, but cooler and "maybe" a bit quieter. It should be a custom board rather than the FE PCB that Gamers Nexus used; not that it will make a difference in real-world fps, but at least it's something.
 
We're at the point where reference and board-partner cards perform about the same. Board partners can probably offer better cooling or throw in something extra that isn't available with a reference purchase. Some people used to think that bolting a closed-loop cooler onto the card would enable crazy OCs that aren't possible on air.
 
To all nVidia GFE users, a word of caution in case you see performance drops:

About two days ago, Nvidia released a new version of their GeForce Experience utility. This is a great tool for keeping graphics drivers up to date and for assisting beginner players with optimization of game settings. Unfortunately, it seems that in our game's case, GeForce Experience is overly optimistic, using very aggressive values for the "scaling" parameter (up to 300% on mid-range cards), or for shadow quality.

The term "scaling" sounds innocent, but it actually stands for a super-sampling coefficient. Super-sampling is a brute-force method of anti-aliasing: the game renders the whole scene at a higher resolution than the target screen resolution, then downscales it to the actual screen size, making the picture sharper in the process. Setting it to a high value is sure to strain even a very powerful 3D card. To illustrate with an example: if you have a full-HD monitor, the screen is covered by roughly 2 million pixels. With "scaling" set to 200%, you are asking the game to render everything at a higher resolution, producing 4 million pixels and effectively doubling the graphics card's load.
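The arithmetic in that example can be sketched quickly. This is a minimal illustration, assuming (as the note's own numbers imply) that "scaling" acts as a multiplier on the total pixel count rather than on each screen dimension; the function name is my own, not anything from the game or GFE.

```python
def render_pixels(width, height, scaling_percent):
    """Pixels the GPU must shade per frame, treating the scaling
    percentage as a multiplier on the total pixel count."""
    return int(width * height * scaling_percent / 100)

native = render_pixels(1920, 1080, 100)   # ~2.07 million pixels at 1080p
scaled = render_pixels(1920, 1080, 200)   # ~4.15 million: twice the load
print(native, scaled)
```

At the 300% GFE reportedly picks for mid-range cards, the same math gives roughly 6.2 million pixels per frame, three times the native workload.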

The GeForce Experience tool is happy to go and overwrite the variable values in the game's config.cfg, and the user may not even be aware that the game's settings have changed. If you are unhappy with the game's performance, we urge you to go into the game's Options screen and review the parameters on the graphics settings tab there, to adjust the performance vs. looks balance according to your preference.

This is an official patch release note from SCS, a small developer of a couple of games that I love (the American and Euro Truck Simulator series, plus others), stating that the performance issues were due to GFE assuming things incorrectly. So, like I said, if you're experiencing issues, look at what GFE is doing, in case you have it installed.

Cheers!
 


But why would you need to do that in the first place? 😵

I mean, having to go outside of GFE to disable DSR is... kind of dumb? I know they are intrinsically different settings, but shouldn't GFE sit "on top" of that? Like have a second way to disable it inside GFE?

Oh well, I guess if nVidia notices, they'll put such a setting inside GFE... Or at least I would expect them to.

Cheers!
 
My experience with GFE is that many times, the suggested settings are just plain wrong. You can almost always do better tuning the game to suit yourself. Also, your fellow gamers, if you belong to a community, are a better resource for what works in a particular game than anything suggested by NVidia.
 


Dumb as it might be, I'd prefer Nvidia not let GFE work on top of NVCP, lol.
 



Personally, I just use the suggested settings as a point of reference, then try to find my own best settings. And sometimes GFE manages to find settings I'd never seen while tuning the game's graphics options in-game.