Nvidia GeForce GTX 1000 Series (Pascal) MegaThread: FAQ and Resources

Yeah, that "no real difference in performance followed by huge difference in performance" is concerning.

It may be we just don't have enough new games to really test that out with, since ME:C is newer than the other titles by quite a lot.

I still feel the 1060 3GB is a good card if you can sneak it into your budget, but the 1050 still isn't out yet.

 

The 1060 3GB is a good card. I wish more people would realize this.

I believe the reason everybody thinks the 1060 3GB is a piece of junk is that they expect it to play the latest AAA titles maxed out, textures included, just fine for only $200. In reality, this card is aimed more at non-ultra settings like high, and at slightly older games, which TONS of people still play.
 
Also, something I already mentioned: lowering a few visually insignificant settings can give you great performance boosts. Maxing out sounds cool, but it is not at all necessary to enjoy a game to the fullest, visually.

 


Ah, of course. Unfortunately, there are always downsides with rumors.
 


This. I have seen a few comments where people say things like "if I buy a $200 GPU, I want to be able to max out all the triple-A games I'll be playing." The thing is, at this price it is considered a mid-range GPU. Yes, that kind of performance might equal high end back in 2014, but just as hardware advances, games also keep using more and more demanding graphical features. Some people even think a slower card is the better choice as long as it has more VRAM; TPU's review of the 1060 3GB shows otherwise.
 


The problem is that people insist on running everything on max settings. If they don't, the fact that they're not on max settings gets under their skin like poison. Low to mid-range GPUs should be fine even for 1440p, since you can simply play on low to medium settings.

I also believe that the rate at which games get more graphically appealing is a lot slower than the rate at which they get more graphically demanding. Take a game 3 years from now, a 2019 game, that requires 2X the GPU horsepower of some 2016 game: it won't look 2X as good. Maybe 1.25X as good. I feel like this is the game industry's trick. This is my own conspiracy theory, but I feel they want to create the false illusion that graphics are improving in proportion to the demands they place on the hardware.
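Just to put toy numbers on that idea, here's a quick Python sketch. To be clear, the log curve and the 0.25 slope are purely my own assumptions picked to match the "2X power, maybe 1.25X looks" guess above, not anything measured:

```python
# Toy model of diminishing visual returns. Assumption (mine, not measured):
# perceived quality grows with the log of GPU horsepower, not linearly.
import math

def perceived_quality(horsepower):
    # Normalized so 1.0x horsepower -> 1.0x quality; the 0.25 slope is
    # chosen to reproduce the "2X power -> ~1.25X looks" guess above.
    return 1.0 + 0.25 * math.log2(horsepower)

for hp in (1.0, 2.0, 4.0, 8.0):
    print(f"{hp:.0f}x horsepower -> ~{perceived_quality(hp):.2f}x perceived quality")
```

On that curve, even 8X the horsepower only gets you to ~1.75X the perceived quality, which is roughly the pattern I'm describing.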
 
I agree with what you're saying, turkey. And I believe the reason it's this way is realism. More and more GPU horsepower is now needed just to make sure trees bend the right way, and for everything else in that category.

I've always wondered why people want ultra all the time. When did high start to become not enough?

Speaking of that, I created an entire thread dedicated to this subject: http://www.tomshardware.com/forum/id-3187965/discussion-high-settings.html
 


Very few games will ever look real if they continue in the direction they are heading. For one thing, they need to blur whatever is not in focus: if the foreground is in focus, the background has to be blurred for a realistic effect. The next thing is that all the graphics need to improve in unison. If you have insane textures but jagged edges, things look weird. Or if you have a super high resolution and full anti-aliasing but the textures are poor, things look weird. I still consider Mercenaries: Playground of Destruction the most realistic game I have played:
[Screenshot: Mercenaries: Playground of Destruction]


Are the textures as good as today's? Nope. Is the resolution as good? Nope. Is the anti-aliasing as good? Nope. None of that is as good, but I still feel this game looks more realistic. The reasons:

1) Lower resolution. Higher resolutions make things look more artificial. Don't believe me? Watch a movie in 480p and then watch the same movie in 4K. You'll notice more problems in 4K and find yourself pointing out flaws; in 480p there are fewer problems overall (see the quick pixel math after this list).
2) Unsaturated colors. Everybody wants all these pretty colors, but there is nothing realistic about that. In real life, the majority of colors you encounter are duller and greyer.
3) Unity. Everything works well together as a whole.
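Here's the raw pixel math behind point 1, by the way (I'm assuming 640x480 for 480p). Every one of those extra pixels is another place for a flaw to show up:

```python
# How many pixels each resolution pushes, relative to 480p.
resolutions = {"480p": (640, 480), "1080p": (1920, 1080), "4K": (3840, 2160)}

base = resolutions["480p"][0] * resolutions["480p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>5}: {pixels:>9,} pixels ({pixels / base:.2f}x 480p)")
```

4K works out to 27X the pixels of 480p, so every seam, texture flaw, and bit of bad CGI gets 27X the screen area to be noticed in.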

The lower a video game's resolution, the harder it is to tell apart from a movie. At a higher resolution, a game may look sharper as a video game, but it looks less real.
 


I can see your point, but only with video games. Movies in 4K do look a lot more realistic than in 480p, in my opinion. But everybody's eyes are different.
 


I forgot to mention: only movies with CGI. Movies without CGI, yes, they look more realistic in 4K. But if you watch The Hobbit in 4K, you see the crappy CGI throughout.
 


And that reminds me of something. I think we have known about the existence of Maxwell since 2009 (I think it was the GTC 2009 event where Nvidia updated their roadmap to show Maxwell). Then, by the time Nvidia launched Kepler, we already knew about the existence of Volta. Well, stuff happened, and Pascal ended up in between Maxwell and Volta. But looking at how Nvidia talks about their future architectures, we probably should have seen Volta's successor introduced on Nvidia's roadmap at this year's GTC. So far, though, Nvidia hasn't said anything about it.
 


There might be some truth to that, and those who have been into PC gaming for a long time will have noticed what you said. I think Crytek once talked about the same issue: polygon counts in games increased significantly, but to the naked eye that does not bring much graphical improvement. Since there are significantly more polygons, though, the games are also harder to run. So to us the graphics barely improve, yet the new games are still significantly more demanding. Some people even said the graphical update did not justify the performance hit in new games. In the end, Crytek said game developers need to do something different to "wow" gamers, and not just keep increasing polygon counts, which at some point give only diminishing returns.
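You can actually put rough numbers on those diminishing returns with a bit of geometry. This little sketch (the 100-pixel radius is just an arbitrary on-screen size I picked) computes the worst-case gap between a curved edge and a regular N-gon approximating it:

```python
import math

# Worst-case gap (sagitta) between a circle of radius r and an inscribed
# regular n-gon: r * (1 - cos(pi / n)). Radius is in on-screen pixels.
radius_px = 100  # arbitrary example: a curve spanning ~100 px on screen

for n in (8, 16, 32, 64, 128, 256):
    err = radius_px * (1 - math.cos(math.pi / n))
    print(f"{n:4d} segments -> max error {err:.3f} px")
```

Each doubling of the polygon count roughly quarters the error, so past 64 or so segments in this example the gap is already sub-pixel: doubling polygons again costs performance while changing nothing you can actually see, which is exactly Crytek's point.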
 


I think it has somehow become a placebo effect? Some people, unless they run a game on ultra, can't convince themselves they're playing it the way it should be played; like, unless it's on ultra (or maxed-out settings), they feel they're missing something from the whole experience. It doesn't matter that very high and ultra look pretty much the same unless you put the pictures side by side; they simply feel bad about not being able to push the settings to ultra.