Nvidia GTX 1660 Ti Reportedly Up To 19 Percent Faster than GTX 1060



This has been debunked so many times that it's become an internet meme at this point. How many fps a person can perceive is a tricky question to answer because it's not a binary detect/not detect issue, but that number is well above 60 fps.
 
It's also hard because perceiving continuous motion and detecting one specific frame are different things. If you show 1 white frame among 399 black frames at 400 FPS, people can spot the white frame, but that doesn't mean they could watch a movie at 400 FPS and see every frame.
 


This is pretty much correct. While we only really "see" at about 24 fps, we can "notice" dramatic change at much higher rates ("noticing" and "seeing" are not the same at all). Above roughly 90 fps we can't really detect differences in framerate with our eyes anymore, but we can of course still easily spot what our brain registers as an "anomaly", such as screen tearing or a white frame amongst black.

Frame syncing has lessened the noticeability of the difference by a fair margin, though. What often allowed "detection" was frame tearing, which registers as an anomaly, and the faster that anomaly occurs the harder it is to detect. That's why you have people saying they can see the difference between 120 and 144 fps in games; with something like G-Sync enabled, I'll bet those same people would be hard pressed to tell the difference between those two rates in any game.

As for the claims that people can game better at 144 than at 90 or 120, so they "need" 144, I highly doubt it. At that point, being good at a game is 99.9999% about everything else, and 60 fps is enough for 99.999% of all people on the planet. For online games, any latency in the syncing will outstrip whatever perceived advantage comes from running higher than, say, 60 or 90 fps. So while one might detect a difference on screen, the in-game difference will likely be largely nullified.
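Just to put rough numbers on that, here's a quick Python sketch. The 40 ms ping is an assumption for illustration, not a measurement; the frame times are plain 1000/fps arithmetic.

```python
# Rough comparison: frame time saved going from 60 fps to a higher rate,
# versus an assumed online round-trip latency. The ping value is illustrative only.

def frame_time_ms(fps: float) -> float:
    """Milliseconds between successive frames at a given framerate."""
    return 1000.0 / fps

assumed_ping_ms = 40.0  # hypothetical round trip to the game server, not a measurement

for fps in (90, 120, 144):
    saved = frame_time_ms(60) - frame_time_ms(fps)
    print(f"60 -> {fps} fps saves {saved:4.1f} ms per frame "
          f"(vs. an assumed {assumed_ping_ms:.0f} ms network round trip)")
```

Even the biggest jump (60 to 144 fps) only buys back about 10 ms per frame, which sits alongside tens of milliseconds of network round trip in a typical online match.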

Der8auer did a blind monitor refresh test, and he could (just barely) tell the difference between 120 and 144, but I am sure it was the screen tearing he was noticing - not the actual framerates.
 


I would generally agree, but I do somewhat disagree that anything higher than frame-synced 60 fps is "required" or somehow magically makes one a better online player.

If the margin of sync error is more than 7 ms, then a single frame at 144 fps (1000/144 ≈ 6.9 ms) is smaller than that margin of error. Likewise, if the standard deviation of human reflex response is more than 7 ms, a single frame is smaller than that variance as well.
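To put numbers on that 7 ms point, here's a minimal sketch; the margin is just the figure from the post above, and the rest is plain 1000/fps arithmetic.

```python
# Frame duration at common refresh rates vs. an assumed 7 ms margin of error
# (sync jitter or reflex variance, per the discussion above).

MARGIN_MS = 7.0  # assumed margin from the post; not a measured value

for fps in (60, 90, 120, 144):
    frame_ms = 1000.0 / fps  # duration of a single frame in milliseconds
    verdict = "smaller" if frame_ms < MARGIN_MS else "larger"
    print(f"{fps:>3} fps -> {frame_ms:5.2f} ms per frame ({verdict} than the {MARGIN_MS:.0f} ms margin)")
```

144 fps is the only rate in that list where a whole frame (about 6.94 ms) fits inside the 7 ms margin, which is exactly the point above.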

Don't get me wrong - I am agreeing with you, but at the same time, I can see a valid distinction between "meaningful difference to the average gaming experience" and "I can notice something is different".

If your livelihood depends on winning grand championship gaming tournaments, you're not going to want to take any chances, but again, "not wanting to take chances" and "meaningful difference to the average gaming experience" should not be conflated either.

My 2 cents.
 
Depends what games you play, tbf. For something like Skyrim, 4K at max settings on a good 60 Hz IPS panel would be a much better experience than a 240 Hz 1080p TN. For CSGO or Overwatch, you'd take the higher refresh rate.

And for online play, your ping is the bigger factor anyway. Once you have a good ping, then you can talk refresh rate.