LordVile:
It’s also hard because the number of continuous frames vs seeing one specific frame is different. If you have 1 white and 399 black frames at 400 FPS, people can see the white frame; that doesn’t mean they could watch a movie at 400 FPS and see everything.
This is pretty much correct. While we only really "see" at about 24 fps, we can "notice" dramatic change at much higher rates ("noticing" and "seeing" are not the same thing at all). Above 90 fps we can't really detect differences in frame rate with our eyes anymore, but we can of course still easily see what our brain picks up as an "anomaly", such as screen tearing or a white frame amongst black.
Frame syncing has lessened the noticeability of the difference by a fair margin, though. What often allowed "detection" was frame tearing, which registers as an anomaly, and the faster that anomaly occurs the harder it is to detect. That's why you have people saying they can see the difference between 120 and 144 fps in games; with something like G-Sync enabled, I'll bet those same people would be hard pressed to tell the difference between those two rates in any game.
As for the claims that people game better at 144 than at 90 or 120, so they "need" 144, I highly doubt it. Being good at a game is 99.9999% down to everything else at that point, and 60 fps is enough for 99.999% of the people on the planet. In online games, network and sync latency will outstrip any perceived advantage gained by running above, say, 60 or 90 fps, so while you might detect a difference on screen, the in-game difference will likely be somewhat nullified.
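To illustrate that last point with back-of-the-envelope numbers (the latency range here is an assumed ballpark for a decent connection, not a measurement):

```python
# Back-of-the-envelope comparison: the per-frame time you gain from a higher
# frame rate versus a typical online round trip. Latency figures are assumed.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

gain_60_to_144 = frame_time_ms(60) - frame_time_ms(144)   # ~9.7 ms
gain_90_to_144 = frame_time_ms(90) - frame_time_ms(144)   # ~4.2 ms
assumed_rtt_ms = (20, 60)  # rough round-trip range for online play

print(f"60 -> 144 fps gains ~{gain_60_to_144:.1f} ms per frame")
print(f"90 -> 144 fps gains ~{gain_90_to_144:.1f} ms per frame")
print(f"versus a network round trip of roughly {assumed_rtt_ms[0]}-{assumed_rtt_ms[1]} ms")
```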
Der8auer did a blind monitor refresh test, and he could (just barely) tell the difference between 120 and 144, but I am sure it was the screen tearing he was noticing, not the actual frame rates.