There are several different issues in your post. I'm by no means an expert, but I'll try to shed some light on them (or increase the confusion):
-Human vision:
It is a quite common misconception that the human eye can only see 30 fps (actually, I think the common rate is 24 fps in the theater and 30 on TV). The human eye "checks" reality over 1000 times per second (I think it's over 2000) - but our vision isn't a camera (we see with our brains), and there is a "memory" effect, particularly in the relatively slow color sensors (the cones?). To further complicate things, eye sensitivity varies from person to person (like hearing ability) - so I may not perceive the difference between 85hz and 100hz, but another person could.
-Refresh rate (RR) vs FPS: The RR is the number of times per second the screen redraws the image - it is needed to create the illusion of a steady image. The FPS is the number of distinct frames per second - needed to create the illusion of movement. Obviously, the RR limits the effective FPS (frames that are never shown do not help the illusion of movement).
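Just to illustrate that last point, here's a trivial Python sketch (the function name and the numbers are made up by me, purely for illustration):

# Rough sketch: the refresh rate caps how many rendered frames
# can actually reach your eyes each second.
def displayed_fps(rendered_fps, refresh_rate_hz):
    # Frames rendered beyond the refresh rate are never shown,
    # so they can't add to the illusion of movement.
    return min(rendered_fps, refresh_rate_hz)

print(displayed_fps(120, 85))  # -> 85: the monitor is the bottleneck
print(displayed_fps(40, 85))   # -> 40: here the game, not the monitor, limits it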
-RR - sore eyes:
A low RR does stress our eyes - the higher the better, but 85hz is generally enough. Some people notice improvements above 100hz. I can't remember exactly why eyes get sore, but I *think* that at lower refresh rates our eyes have to work harder to maintain the image - we don't blink as often and end up with sore eyes. Note: not many monitors can do better than 120hz at resolutions over 1024x768.
-FPS - films vs games:
24 fps is enough in the movies. Why do we need 60 in games? We see with our brains - with a bit of help from our eyes. In a movie - dark environment, indirect light, naturally blurred images - 24 FPS is enough to fool the brain into thinking there is movement. In a game - direct light, a small field of vision, and sharp, very different consecutive images during fast movement (like a 90° turn in Quake) - the brain may need twice the FPS to be fooled. A slower-paced game wouldn't need as many FPS to create the illusion of smooth movement.
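To put some rough numbers on that Quake example (the quarter-second turn time is just my guess, to show the idea):

# How far the view jumps between consecutive frames during a fast turn.
# The bigger the jump per frame, the choppier the turn looks.
def degrees_per_frame(turn_degrees, turn_seconds, fps):
    return turn_degrees / (turn_seconds * fps)

# A 90° turn done in a quarter of a second:
print(degrees_per_frame(90, 0.25, 24))  # -> 15.0° between frames
print(degrees_per_frame(90, 0.25, 60))  # -> 6.0° between frames - much smoother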
Hope this helps, and please feel free to correct anything that isn't accurate.
Arbee