The light shutter on a movie projector isn't instantaneous, nor is it good enough to show the momentarily stopped frame all at once. Once the film frame has moved into the light path and stopped, the shutter opening sweeps across the stopped frame, letting light pass through the lens and onto the screen. So in effect the frame is rapidly drawn onto the screen. The shutter then blocks the light passing through the frame, undrawing the image, and the screen is momentarily <b>black</b> while the next film frame is positioned behind the shutter (that's right, the only light reaching the screen is ambient light, hopefully in a dark theater). The image is shown for longer than it takes to switch frames behind the closed shutter, but both still happen 24 times a second: a lighted image, then a blank screen. So in reality it is more like a strobe, with a longer period of light on than off, but still happening at 24 FPS.
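The strobe timing above works out with some quick arithmetic; the 60% shutter-open fraction below is an assumed illustrative number, not an actual projector spec:

```python
# Sketch of the shutter cycle described above: each of the 24 frames per
# second is a "light on" interval followed by a dark interval while the
# film advances. DUTY is an assumed illustrative figure.
FPS = 24
FRAME_PERIOD_MS = 1000.0 / FPS           # ~41.7 ms per frame
DUTY = 0.6                               # assumed fraction of the cycle with the shutter open

light_on_ms = FRAME_PERIOD_MS * DUTY     # screen shows the frame
dark_ms = FRAME_PERIOD_MS * (1 - DUTY)   # screen is black, film advances

print(f"frame period: {FRAME_PERIOD_MS:.1f} ms")
print(f"light on:     {light_on_ms:.1f} ms")
print(f"dark:         {dark_ms:.1f} ms")
```

Whatever the exact duty cycle, the point stands: light and dark alternate 24 times a second.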
Now for the human side. Switch off a light in a dark room and you continue to see the light for a while. That's persistence of vision: the nerve cells in the retina keep firing after the light source is removed. In other words, the eye has a time delay when the light changes, which lets a 24 FPS movie look fluid. This same nerve-cell response time also causes fast-moving lighted objects to blur. Go outside in the sun, spread your fingers in front of your eyes, and move your hand back and forth. As you speed up shaking your hand, what do you see? Naturally, a blurring of the fingers. Now, how many fingers can you count in a second as they move rapidly across your field of vision? Not many. Still, you can tell your hand is moving faster and faster by the amount of blurring that occurs, not by the number of fingers you can count crossing your field of vision.
So 24 FPS allows for fluid movement in a movie theater, except when the titles and credits scroll by; then you see some flicker, or at least I notice more. Why? Now your eye has something with more contrast, an amplified difference it can discern, especially if the credits aren't anti-aliased with soft shadows blending the scrolling characters into the background. Likewise, with black and white text on a monitor being updated at 60 Hz (60 times/sec), the eye can discern the flickering. Take that same 60 Hz monitor and blend in a 24-bit color background with textured text nearly matching it, and the flickering goes away. I watch my DVDs on my monitor at an 85 Hz refresh rate, but the frame rate for a movie is still, guess what, 24 FPS, buddy. The video stream may be 30 FPS (actually 29.97 FPS NTSC), but the movie was originally recorded at 24 FPS, so really I could be seeing less than 24 FPS of unique images. Well, when I watch a DVD up close on my monitor it looks pretty smooth running at 24 FPS. Kinda blows that up-close-monitor theory.
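For what it's worth, the usual way 24 FPS film gets fit into a ~30 FPS (60 fields/sec) NTSC stream is a cadence called 3:2 pulldown; here's a rough sketch of the idea (the helper name is mine, just for illustration):

```python
# Sketch of 3:2 pulldown: film frames alternately occupy 3 video fields
# and 2 video fields, so 4 film frames -> 10 fields = 5 video frames,
# and 24 film frames -> 60 fields = 30 video frames per second.
def pulldown_fields(film_frames):
    fields = []
    for i in range(film_frames):
        repeat = 3 if i % 2 == 0 else 2   # 3-2-3-2... cadence
        fields.extend([i] * repeat)        # field shows film frame i
    return fields

print(pulldown_fields(4))      # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
print(len(pulldown_fields(24)))  # 60 fields -> one second of NTSC video
```

So even though the stream ticks along at 29.97 FPS, there are still only 24 unique film frames per second underneath.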
Why wouldn't a game be smooth at 24 FPS? You said it in your discussion: freeze a movie frame on the DVD and it's blurry wherever objects are moving fast. Think about it. When you were moving your hand back and forth, you could tell its speed by the amount of blurring you saw. A DVD (film at 24 FPS converted into MPEG-2 format) contains images recorded from film, which exhibits a property similar to the eye: it blurs when motion is present. So basically our brain reads the blurriness as motion, just like with your hand outside. That's why 24 FPS can look good on a high-resolution monitor playing a DVD as well as on the movie screen.
A video camera is sort of the same way; it also has persistence and will blur naturally if the motion is fast enough. So 30 FPS seems to be OK with video, but not perfect. If the image has a lot of subdued colors, the eye pretty much thinks it's fluid. Put stark black and white text on your computer monitor, though, and, well, you know, it sucks.
Now most video games don't have motion-blur routines, so only the human eye's limitations come into play. So, like you said, some people can tell the difference between 30 FPS and 60 FPS. Or can they? Well, if we're dealing with people, I guess there will be some variation in a person's ability to discern FPS. I doubt anybody could tell me the FPS of a given scene, as in "that one is running at 42 FPS and that one at 48 FPS." NOT!!!!! But how about 30 FPS versus 60 FPS? Well, once again, games don't work that way. If you ever run 3DMark2000 you will notice the frame rate jumping all over the place: one moment it's at 140 FPS, the next at 42 FPS. Did you notice the slowdown? Probably not, but once the FPS drops below your tolerance zone (maybe 18 FPS, 20 FPS, or 30 FPS if you're really good), then you'll notice: hey man, that's jerky.
Do you really think a computer game runs at a constant 60 FPS, and that this is the magic number for smoothness? Benchmarks give averages, and the game may never even play at that frame rate at any point during the whole session. In reality, the frame rate is jumping all around; only when it dips below your tolerance or threshold do you say, hey man, this is jerky, and I got fragged!
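Here's a quick made-up example of how an average hides the jerkiness; all the frame times are invented for illustration:

```python
# Per-frame render times in milliseconds: mostly ~7 ms (140+ FPS) with a
# few 70-80 ms spikes. The average FPS looks great on a benchmark
# report, but the worst frames dip well below a ~20 FPS tolerance zone.
frame_times_ms = [7] * 90 + [70, 80, 70] + [7] * 90

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s       # what the benchmark reports
worst_fps = 1000.0 / max(frame_times_ms)      # the dip you actually feel

print(f"average FPS: {avg_fps:.0f}")   # well over 100
print(f"worst FPS:   {worst_fps:.1f}") # below anyone's tolerance zone
```

The average says "smooth"; the three bad frames are the stutter you notice.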
One more item: your monitor imposes its own limitation on FPS. If your monitor is at a 60 Hz refresh rate (all scanlines, pixels, etc. being updated 60 times a second), then any frame rate greater than 60 FPS will not actually be displayed on the screen.
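That ceiling boils down to a simple minimum; a trivial sketch (the function name is mine, purely for illustration):

```python
# The monitor can only show refresh_hz distinct images per second, no
# matter how fast the game renders. Extra rendered frames never reach
# the screen.
def displayed_fps(render_fps, refresh_hz=60):
    return min(render_fps, refresh_hz)

print(displayed_fps(140))   # capped at 60 by the refresh rate
print(displayed_fps(42))    # below the cap, shown as-is
```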
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 02/27/01 05:29 PM.</EM></FONT></P>