Well, I tested my Quake4 frames and found that I get 60 usually (I have the vsync on, can't stand texture tears) and ~25 when a good firefight is on.
As for whoever thinks they need 120fps, sorry but you're an idiot. Your eyes work just like everyone else's, which means they can interpret 30fps max. But hey, if you want to spend your money on vid cards, I'm sure someone's happy about it.
You hear sound in games within 2 ms of the sound source.
Your mouse movements take about 1 ms (or less) for the PC to receive.
Your game at 30 fps (1000/30) is drawing images on screen 33.333 ms 'behind' the event.
That's quite a delta between sound (unless delayed by the game for distance, etc., but even so) and video.
30 fps is where motion looks smooth to most people.
60 fps (16.667 ms) is what most decent gamers want minimum, as they can see and feel the difference.
120 fps (8.333 ms) is on par with the draw time of a decent TFT or CRT screen.
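The frame intervals quoted above all come from the same formula, 1000/fps. A quick sketch to reproduce them:

```python
# Frame interval in milliseconds for the frame rates discussed above.
for fps in (30, 60, 120):
    interval_ms = 1000 / fps
    print(f"{fps:>3} fps -> {interval_ms:.3f} ms per frame")
```

Running it gives the 33.333 / 16.667 / 8.333 ms figures used throughout this post.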
Since you think the difference is so low, can you run your monitor at 30 Hz with V-sync to cap your frame rate at 30 fps?
God, when under 40 fps I can feel it in the mouse, and see the difference between 40, 60 and 120 fps on screen (and yes 120 is over my refresh rate, but you still feel the difference).
Instead of frames per second, when talking about highly interactive games (not non- or low-interactive movies or film), people should really say "simulation cycles per second".
I feel sorry for people who can't perceive the difference between 60 and 80 fps, let alone 60 and 30 fps:
Would you rather be 8 ms behind the server, or 33 ms? Or can't you notice the difference there either?
Then again, maybe not:
I suspect not everyone in a large sample can see the difference between 60 fps and 80 fps, and/or it may take years of experience with both to start seeing it.
33.333 ms of frame draw time, plus the player's reaction time to what they see (say 45 ms total), is a pathetic reaction time in combat.
8 ms, plus a decent 5 ms reaction time, for a total of 13 ms, is a far more acceptable reaction time in combat.
Run Half-Life 2 with "fps_max 30" for a week, then do it at "fps_max 120" on a decent PC. You'll at least see, and feel (via interaction), part of the difference. A bullet (in VBS1) will travel around 25 m in 33 ms; not reacting until after you're dead doesn't look smart.
😛
http://www.virtualbattlespace.com
http://www.virtualbattlefieldsystems.com
http://www.google.com.au/search?hl=en&q=VESL+VBS1&btnG=Google+Search&meta=
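The "25 m in 33 ms" figure above checks out if you assume a muzzle velocity of roughly 750 m/s. That speed is my assumption here (typical of a rifle round), not a figure taken from VBS1 itself:

```python
# Rough check of the bullet-travel claim: how far does a round move
# during one 30 fps frame? The muzzle velocity is an assumed value
# (~typical rifle round), not something pulled from VBS1.
muzzle_velocity_mps = 750      # assumed muzzle velocity, m/s
frame_time_s = 1 / 30          # one frame at 30 fps, in seconds
distance_m = muzzle_velocity_mps * frame_time_s
print(f"{distance_m:.1f} m per 30 fps frame")  # 25.0 m
```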
Film lacks interaction, so an initial delay of 33 ms doesn't matter, so long as the movement is smooth enough to fool the senses. Watching a movie requires little eye movement, and slow reaction time isn't going to change the experience much.
It has already been suggested that fast-paced sports broadcasts move to 60 fps so people see everything that happens.
================================================
Sure, 30fps is enough for single player, but when you're online you need 40fps minimum, and an average close to 80fps just to keep up.
One day I might film a fast-paced game of Half-Life 2 DM, just so people know how fast it really is. Converted to 30 fps WMV (assuming no motion blur, averaging 4 game frames into 1 film frame), you would see people die and not even notice where the shot came from.
😛
Going from 30fps to 60fps will actually drop your 'game ping' by 16ms or so when playing online (game ping includes game engine time and frame draw; network ping does not
😛). You'll notice my ping in that screenshot is much lower than everyone else's. As a result my Kill : Death ratio is 27:0 (= infinite K/D ratio, as you can't divide by zero
😛).
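That "16ms or so" is just the difference between the two frame intervals; a minimal sketch of the arithmetic:

```python
# 'Game ping' saving from doubling the frame rate: the frame-draw
# component of total latency shrinks from 1000/30 ms to 1000/60 ms.
delta_ms = 1000 / 30 - 1000 / 60
print(f"{delta_ms:.3f} ms shaved off 'game ping'")  # 16.667 ms
```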