Anarkie13 :
If my memory serves, television and movies have historically been broadcast at 24FPS in the past.
Gaming at 30 FPS is considered the minimum by most.
Movies are historically 24 Hz (really old ones were 18 Hz I think? Which is why Charlie Chaplin movies seem to be sped up - they are). TV programs are historically 60 Hz interlaced (one complete frame every 1/30 sec) in North America and Asia, 50 Hz interlaced in Europe.
For gaming, it depends on the type of game. 30 Hz is a general minimum for twitch games like FPSes and scrollers - basically anything where you have to aim quickly and accurately. 12-15 Hz is a more realistic minimum for exploration games like Minecraft. Obviously your gaming experience will be more pleasant at higher framerates.
60FPS is (was) considered the sweet spot as it matched the Hz of the display and rendered animation much smoother.
Nowadays, thanks to tech advancements, 144Hz is possible, even though it's pretty much beyond our visual spectrum. However, this overkill ensures you've maxed out your part of the equation.
It varies by person and what you're looking at. Small changes are not as visible as large changes. I'm one of those individuals cursed with being sensitive to PWM - the rapid flickering on/off of LED lights to simulate a dimmer light output. I can notice it up to about 800 Hz, though it only bothers me up to about 250 Hz. Yes I can see older fluorescent lights flicker due to the 60 Hz AC current (causes a 120 Hz flicker). Driving at night behind certain cars gives me a headache because some manufacturers cheaped out and used taillights which have PWM (flicker) at 60-120 Hz. (Curse you GM and Nissan. Lexus, Infiniti, and Ford have increased their PWM frequency out of the annoying range the last few years.)
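For anyone curious what I mean by PWM, here's a rough sketch (toy Python, made-up duty cycle, 120 Hz picked from the taillight range I mentioned) of how an LED dimmed by PWM is really just blinking fast:

```python
# Toy model of PWM dimming: the LED is fully on for `duty` fraction of each
# cycle and fully off for the rest. Perceived brightness ~= duty cycle,
# but the light is actually flickering at `freq_hz`.

def pwm_sample(t, freq_hz, duty):
    """Return 1.0 if the LED is on at time t (seconds), else 0.0."""
    phase = (t * freq_hz) % 1.0        # position within the current PWM cycle
    return 1.0 if phase < duty else 0.0

freq_hz = 120     # low PWM frequency, like the taillights I complained about
duty = 0.3        # "30% brightness"

# Averaging over one second ~= duty cycle = perceived brightness
samples = [pwm_sample(i / 10000, freq_hz, duty) for i in range(10000)]
print(sum(samples) / len(samples))   # ~0.3, even though the LED is only ever ON or OFF
```

The eye averages that flicker into "30% brightness" - unless, like me, you can see the on/off cycling itself.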
For smaller changes in intensity like video, I'm fine with 60 Hz, though the difference compared to 120 Hz (or 144 Hz) is visible.
k1114 :
You can easily see above 100fps and you can easily find the research to prove it. The soap opera effect is very noticeable and all you have to do is look at it for a split second to notice a difference if you ever looked at a 120hz+ tv.
I don't think the soap opera effect comes from 120 Hz itself. The soap opera effect comes from interpolating 24 fps movies up to 60 Hz. The whole point of using 120 Hz is so that you can display 24 fps without having to do interpolation (display each movie frame for 5 TV frames). 24 Hz doesn't divide evenly into 60 Hz, so you end up showing individual movie frames for alternately 2 or 3 TV frames (3:2 pulldown), which causes judder.
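If it helps, here's a rough sketch (plain Python, just the arithmetic - not any TV's actual logic) of why 24 fps fits 120 Hz cleanly but not 60 Hz:

```python
from collections import Counter

def cadence(display_hz, movie_fps=24, ticks=40):
    """For each display refresh tick, figure out which movie frame is on screen,
    then return how many refreshes each movie frame occupies."""
    shown = [tick * movie_fps // display_hz for tick in range(ticks)]
    counts = Counter(shown)
    return [counts[f] for f in sorted(counts)][:-1]  # drop the last (possibly cut-off) frame

print(cadence(60))    # [3, 2, 3, 2, ...]  -> uneven 3:2 cadence = judder
print(cadence(120))   # [5, 5, 5, 5, ...]  -> every frame held 5 refreshes, no judder
```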
You *could* interpolate up to 120 Hz for the soap opera effect. But it's hard to imagine TV manufacturers throwing in the extra money for processors which can interpolate the video at 120 Hz, when 60 Hz is "good enough". On a lot of TVs, the processors seem underpowered even for smart TV functions like Netflix or YouTube apps.
I haven't seen so many people mention not being able to see above 60fps in a long time. Before lcd, most crt were above 60hz. The display industry didn't just want higher hz recently. Tv is 30hz ntsc and movies are not all 24hz.
NTSC is 60 Hz interlaced. Half the screen is refreshed every 1/60 sec, resulting in the smoothness of 60 Hz, but the illusion of full resolution. Broadcast HDTV is either 1280x720 @ 60 Hz progressive (720p), or 1920x1080 @ 60 Hz interlaced (1080i - 1920x540 every 1/60 sec). Most TVs for the last decade are progressive scan though - they capture the interlaced signal in a buffer, combine it with the previous field to reconstruct (deinterlace) the full frame, then display the full frame at 60 Hz.
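As a rough illustration of the weave-style deinterlacing I'm describing (a sketch only - real TVs also do motion-adaptive tricks), treating frames and fields as plain lists of scanlines:

```python
# Sketch of "weave" deinterlacing: merge two consecutive fields
# (even-numbered lines from one, odd-numbered lines from the other)
# back into one full-resolution frame.

def weave(even_field, odd_field):
    """Interleave an even field and an odd field into a full frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # lines 0, 2, 4, ...
        frame.append(odd_line)   # lines 1, 3, 5, ...
    return frame

# Toy example: a 4-line "frame" split into two 2-line fields.
even = ["line0", "line2"]
odd = ["line1", "line3"]
print(weave(even, odd))  # ['line0', 'line1', 'line2', 'line3']
```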
With vsync on a 60hz display, if you drop below 60fps it causes micro stuttering, which is why you are noticing the lower fps so much. Smoothness is more about frame time than fps. This is why many pro reviews are not just giving fps data.
Actually, if this is a regular Nvidia Optimus laptop, vsync is essentially always on. With Optimus, the Intel integrated GPU drives the screen. The Nvidia GPU acts as a co-processor. When the Nvidia GPU completes a frame, it sends it to the Intel GPU for display.
There are a few gaming laptops which let the Nvidia GPU drive the screen (or external monitor) directly. But those are extremely rare.
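On the frame time point quoted above, here's a rough sketch (made-up numbers, just to show the idea) of why average fps can hide stutter:

```python
# Two made-up runs with the SAME average fps but very different smoothness.
# Frame times are in milliseconds.

steady = [16.7] * 6                      # ~60 fps, every frame delivered on time
stutter = [10, 10, 10, 10, 10, 50.2]     # same total time, but one 50 ms spike

for name, times in [("steady", steady), ("stutter", stutter)]:
    avg_fps = 1000 * len(times) / sum(times)
    worst = max(times)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst:.1f} ms")

# Both report ~60 fps on average, but the 50 ms frame in the second run
# is a visible hitch - which is why reviews now plot frame times and 1% lows.
```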