Wrong! 12.5 fps is roughly the point at which the human brain starts to perceive smooth motion. It's why pretty much every Warner Bros and Merrie Melodies cartoon runs at 12.5 fps: that was the minimum they could get away with to keep costs down while still providing smooth playback.
25 fps became a standard when sound was introduced. Again, it was the minimum they could get away with while maintaining motion and keeping the sound in sync. It has nothing to do with your eyes; it's all about cost.
After that, 25 fps was adopted as a standard. Things pretty much hit a status quo until digital arrived, and because digital is a much sharper image they had to introduce two things: a slightly higher frame rate (29-30 fps) and fake motion blur.
60 fps isn't even a thing in movies. Currently the highest-fps standard is 48 fps, for Ultra HD, IMAX and the like.
For gaming, again, it's a cost thing. As for the two-foot thing, lol...
It makes little difference how close you sit to the screen: if you're looking at 30 fps from 2 feet away and switch to 60 fps, you will see the change exactly the same way as you would from 15 feet away. The image looks sharper because it is.
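The point boils down to simple arithmetic: the time each frame is held on screen is 1000/fps milliseconds, and that interval is the same no matter how far you sit from the screen. A minimal sketch (the helper name `frame_interval_ms` is just for illustration):

```python
# Frame interval: how long each frame stays on screen.
# Viewing distance changes how large the image appears (spatial detail),
# but the frame interval, a purely temporal quantity, does not change.
def frame_interval_ms(fps: float) -> float:
    """Milliseconds each frame is displayed at a given frame rate."""
    return 1000.0 / fps

for fps in (12.5, 25, 30, 48, 60):
    print(f"{fps:>5} fps -> {frame_interval_ms(fps):.1f} ms per frame")
```

So going from 30 fps to 60 fps halves the frame interval from about 33.3 ms to about 16.7 ms, whether you watch from 2 feet or 15.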