One problem with watching recorded gameplay is that the video has been compressed, usually more than once. Compression can smooth out motion as well.
3D games are typically rendered with the entire scene in focus. Since your eyes are focusing on the screen in front of you, that makes sense. Motion blur is added to make the image seem more like film or TV, which some people prefer.
Cameras have focal points, so when you record footage you only see what the camera sees: the background and foreground are out of focus, which makes the objects in focus stand out. And when actors or objects move, the camera captures a blur trail, instead of pixels simply changing position the way they do in a rendered game frame.
60 FPS video can often have a 'soap opera' effect. In American television, most studios shot TV at 30 FPS (60 interlaced fields per second) to match the 60 Hz refresh rate of TVs. This led to a smoother, more 'live' look than 24 FPS film. PAL regions always used 25/50, so their conversion was easier.
This is less and less true with frame interpolation on 120 Hz and 240 Hz TVs. Though that in itself is post-processing, not the actual refresh of the screen.
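As a very rough illustration of what those interpolating TVs do (this is a simplified sketch I'm adding for clarity; real sets use motion-vector estimation, not a plain blend), a generated in-between frame can be approximated by averaging two real frames:

```python
def interpolate_frame(frame_a, frame_b, t=0.5):
    """Naive frame interpolation: blend two frames pixel-by-pixel.

    Frames are flat lists of pixel intensities (0-255). Real TVs
    estimate motion vectors instead of blending, which is why this
    simple approach would produce ghosting on fast motion.
    """
    return [round(a * (1 - t) + b * t) for a, b in zip(frame_a, frame_b)]

# Doubling 60 FPS to 120 FPS means inserting one synthetic frame
# between each pair of real frames:
frame1 = [0, 100, 200]
frame2 = [50, 150, 250]
print(interpolate_frame(frame1, frame2))  # [25, 125, 225]
```

The point is that the extra frames are invented by the TV after the fact, which is why interpolated 'smoothness' is not the same thing as a game actually rendering at a higher frame rate.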
The practical effect of playing a game at 60 FPS vs 30 FPS is reaction time: the sooner you can see something, the sooner you can react to it. The difference between a 16.67 ms and a 33.33 ms frame time might not seem huge, but combined with network lag and game engine latency it can provide an advantage. In single-player games the computer is running in essentially real time, so seeing its actions earlier may still help.
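Those frame-time numbers fall straight out of the frame rate, since each frame stays on screen for 1000 ms divided by the FPS:

```python
def frame_time_ms(fps):
    """How long each frame stays on screen, in milliseconds."""
    return 1000 / fps

print(round(frame_time_ms(60), 2))  # 16.67
print(round(frame_time_ms(30), 2))  # 33.33

# Worst case, a new event at 30 FPS shows up one 60 FPS frame later:
print(round(frame_time_ms(30) - frame_time_ms(60), 2))  # 16.67
```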
If it is just about the visuals, you don't strictly need hardware powerful enough for 60 FPS.
There are probably better explanations out there, as this topic has been brought up many times.