Everyone here apparently hates to admit it, but the push for more FPS has always been more fluff than anything about properly rendering content. 30fps is often cited as roughly the rate at which normal human sight takes in the real world. Pushing up the rate at which images pass across the screen works for shooters and certain other games because your goal is to speed up how fast things in the game happen, thereby increasing potential difficulty without having to make the AI any more sophisticated.
Unfortunately, if you're watching anything cinematic, you're essentially forcing the brain to process images at a rate it can keep up with but isn't used to seeing, so the image itself looks "off". Because you're not engaged in anything else, the way you are in a shooter, that detail becomes noticeable. In a shooter or a racing game the impression is diminished, because your brain is already busy processing your actions and concentrating on the game. Essentially, your brain tells itself to ignore that sensory detail; it's occupied with other stuff for now.
Given how cinematic many games are today, particularly Ubisoft's titles, it makes a lot more sense to just lock to 30fps throughout rather than jumping between 30 in the cinematic sections and 60 in the less cinematic ones, all for gains in image quality that range from minimal to negative, just to satisfy people who think more fps is automatically better.
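And locking to a fixed rate is the easy part technically. Here's a minimal C++ sketch of what a hard 30fps cap looks like in a game loop, with hypothetical update()/render() functions standing in for a real engine's per-frame work (a real engine would also deal with vsync and dropped frames, which this skips):

```cpp
#include <chrono>
#include <thread>

// Hypothetical per-frame work; stand-ins for a real engine's calls.
void update(double dt) { /* advance game state by dt seconds */ }
void render()          { /* draw the current frame */ }

int main() {
    using clock = std::chrono::steady_clock;
    // Lock to 30fps: every frame gets the same 1/30s budget,
    // so cinematics and gameplay share one consistent cadence.
    constexpr std::chrono::duration<double> frameBudget(1.0 / 30.0);

    auto frameStart = clock::now();
    bool running = true;  // flipped by input/platform code in a real game
    while (running) {
        update(frameBudget.count());
        render();

        // Sleep off whatever is left of the 30fps budget. If the frame
        // ran long, skip the sleep and start the next frame immediately.
        auto elapsed = clock::now() - frameStart;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);
        frameStart = clock::now();
    }
    return 0;
}
```

The point of a fixed budget like this is consistency: every frame is paced the same, so the eye never has to readjust between a 30fps cutscene and 60fps gameplay.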