I don't really understand what you're wowing about. I pointed out that the article had nothing to do with what we were saying and was mostly concerned with drawing some inference from framerate to how immersed in a movie you feel. Again, LOL.
The 30fps looks choppy to you? Not to me. Sure, not as fluid as the 60fps, but that's not the same as saying it's choppy. Maybe I just have lower standards than everyone else, I dunno. I'm sure 60fps doesn't look as good as 120fps, but that doesn't make 60fps choppy.
I suppose the principal difference here is blurring. With movies, things don't look choppy or stutter even when an object is moving fast, because the object is blurred within each frame. That gives the motion more continuity, since there's a more fluid transition from one frame to the next. In video games, this clearly does not happen - there are no blurred images, just a sequence of highly detailed, everything-in-focus frames. So a fast-moving object jumps a visibly larger distance from one sharp frame to the next, and the motion looks clear but choppy.
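To put rough numbers on that "jump distance" point: the per-frame hop is just speed divided by framerate. A quick Python sketch (the pixel speed is a made-up number, purely for illustration):

# How far a moving object "jumps" between consecutive rendered frames.
# The speed here is hypothetical - just to show the scale of the hop.
speed_px_per_sec = 1200.0   # assume an object crossing a 1200px screen in one second

for fps in (15, 24, 30, 60, 120):
    jump_px = speed_px_per_sec / fps   # distance covered between two frames
    print(f"{fps:>3} fps -> jumps {jump_px:.1f} px per frame")

At 24fps that's a 50px hop per frame; at 60fps it's 20px. In a movie that 50px gap gets smeared by blur; in a game it's a sharp, visible hop.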
So yeah, that article does point out a good difference between movies and video games - motion blur. Motion blur makes 24fps look a lot more fluid at normal speeds, since clear and distinct objects aren't hopping large distances between frames (like they are in choppy video games). Does motion blur mean we're comparing apples to oranges, though? I think all it means is that you need more fps in a game to keep it from being choppy, so just think of game fps as being on a different scale, which is pretty much what the article says - they claim even 120fps would not be enough. That might be true if the standard is exceeding the human eye's ability to notice choppiness in a side-by-side comparison (e.g. 3000fps vs. 5000fps), but that seems like an extreme standard for choppiness, generally construed. I mean, if we take them seriously, we'd have to call a game running at 120fps choppy!
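For what it's worth, you can think of movie motion blur as the camera averaging the object's position over the exposure, so each frame records a smear that reaches partway toward the next frame instead of one sharp position. A toy 1-D version (the shutter fraction and speed are my own assumptions, not anything from the article):

# Toy model of film motion blur: with a 180-degree shutter, each 24fps frame
# exposes for about 1/48s, so a moving object is recorded as a smear over
# that interval rather than as one sharp position.  Numbers are made up.
FPS = 24
SHUTTER = 0.5                 # fraction of the frame time the shutter is open
SPEED = 1200.0                # px/sec, same hypothetical speed as above

frame_time = 1.0 / FPS
exposure = frame_time * SHUTTER

for frame in range(3):
    start = SPEED * frame * frame_time     # position when the shutter opens
    end = start + SPEED * exposure         # position when it closes
    print(f"frame {frame}: smeared from {start:.0f}px to {end:.0f}px")

Each frame's smear covers half of the 50px gap to the next frame, which is why the transition reads as continuous instead of as a hop.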
I stated my original point badly. The main thing I was trying to get across is that a framerate in the 20s would not be unplayable. I slipped up and phrased it a little unclearly - that because 24fps in a movie isn't unwatchable, a framerate in the 20s shouldn't be unplayable in a game (which implies I think they're the same thing). Still, I stand by that statement, because movies and games are not so different that movie 24fps counts as perfect while video game 20-29fps counts as completely unplayable. So long as that's not the case, my point still stands. I don't think they'd be the exact same experience - you're right that they would not be - but given that 18fps is enough for no choppiness whatsoever (and I'd still consider a movie "watchable" even if it were a bit choppy), I think a framerate in the 20s for a video game is "playable."

When I look at those little boxes bouncing around in the first link, playing a game at 30fps seems very doable. Also take into consideration that those boxes aren't actually displaying their stated number of frames each second - they drift out of sync after a little bit. If the 30fps box were really rendering 30 frames every second, it would look a bit smoother, and it would travel just as far as the 15fps box.
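My guess at why they drift (I don't know how the demo is actually coded, so this is just an assumption): if each box moves a fixed step per rendered frame, then any frame the browser drops makes that box fall behind real time, and the faster box has more frames to drop. A rough sketch:

# Illustrative only - not the demo's actual code.  If a box moves a fixed
# step per rendered frame, dropped frames make it fall short of where a
# true 30fps box would be after one second.
SPEED = 300.0                 # pixels per second (made-up number)
step_30 = SPEED / 30          # per-frame step for the "30fps" box
step_15 = SPEED / 15          # per-frame step for the "15fps" box

frames_shown_30 = 26          # suppose the browser only manages 26 of 30 frames
frames_shown_15 = 15          # while the 15fps box keeps up fine

print("30fps box travelled:", step_30 * frames_shown_30, "px")   # 260.0 px
print("15fps box travelled:", step_15 * frames_shown_15, "px")   # 300.0 px

If the 30fps box really did render all 30 frames, both boxes would cover the same 300px, which is the point above.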
Sorry if I offended you or anything; I just thought that first link was terrible, and some of the claims the dude made were ridiculous.