[citation][nom]supall[/nom]I wasn't aware that hz and fps were interchangeable.[/citation]
In this particular case they are: a refresh rate of 30Hz means a frame rate of 30fps. There are other times that this is not true. A 240Hz TV, for example, refreshes the screen 240 times every second, yet the picture displayed may still be 24 or 30fps, because that is all the input controller can handle (or 60fps if doing 3D).
What does that higher refresh rate give you if it is only showing 30fps? Quite a few things, it turns out. With a higher refresh rate, manufacturers can put in panels with faster response times, which leads to less ghosting and much quicker transitions between colors and between light and dark scenes. Another benefit is less input lag, as the screen has more opportunities to change what is displayed and can comply with those changes more quickly (though post-processing can still introduce plenty of lag, so there is often a 'game' mode that turns much of it off). There is capacity for more detail on moving objects, as there is less need for motion blur. And the biggest thing is smoother pans and transitions across mediums, as well as more life-like motion in high-action scenes (The Hobbit HFR being an example).

As an example, let's say you have an old 30Hz TV and you are watching a 24fps movie (DVD). Those two frame rates do not divide evenly, so some frames must be duplicated to pad out the time, because the TV is going to display a frame every ~0.033 seconds regardless of the input. This causes things like pans to look smooth for the most part, then seem to stutter or jump at regular intervals. Upgrade to a 60Hz TV and all of your TV shows play back fine, since they are 30fps and each frame is simply shown twice, while 24fps material gets 3:2 pulldown (frames alternately held for three refreshes, then two), leaving smaller, more frequent jumps that most people never notice. But upgrade to a 120Hz TV and you start to understand what smooth is: each 24fps frame is displayed 5 times evenly, each 30fps frame is displayed 4 times, so everything moves predictably, and mixed-format media (anime in particular is notorious for using multiple frame rates for drawn vs CG content) begins to look fluid and less distracting (granted, other shortcomings of the medium become more distracting).
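That cadence math can be sketched in a few lines. This is my own illustration, not anything from a TV spec: `cadence` is a hypothetical helper that assumes the display simply repeats whichever source frame is due at each refresh.

```python
# Sketch: how a source frame rate maps onto a display refresh rate,
# assuming the TV just holds/repeats frames (no motion interpolation).

def cadence(source_fps, display_hz, seconds=1):
    """Return how many consecutive refreshes each source frame is held for."""
    repeats = []
    for refresh in range(display_hz * seconds):
        # Which source frame should be on screen at this refresh?
        frame = refresh * source_fps // display_hz
        if frame >= len(repeats):
            repeats.append(0)
        repeats[frame] += 1
    return repeats

# 24fps on a 60Hz panel: the classic uneven 3:2 pulldown cadence.
print(cadence(24, 60)[:6])    # [3, 2, 3, 2, 3, 2]

# 24fps on a 120Hz panel: every frame held evenly for 5 refreshes.
print(cadence(24, 120)[:4])   # [5, 5, 5, 5]

# 30fps on a 120Hz panel: every frame held evenly for 4 refreshes.
print(cadence(30, 120)[:4])   # [4, 4, 4, 4]

# 24fps on an old 30Hz panel: a lopsided 2,1,1,1 pattern, hence the judder.
print(cadence(24, 30)[:4])    # [2, 1, 1, 1]
```

The uneven repeat counts are exactly why pans hitch on the 30Hz and 60Hz panels, while the 120Hz cases come out as flat, even cadences.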
Bump it up to 240Hz and it smooths out even further (though I suspect that has more to do with better processing algorithms on the higher-end screens than with the raw refresh rate at this point).
At any rate, 30fps is just fine for console gaming. Consoles hook up to TVs, which natively take a 30fps input (60fps for 3D), and would simply not benefit much from a higher frame rate, as the extra frames would just be ignored. It is like telling PC gamers that anything above 60fps is useless (unless you have a 120Hz panel), because anything above 60fps is simply not displayed; the input controller cannot take it. But there is a huge and very noticeable difference between 30 and 60fps, especially in the clarity of moving objects. So consoles are now stuck at 1080p 30fps for the next 6-8 years, while PC gaming is moving to 1080p+ resolutions and 120fps. But that has more to do with TV technology than with console design. The consoles will have 60fps capability (for 3D), but the TVs will simply not accept, or will ignore, such input, so why bother enabling it?