The math is simple really, but feel free to correct me.
If you have a refresh rate of 2 milliseconds (for an LCD, response time is really the proper term), then you can refresh/change the pixels on the screen 1/0.002 = 500 times a second, hence referred to as 500 frames per second, or 500 fps (8 ms = 125 fps, 12 ms = 83 fps, etc.). Sending all the pixel information over to the LCD in order to get it displayed is much faster than that, so effectively all pixels of the LCD are changed/updated in parallel (if they weren't, you'd see all kinds of weird mismatches on parts of the screen; the actual problem with older screens was "ghosting", a sort of afterimage from pixels not refreshing fast enough to keep up with the game's framerate).
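Just to make that arithmetic explicit, here's a quick Python sketch (the function name and the 1000-over-response-time formula are simply my own illustration of the calculation above, which assumes response time is the only limit):

    def max_fps(response_time_ms: float) -> float:
        """Theoretical maximum screen updates per second for a given response time in ms."""
        return 1000.0 / response_time_ms

    for rt in (2, 8, 12):
        print(f"{rt} ms response time -> about {max_fps(rt):.0f} fps max")
    # 2 ms  -> 500 fps
    # 8 ms  -> 125 fps
    # 12 ms -> 83 fps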
There used to be a difference between fields and a full frame in analog television, where each field is only half the screen and two fields interlaced make up the full picture (PAL, NTSC). This is not the case with monitors (CRTs or LCDs): a frame is a full screen.
Now that I am at it, there is a common misconception that your eyes can only absorb about 25 to 30 fps. This is utter nonsense. Two things are being confused here: the framerate required to give the human visual system the impression of fluid motion on the screen (which is indeed around 25 to 30), and the ability of the human visual system to detect changes on the screen within a certain timeframe. I've read somewhere on the web that jet fighter pilots are specifically tested for this, and there are known cases of people with a detection rate over 300 fps (meaning they detect changes between two frames when the screen is refreshed more than 300 times a second).
So if some gamer is complaining about getting shot or missing shots because of his 100 fps, there is a (slim) possibility that he's not just bragging. Needless to say, 2 ms LCDs will put an end to that debate from the LCD side of the equation. Then framerates will really be determined only by the CPU and the GPU.