OK folks... don't hate me or make fun of me for this question... I'm just a network admin and VB.NET coder, so I know almost zero about gaming...
Here goes...
My understanding of the "refresh rate" on my monitor is that it is essentially the number of times per second the image gets repainted. For example, my old CRTs refresh at 85-120 Hz (85 to 120 times per second), while my LCDs are usually 60 Hz (60 times per second), with a few LCDs doing either 72 Hz (for 3:3 DVD pulldown) or 75 Hz.
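Since I think better in code than in gaming terms, here's my rough mental model as a little Python sketch (just my own back-of-the-envelope, using the rates I listed above):

```python
# My mental model of "refresh rate": repaints per second -> time between repaints.
for refresh_rate_hz in (60, 72, 75, 85, 120):
    interval_ms = 1000 / refresh_rate_hz  # milliseconds between screen repaints
    print(f"{refresh_rate_hz} Hz -> one repaint every {interval_ms:.1f} ms")
```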
So here is my question....
When a video card for games does, say, 200 frames per second, what good does it do me?
I can see why you would not want your frame rate to drop below the refresh rate of your monitor, but when frame rates exceed the refresh rate, what good does it do you?
If your monitor only refreshes 60 times a second, what benefit do you get from, say, 180 fps from the video card?
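To put my own question in concrete numbers, here's a quick Python sketch of the arithmetic I'm picturing (just my assumptions, using the 60 Hz / 180 fps example above; I don't actually know what the card or driver does with the extra frames):

```python
# Comparing the card's output rate to the monitor's refresh rate,
# using the example numbers from the question (60 Hz panel, 180 fps card).
refresh_rate_hz = 60
render_fps = 180

frame_time_ms = 1000 / render_fps             # ~5.6 ms for the card to finish a frame
refresh_interval_ms = 1000 / refresh_rate_hz  # ~16.7 ms between screen repaints
frames_per_refresh = render_fps / refresh_rate_hz

print(f"Card finishes a frame every {frame_time_ms:.1f} ms")
print(f"Screen only repaints every {refresh_interval_ms:.1f} ms")
print(f"So about {frames_per_refresh:.0f} frames get rendered for every 1 that can actually be displayed")
```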