Guest
The way I understand it, the refresh rate of your monitor (80 Hz or whatever) is the number of times an actual picture is sent to your screen per second. So if your monitor is set to 80 Hz, it is literally updated 80 times per second. Now, let's say you have a great rig and can get 120 frames per second out of your video card in your game of choice. Aren't the extra 40 frames per second wasted, since your monitor can only show 80 images per second? Wouldn't it be best to try to match your frames per second to your monitor's refresh rate?
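The arithmetic in the question can be sketched like this (an illustrative example, not from the original post; the function name and the assumption of a perfectly steady render rate are mine). At most one rendered frame can appear per monitor refresh, so anything above the refresh rate is never displayed:

```python
def frames_shown(fps_render, refresh_hz, seconds=1):
    """How many rendered frames the monitor can actually display.

    Assumes a steady render rate and that each refresh shows at most
    one distinct rendered frame.
    """
    rendered = fps_render * seconds
    shown = min(fps_render, refresh_hz) * seconds  # capped by refresh rate
    wasted = rendered - shown                      # frames never displayed
    return rendered, shown, wasted

rendered, shown, wasted = frames_shown(120, 80)
print(rendered, shown, wasted)  # 120 rendered, 80 shown, 40 wasted
```

With the numbers from the question (120 fps into an 80 Hz monitor), 40 of every 120 frames are indeed never put on screen.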
Thanks,
Mosh