Guest
The way I understand it, the refresh rate of your monitor (80 Hz or whatever) is the number of times an actual picture is sent to your screen per second. So if your monitor is set to 80 Hz, it is literally updated 80 times per second. Now, let's say you have a great rig and can get 120 frames per second out of your video card in your game of choice. Aren't the extra 40 frames per second wasted, since your monitor is only able to show 80 images per second? Wouldn't it be best to try to match your frames per second to your monitor's refresh rate?

Thanks,
Mosh
 

Arrow
Yes, it's basically "wasted." Monitors simply can't display the high frame rates today's video cards put out, so any frames beyond the refresh rate never make it to the screen.
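
Here's a rough Python sketch of why that is. The numbers are just assumed for illustration (an 80 Hz monitor and a card rendering a steady 120 fps), and it's a simplified model, not how any real driver works: at each refresh the monitor shows whatever frame finished most recently, so frames completed between refreshes are never displayed.

```python
# Simplified model: the monitor samples the most recently completed frame
# at each refresh, so frames finished between refreshes are never shown.

REFRESH_HZ = 80    # assumed monitor refresh rate
RENDER_FPS = 120   # assumed (steady) GPU frame rate

def displayed_frames(refresh_hz, render_fps, seconds=1.0):
    """Count how many distinct rendered frames actually appear on screen."""
    shown = set()
    refreshes = int(refresh_hz * seconds)
    for r in range(1, refreshes + 1):
        refresh_time = r / refresh_hz
        # index of the last frame the card finished before this refresh
        last_frame = int(refresh_time * render_fps)
        shown.add(last_frame)
    return len(shown)

rendered = int(RENDER_FPS * 1.0)
shown = displayed_frames(REFRESH_HZ, RENDER_FPS)
print(f"rendered: {rendered}, displayed: {shown}, never shown: {rendered - shown}")
# With 120 fps going into an 80 Hz monitor, only 80 of the 120 frames
# rendered each second ever reach the screen; the other ~40 are discarded.
```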

Rob
Please visit http://www.ncix.com/shop/index.cfm?affiliateid=319048
 
Guest
I should also mention that a card that can push higher fps than your monitor's refresh rate also gives you more headroom for when the game gets more demanding in some areas, and it will last longer before you have to upgrade your video card again.
 

Arrow
True, thanks.

Rob
Please visit http://www.ncix.com/shop/index.cfm?affiliateid=319048