I don't know if this is the right place to post, but since I use Windows 2000...
I was playing MOH-AA all week and it always used 60 Hz as its refresh rate, just like Max Payne. But Undying, for example, used 100 Hz and I had no eyestrain. I know these are different games and run on different APIs (OpenGL vs. DirectX), but I was wondering: why can't I play at a decent refresh rate when both my monitor and graphics card support it? (GeForce2 MX400 on a Samsung SyncMaster 750s 17", 1280x1024@60Hz.) I switched my desktop to 800x600x32 (the same mode the games use; my desktop normally runs at 1024x768x32@85Hz) and set that mode to always use 100 Hz, but it didn't help in the games...
