Hey guys.
My questions have always been hard to answer, but I think this one will be quite simple for some of you.
Can I actually notice the difference between 60 FPS and, let's say, 100 FPS on a 60 Hz monitor?
I know that the monitor's refresh rate means it can't refresh itself more than 60 times a second, so it can't show more than 60 frames per second. However, some people say they have noticed a difference, and I couldn't find a specific answer to what I have in mind. All the answers boiled down to the same point: I will notice more tearing in games as the gap between the refresh rate and the actual FPS the card draws gets bigger.
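Just to make the timing concrete for myself, here's a quick back-of-the-envelope sketch (my own toy numbers; it assumes vsync is off and the render and refresh clocks are free-running) of why frames above 60 FPS show up as tear lines on a 60 Hz panel: new frames finish partway through a scanout, and the switch to the newer frame mid-screen is the tear.

```python
REFRESH_HZ = 60
FPS = 100

refresh_ms = 1000.0 / REFRESH_HZ          # ~16.7 ms per screen refresh (scanout)
frame_ms = 1000.0 / FPS                   # 10 ms per rendered frame
frame_times = [n * frame_ms for n in range(1, 60)]  # first ~0.6 s of rendered frames

for refresh in range(6):
    start, end = refresh * refresh_ms, (refresh + 1) * refresh_ms
    # every frame that finishes during this scanout produces a tear line
    tears = [f"{t - start:4.1f} ms" for t in frame_times if start < t < end]
    print(f"refresh {refresh}: tear line(s) at {tears or ['none']} into the scanout")
```

So at 100 FPS there's at least one frame change inside almost every 16.7 ms refresh, which matches what people told me about more tearing when FPS exceeds the refresh rate.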
But theoretically, if I were to use VirtuMVP to get rid of the tearing and let my card push as many FPS as it can on my 60 Hz monitor, would I actually notice a difference, or would it be subtle or non-existent?
Thanks and Happy Holidays.