Question Is 120FPS much better than 60FPS in games? Is it worth it?


rscheetah30

Dignified
Nvidia CP has an FPS limiter built in, but I'm not sure whether the Win 7 driver version includes it. If it does, it'll be in Manage 3D Settings. RivaTuner and MSI Afterburner can limit FPS too if not.

Which games don't run on 10? I don't have issues like that with a variety of old games. Maybe you need to install the DX runtimes. https://www.microsoft.com/en-au/download/details.aspx?id=35
I haven't found an option to limit FPS in either MSI Afterburner or Nvidia CP. In Nvidia CP you can only turn VSync on or off. Windows 10 and 11's versions of Nvidia CP probably have that option, but I haven't found a way yet of making my games work properly on either.
 

boju

Titan
Ambassador
What about Steam games? Hasn't Steam stopped Win7 support?

What exactly happens when you try to run your games in Win10? List the games you're having trouble with. This'll probably require its own thread so as not to derail this one. Up to you.


Do a manual search: fill in the input fields for Win10 or 11 and your graphics card. Grab the latest Game Ready driver, then run the installer as administrator (right mouse click) and do the custom clean install option.

The link text is in another language, but it is the US site.
 
And on tube TVs, Hz wasn't really an issue till slow liquid crystal became the display.

A screen update is a screen update is a screen update.

I'm just pointing out that most of the FPS-bro stuff is in people's heads.

Our eyes don't see frames or scan lines; in fact, most of what you think you see is actually your brain filling in details based on what was previously seen.
 

boju

Titan
Ambassador
A screen update is a screen update is a screen update.

I'm just pointing out that most of the FPS-bro stuff is in people's heads.

Our eyes don't see frames or scan lines; in fact, most of what you think you see is actually your brain filling in details based on what was previously seen.

You mean 1000fps, man; life or death situation for my pro aim. I know, those CS types of guys are excessive. But they are the ones playing day in and night out, so they'd have their reasons.

Motion is enhanced, though, as you get above 85fps. L4D for example: playing at 144fps is a totally different experience compared to 60; it's just a lot smoother. There's also more to playing games than just your eyesight; peripherals are more on point. Probably why CS guys are the way they are.
 
Motion is enhanced, though, as you get above 85fps. L4D for example: playing at 144fps is a totally different experience compared to 60; it's just a lot smoother. There's also more to playing games than just your eyesight; peripherals are more on point. Probably why CS guys are the way they are.

Again, we humans do not see in FPS. What most people are thinking of is not the refresh rate but input lag: when they move a mouse and it takes 200ms to register as a result of how poorly a game handles the input buffer.

At 60Hz your screen refresh time is 16.6 milliseconds; congrats, you're already below the brain's ability to distinguish individual signals. At 84Hz you are at 11.9ms, at 100Hz we're getting 10ms, then at 144Hz it starts getting really silly at 6.9ms. The appearance of fluid motion happens at around 41~50ms.
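Those per-refresh times are just 1000 divided by the refresh rate. A throwaway Python snippet (mine, not part of the original post) makes the arithmetic explicit:

```python
# Refresh interval in milliseconds for a given refresh rate in Hz.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (30, 60, 84, 100, 144):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per refresh")
# 30 Hz -> 33.3 ms, 60 Hz -> 16.7 ms (the post truncates to 16.6),
# 84 Hz -> 11.9 ms, 100 Hz -> 10.0 ms, 144 Hz -> 6.9 ms
```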

Now there are two things that happen that can screw with those times: double buffering and triple buffering. With double buffering the game engine is not rendering to the screen; it's rendering to the back buffer while the front buffer is being displayed. When rendering is finished the buffers are switched, and if this action is synced with refresh rates it causes the visual delay to be doubled. 60Hz goes from 16.6ms to 33.2ms, 30Hz goes from 33.4ms to 66.8ms, which is why we state that 30FPS is "unplayable". The motion is still fluid as the picture is changing at 33.4ms intervals, but the expected-behavior vs observed-behavior delay has risen above human-noticeable levels. Triple buffering makes this even worse: 60fps is now 49.8ms out of sync, and to exacerbate matters triple buffering frequently gets turned on whenever vsync is turned on. 120Hz with triple buffering would have 25ms of response time, making it worse than 60Hz without it.

So while double buffering is probably something we want to keep, triple buffering is something everyone should avoid at all costs.
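To put numbers on the buffering argument above, here's a small sketch (my own, under the post's assumption that each buffered frame in flight adds one full refresh interval when swaps are synced to vsync):

```python
# Input-to-display delay when buffer swaps are synced to refresh:
# each buffered frame in flight adds one refresh interval.
def buffered_delay_ms(hz: float, buffers: int) -> float:
    return buffers * 1000.0 / hz

print(round(buffered_delay_ms(60, 2), 1))   # double buffered @ 60 Hz  -> 33.3
print(round(buffered_delay_ms(60, 3), 1))   # triple buffered @ 60 Hz  -> 50.0
print(round(buffered_delay_ms(120, 3), 1))  # triple buffered @ 120 Hz -> 25.0
```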

About human eyes: color vs black-and-white vision works differently. Color is far less sensitive to intensity shifts and is more geared towards object tracking and recognition; it's primarily in the central cone of our vision, not the edges. Black-and-white, OTOH, has really high sensitivity to intensity shifts and is concentrated around the edges of our vision. The result is that we are more sensitive to a part of our vision suddenly shifting intensities than to that same part gradually shifting. Showing a black screen that has a random white pixel appear on it is something we can detect at high speed; this is the source of that silly "humans can see hundreds of FPS" FPS-bro science that gets passed around. Same test, but now we do a jungle vista with a single green pixel having its RGB value shifted 10 points higher for one frame: nobody is going to notice that regardless of the FPS, as our brain just discards that information entirely.

Edit:

One more thing I forgot to mention: delta response, which is the amount of change in response times from one set of frames to the next. So let's say you're at 60Hz double buffered for a 33.3ms response time. That is below the threshold for perceived smooth motion, but not by much. Now during a motion the input ends up missing the render buffer and has to be delayed until the next switch, causing one frame's worth of delay to be added to the response time. From the user's PoV the motion feels like 33.3 - 33.3 - 33.3 - 49.9 - 33.3 - 33.3 and so forth. That spike in the middle will cause our center motion-tracking vision to feel something is "off". We are still getting 60FPS, but the input-to-display response time got delayed and we will notice the change in pattern. The higher the refresh rate, the smaller the potential delta response can be. At 144Hz double buffered it would look like 13.9 - 13.9 - 13.9 - 20.8 - 13.9 - 13.9, and so forth.
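The delta-response pattern is easy to reproduce with the same arithmetic; a quick sketch (mine, with the missed-window position as an arbitrary assumption):

```python
# Frame-to-frame response times, double buffered, when one input
# misses the swap window and waits one extra refresh.
def response_pattern(hz: float, missed_index: int, n: int = 6):
    base = 2 * 1000.0 / hz      # double-buffered response time
    extra = 1000.0 / hz         # one missed refresh interval
    return [round(base + (extra if i == missed_index else 0.0), 1)
            for i in range(n)]

print(response_pattern(60, 3))   # [33.3, 33.3, 33.3, 50.0, 33.3, 33.3]
print(response_pattern(144, 3))  # [13.9, 13.9, 13.9, 20.8, 13.9, 13.9]
```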
 

rscheetah30

Dignified
So while double buffering is probably something we want to keep, triple buffering is something everyone should avoid at all costs.
That's interesting. Due to temperature issues, I use vsync with triple buffering on, because the Nvidia control panel says "triple buffering improves performance when vsync is on". The only game I've noticed being quite laggy right now is Crysis 2, the only FPS game I really like. This lag, apparently, only affects mouse input, and only in FPS games. I also play ETS2 a lot with the mouse and so far I haven't noticed any mouse input lag in it.

I have a couple of racing simulators which I also play with my mouse and vsync on, and I haven't really noticed mouse input lag in them. It seems to me that only FPS games are really affected by mouse input lag at lower fps.

I will try using vsync coupled with double buffering instead of triple buffering and see if it improves mouse input lag.

PS: At least on Win7, Nvidia CP only allows triple buffering to be either on or off. I am turning it off and will see if it improves mouse input lag in Crysis 2.
 
The only game I've noticed being quite laggy right now is Crysis 2, the only FPS game I really like. This lag, apparently, only affects mouse input, and only in FPS games.
When I read that it makes me scratch my head. It must be your settings. If there was one game that surprised me by playing like butter, it's Crysis 2, original and Remastered, on my old GTX 970 and/or GTX 690. Your GTX 1060 is on par with those old cards if not better.

IDK, maybe you have more eye candy turned on in the Nvidia control panel and/or the game's control panel. The game looks beautiful even with lower demands.
 
That's interesting. Due to temperature issues, I use vsync with triple buffering on, because the Nvidia control panel says "triple buffering improves performance when vsync is on". The only game I've noticed being quite laggy right now is Crysis 2, the only FPS game I really like. This lag, apparently, only affects mouse input, and only in FPS games. I also play ETS2 a lot with the mouse and so far I haven't noticed any mouse input lag in it.

I have a couple of racing simulators which I also play with my mouse and vsync on, and I haven't really noticed mouse input lag in them. It seems to me that only FPS games are really affected by mouse input lag at lower fps.

I will try using vsync coupled with double buffering instead of triple buffering and see if it improves mouse input lag.

PS: At least on Win7, Nvidia CP only allows triple buffering to be either on or off. I am turning it off and will see if it improves mouse input lag in Crysis 2.

By "performance" they mean it makes the image smoother, which it does in theory, it just introduces an entire new refresh cycle of input lag.

Input -> Processing -> Render to Back Buffer -> Switch to Front Buffer (Screen displays)
vs
Input -> Processing -> Render to 2nd Back Buffer -> Switch to 1st Back Buffer -> Switch to Front Buffer (Screen displays)

All rendering is going to be double buffered nowadays; you need to go back to DOS-era stuff to find non-double-buffered games.

My comment was towards those folks who insist on "FPS ALL THE THINGS" and that high refresh rates somehow make everything "smoother". I described what they were actually experiencing.
 

rscheetah30

Dignified
When I read that it makes me scratch my head. It must be your settings. If there was one game that surprised me by playing like butter, it's Crysis 2, original and Remastered, on my old GTX 970 and/or GTX 690. Your GTX 1060 is on par with those old cards if not better.

IDK, maybe you have more eye candy turned on in the Nvidia control panel and/or the game's control panel. The game looks beautiful even with lower demands.
I have all Crysis 2 graphical settings on Ultra, and in Nvidia CP it's mostly set for graphics quality instead of performance.
 
Deleted member 2969713

Guest
My secondary screen is still 60hz and even just moving the mouse in windows is obviously different between primary (144hz) and secondary.
Higher frame rates/display Hz will be most noticeable with small, fast-moving objects, and a mouse cursor is a perfect example. If anything, you're more likely to notice a difference with mouse movement than with any game.

I haven't found an option to limit FPS in either MSI Afterburner or Nvidia CP. In Nvidia CP you can only turn VSync on or off. Windows 10 and 11's versions of Nvidia CP probably have that option, but I haven't found a way yet of making my games work properly on either.
NVIDIA's control panel looks like it hasn't been redesigned in 20 years, so I would be very surprised if the Windows 7 version did not have a frame rate limiter. Under "Manage 3D settings," under "Global Settings," is there no "Max Frame Rate" option?

I bought a gaming monitor when my 60Hz one died. I can't say that I have noticed a difference; perhaps I am too old now.
I like the idea of having a high refresh rate monitor, but I'm worried that if I eventually got one I'd not even notice much of a difference in games, like your experience.

At 60hz your screen refresh time is 16.6 milliseconds, congrats your already below the brains ability to distinguish individual signals. At 84hz you are at 11.9ms, at 100 we're getting 10ms then 144hz it starts getting really silly at 6.9ms. The appearance of fluid motion happens at around 41~50ms.
Huh? I can very clearly see the mouse cursor skipping as it moves on my 75 Hz monitor. Meaning I'm perceiving individual signals above 60 Hz. And the fact that the skipping is very noticeable means I'd likely be able to perceive the skipping at vastly higher frame rates, too. The appearance of fluid motion is very relative to the speed of movement of any individual object when it comes to frame rate.

I guess by individual signals you're referring to the point where a sequence of images stops being perceived as such and starts being perceived as movement. However, as above, this would seem to depend on the speed of movement. The mouse cursor looks like it's moving on my screen if I move it slowly, but once I start to move it too fast the illusion of movement is cracked, as my brain perceives the individual images of the mouse as it jumps from one location on the screen to the next.
 
Errrrrrrrrrrrrrrrrrrrr... huh? Maybe everyone's brains are different, but I can very clearly see the mouse cursor skipping as it moves on my 75 Hz monitor. Meaning I'm perceiving individual signals above 60 Hz. And the fact that the skipping is very noticeable means I'd likely be able to perceive the skipping at vastly higher frame rates, too. The appearance of fluid motion is very relative to the speed of movement of any individual object when it comes to frame rate.

I guess by individual signals you're referring to the point where a sequence of images stops being perceived as such and starts being perceived as movement. However, as above, this would seem to depend on the speed of movement. The mouse cursor looks like it's moving on my screen if I move it slowly, but once I start to move it too fast the illusion of movement is cracked, as my brain perceives the individual images of the mouse as it jumps from one location on the screen to the next.

You need to read to understand, not read to respond.

Human brains do not process in frames but in motion, contrast and change, and this processing is different based on which part of the eye is doing the receiving. Moving a mouse across the screen is more about how the OS handles it and not the refresh rate; in actuality you're not even seeing the mouse, instead your brain is just filling in what it expects the mouse cursor to do. If you actually want to understand what is going on, then reread the long post; otherwise I'll just lump you in with the FPS-Bro category.
 
Jan 20, 2024
Totally, it changed my life... but... the brain gets used to 30, 60, 120, 360, over 9 thousand!!!... That's why there are people who can't notice anything. Unless you keep tinkering while you play.
 
Deleted member 2969713

Guest
Human brains do not process in frames but in motion, contrast and change, and this processing is different based on which part of the eye is doing the receiving.
I never said they did. I was addressing your claim that 16.6 ms is below the brain's ability to distinguish individual signals.

In my example, the screen displays a new frame 75 times a second. When moving the mouse, for each frame the cursor is in a different place, for 75 different signals a second. Moving the mouse sufficiently fast in a circle, I can perceive each individual location distinctly.

Moving a mouse across the screen is more about how the OS handles it and not the refresh rate
How so? The mouse position is updated in sync with the refresh rate. When moved, it's in a different place with each display refresh.

in actuality you're not even seeing the mouse, instead your brain is just filling in what it expects the mouse cursor to do.
I'm not sure what you're getting at here. I'm certainly seeing the mouse cursor for any definition of "seeing" that is useful.

If you actually want to understand what is going on, then reread the long post; otherwise I'll just lump you in with the FPS-Bro category.
That's fine, though I'm not trying to argue that you can't enjoy video games unless you're playing at 240+ FPS. I think 60 FPS is reasonable and it's all diminishing returns the higher you go.

Also, I apologize if I was rude or too combative.
 

35below0

Respectable
That's fine, though I'm not trying to argue that you can't enjoy video games unless you're playing at 240+ FPS. I think 60 FPS is reasonable and it's all diminishing returns the higher you go.

Also, I apologize if I seemed too combative or argumentative.
I think palladin wrote a very good explanation and it directly answers your question. Even if you disagree, it's worth your time to read the post. Esp. the double buffering and triple buffering side effects.

You definitely notice the mouse cursor skipping, but it isn't solely to do with refresh rates. Also, moving the mouse across the static desktop is far from a gaming situation, but I digress.
 
In my example, the screen displays a new frame 75 times a second. When moving the mouse, for each frame the cursor is in a different place, for 75 different signals a second. Moving the mouse sufficiently fast in a circle, I can perceive each individual location distinctly.

Again, your brain does not see frames. You are not seeing 75 mouse images every second.

This is the problem every FPS Bro has: their understanding of images is digital, but humans are analogue. Every pixel on that screen is sending out photons; those photons are striking the back of the retina and activating photoreceptors that emit a signal down the nerves into the brain, where those signals are put together. That signal is analogue, and each type of cell emits a different signal for color and intensity. During processing much of this information is discarded, as the brain will stitch patterns together and that is what you "see". This is how we can do optical illusions and hidden images, by taking advantage of how the brain chooses what to process and what to discard. Furthermore, the brain will frequently fill in gaps by inserting past patterns into what you see as a shortcut.

To illustrate take that white mouse on a dark background. As you move the mouse across the pixels will go from emitting low intensity photons to high intensity photons, the cells in the eye go from sending low amplitude waves to high amplitude waves. The brain recognizes the significant change in amplitude as signaling "something changed" and focuses on processing that information instead of using previously processed information. Now if it's a dark blue mouse on an almost dark blue background, now the change in intensity of those photons are going to be far less, resulting in the retina cells change in amplitude being smaller and the brain starts frequently discarding that information and you lose track and find it "hard to see". This is something doctors are dealing with now as they have successfully built a bionic eye and restored vision to blind people.

In regards to video gaming, since that was the whole point of this tangent: higher FPS will only help if the frame-to-frame difference is high contrast, otherwise it'll get washed away. The brain doesn't see frames from eyeballs; a continuous stream of information is sent over the optic nerve to be processed. Plus object recognition happens much slower than pattern-change recognition. It's fascinating to study but is very different from how we display information on a screen.

 

boju

Titan
Ambassador
Again, your brain does not see frames. You are not seeing 75 mouse images every second.

That's not what he's saying. I respect your insights, but you're getting ahead of yourself here. He's saying he's seeing a heightened level of motion, not individual frames. Myself as well: when I joked about CS guys, I wasn't talking about being able to distinguish individual frames either; it's the rest of what comes with a higher refresh rate, allowing for faster motion, and it is smoother in my experience. Do you play games, or have you experienced high refresh rates in games? I mean, if you haven't, then what's the point of debating this?
 
That's not what he's saying. I respect your insights, but you're getting ahead of yourself here. He's saying he's seeing a heightened level of motion, not individual frames. Myself as well: when I joked about CS guys, I wasn't talking about being able to distinguish individual frames either; it's the rest of what comes with a higher refresh rate, allowing for faster motion, and it is smoother in my experience. Do you play games, or have you experienced high refresh rates in games? I mean, if you haven't, then what's the point of debating this?

That is what is known as a logical fallacy.

I'm explaining in great detail what is happening when you see stuff, at first I was trying not to go too deep.

If your screen is displaying 144 frames, would you even know if the 138th one had a red apple in it for exactly one frame? No, you wouldn't: not at 120Hz, not at 100Hz and not at 60Hz. It's not until we get down to sub-30Hz that changes of that nature even become noticeable. Did you see a green pixel go from 7F to 8F on the 108th frame, then back again on the 109th? Nope, your brain completely discarded that information and instead filled in with previously processed information. Now if that oscillation starts happening constantly, then eventually the brain will pick up that "something is different" and process it, then store it as a pattern, then repeat that pattern until it notices "something is different" again, long after that oscillation has stopped. Change recognition happens the fastest and is sensitive to intensity of change; object recognition, OTOH, has a large lag to it. The edges of our vision are more sensitive to intensity of change; the center of our vision is more sensitive to motion and object recognition.

Boiling it all down to answer the question of "Is 120FPS much better than 60FPS in games?", the answer is generally no. A 120Hz monitor would allow a wider margin of error for when input misses the render window and has to wait, as well as reducing the delta in response time; that has value and is thus better. The second question, "Is it worth it?", is subjective and up to the user, though 100+Hz screens are pretty cheap nowadays.
 
Deleted member 2969713

Guest
I think palladin wrote a very good explanation and it directly answers your question. Even if you disagree, it's worth your time to read the post. Esp. the double buffering and triple buffering side effects.
I didn't read all of the initial post I responded to because there was no point: I reject its initial premise, which is that greater-than-60Hz frame rates do not increase perceived visual motion smoothness, and that it instead has to do with input lag. I think the post is undoubtedly worthwhile for explaining aspects of input lag.

You definitely notice the mouse cursor skipping, but it isn't solely to do with refresh rates. Also, moving the mouse across the static desktop is far from a gaming situation, but I digress.
I agree about the mouse movement not being comparable to gaming, and said as much in my initial post.

Again, your brain does not see frames. You are not seeing 75 mouse images every second.

This is the problem every FPS Bro has: their understanding of images is digital, but humans are analogue. Every pixel on that screen is sending out photons; those photons are striking the back of the retina and activating photoreceptors that emit a signal down the nerves into the brain, where those signals are put together. That signal is analogue, and each type of cell emits a different signal for color and intensity. During processing much of this information is discarded, as the brain will stitch patterns together and that is what you "see". This is how we can do optical illusions and hidden images, by taking advantage of how the brain chooses what to process and what to discard. Furthermore, the brain will frequently fill in gaps by inserting past patterns into what you see as a shortcut.

To illustrate, take that white mouse on a dark background. As you move the mouse across, the pixels go from emitting low-intensity photons to high-intensity photons, and the cells in the eye go from sending low-amplitude waves to high-amplitude waves. The brain recognizes the significant change in amplitude as signaling "something changed" and focuses on processing that information instead of using previously processed information. Now if it's a dark blue mouse on an almost-dark-blue background, the change in intensity of those photons is going to be far less, resulting in the retina cells' change in amplitude being smaller; the brain starts frequently discarding that information and you lose track and find it "hard to see". This is something doctors are dealing with now, as they have successfully built a bionic eye and restored vision to blind people.

In regards to video gaming, since that was the whole point of this tangent: higher FPS will only help if the frame-to-frame difference is high contrast, otherwise it'll get washed away. The brain doesn't see frames from eyeballs; a continuous stream of information is sent over the optic nerve to be processed. Plus object recognition happens much slower than pattern-change recognition. It's fascinating to study but is very different from how we display information on a screen.
My brain does not process visual information in frames like a camera, but that does not mean it is not able to perceive discrete frames of movement.

I don't disagree with any of this, and it's a well written post. But, to my understanding, none of it precludes higher frame rates improving visual motion perception.

Until I can no longer see the mouse skipping when I move it fast, there is some visual benefit, however small, to higher frame rates and perception of motion.

That's not what he's saying. I respect your insights, but you're getting ahead of yourself here. He's saying he's seeing a heightened level of motion, not individual frames
Well, I was actually saying that I did perceive individual frames once movement was fast enough. But not that my eyes have a frame rate like a camera. The point being that higher frame rates benefit more the faster an object is moving.

If your screen is displaying 144 frames, would you even know if the 138th one had a red apple in it for exactly one frame?
I'm almost certain that you would, if, as you said before, the contrast were high enough. If you were staring at an entirely green screen and the 138th frame had a red apple in it, there's no way it would be missed. You probably wouldn't be able to see it long enough to tell that it was an apple, but you'd perceive the change, a red flash. If I had a 144 Hz screen, I could code a little game that would let me test this (a rough sketch of what I mean is below), but sadly I don't. Mine only goes up to 75 Hz.

Unless you were arguing that 1/144th of a second would not be enough to perceive the change as an apple, in which case I think you're probably right.
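For what it's worth, here's roughly what I have in mind, as a minimal Python/pygame sketch (the 138th frame and 144 Hz are just the numbers from this discussion; note that clock.tick only caps the loop rate and doesn't guarantee the panel actually refreshes that fast):

```python
import pygame

# Single-frame flash test: show green frames, slip in one red frame,
# and see whether you notice the flash.
pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()
FLASH_FRAME = 138   # the one red frame (arbitrary choice)
REFRESH_HZ = 144    # assumes the display really runs at 144 Hz

for frame in range(REFRESH_HZ * 2):        # run for about 2 seconds
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            raise SystemExit
    color = (255, 0, 0) if frame == FLASH_FRAME else (0, 160, 0)
    screen.fill(color)
    pygame.display.flip()
    clock.tick(REFRESH_HZ)                 # cap the loop at ~144 fps

pygame.quit()
```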

Boiling it all down to answer the question of "Is 120FPS much better than 60FPS in games?", the answer is generally no.
I agree. But I don't agree that the only difference in higher frame rates is input lag.
 

boju

Titan
Ambassador
Well, I was actually saying that I did perceive individual frames once movement was fast enough. But not that my eyes have a frame rate like a camera. The point being that higher frame rates benefit more the faster an object is moving.

I have to disagree; that's impossible. Too many frames flashing in an instant to perceive individually. I think the three of us are interpreting each other very differently.

I don't see individual frames, gaming at 144Hz 144fps is just smoother.