Question Is 120FPS much better than 60FPS in games? Is it worth it?


Deleted member 2969713

Guest
I have to disagree; that's impossible. There are too many frames flashing in an instant to perceive individually. I think the three of us are interpreting each other very differently.

I don't see individual frames, gaming at 144Hz 144fps is just smoother.
No, you're right, in most cases (like when gaming) you won't. But when an object is moving fast enough, like a mouse across the desktop, you can see that rather than actually moving, the mouse is simply skipping from one place to the next. That's what I meant by perceiving individual frames. It only happens in specific instances with fast enough movement, with the speed of movement required being lower the lower the frame rate is. Try playing a game at 30 FPS and with all motion blur disabled, and rotating a camera. You'll see that rather than looking like smooth movement, it looks "jumpy."
 
My brain does not process visual information in frames like a camera, but that does not mean it is not able to perceive discrete frames.

It doesn't perceive discrete anything... you're not even reading at this point. I even went as far as to link a scientific study that went into the signaling rates of optical tissues. It gave signaling rates and jitter and described in very scientific detail how the retinal cells responded to various stimuli. Here's a hint: not all cells respond to all stimuli, and none have constant signaling rates; they only signal under certain conditions. The fastest were brisk-transient cells at 171 ± 71 Hz, while brisk-sustained cells were at 104 ± 38 Hz. These cells were responsible for sending only ~6% of all information. The rest was done by local-edge cells at 74 ± 36 Hz and with much higher jitter.

Try to imagine that instead of one or two "screens," your brain has millions, each with a different frequency, color palette and resolution. Your brain then stitches it all together to form a central virtual world of what you are perceiving. It is your brain that "sees," not your eyes, and your brain discards most of the information it receives and focuses just on intense changes or patterns that represent changes.
 

boju

Titan
Ambassador
No, you're right, in most cases (like when gaming) you won't. But when an object is moving fast enough, like a mouse across the desktop, you can see that rather than actually moving, the mouse is simply skipping from one place to the next. That's what I meant by perceiving individual frames. It only happens in specific instances with fast enough movement, with the speed of movement required being lower the lower the frame rate is. Try playing a game at 30 FPS and with all motion blur disabled, and rotating a camera. You'll see that rather than looking like smooth movement, it looks "jumpy."

Yeah, that's motion, you're witnessing faster motion. Mouse movement, scrolling through webpages, games, the display refreshing fast enough that it appears smoother. In that split second of moving your mouse, even super slowly, I couldn't guess how many frames have been processed; could it be in the hundreds? I'm probably wrong.
 
No, you're right, in most cases (like when gaming) you won't. But when an object is moving fast enough, like a mouse across the desktop, you can see that rather than actually moving, the mouse is simply skipping from one place to the next. That's what I meant by perceiving individual frames. It only happens in specific instances with fast enough movement, with the speed of movement required being lower the lower the frame rate is. Try playing a game at 30 FPS and with all motion blur disabled, and rotating a camera. You'll see that rather than looking like smooth movement, it looks "jumpy."

That's not what you think it is. Remember (if you even read it) that I said the brain remembers patterns, likes to discard information and fills in the details from previously detected patterns? Yeah, that just happened: your brain saw the mouse start to move, recognized its motion, then stored that and anticipated the motion continuing. But because the mouse crossed more screen space than expected, what the eyeballs picked up did not match what the brain expected, and it happened multiple times. The brain recognized "something changed" and started to pay attention to the mouse again, causing you to notice it. You are not individually noticing frames, you are noticing a pattern not being followed.

Things do not teleport in real life. If a rabbit starts moving in front of you, your brain is going to track it and then start predicting where it's going to be based on previous movement information.

Again, this has already been solved: motion video is 29.97 FPS for NTSC and 25 FPS for PAL. This is the point at which fluid motion happens and our brains stop being able to sense individual images. That correlates to roughly a 33-40 ms image-to-image change rate. Since movies are not interactive, every frame is evenly paced, every change is pre-generated, every pattern is fixed; there is no interactivity, so no chance for the brain to go "WTF". Interactive media is different, which is why we want at least 60 FPS to handle those "WTF" moments, though more doesn't hurt.
 

Deleted member 2969713

Guest
In that split second of moving your mouse, even super slowly, I couldn't guess how many frames have been processed; could it be in the hundreds? I'm probably wrong.
It depends on your monitor's refresh rate, so it'll only be in the hundreds if your refresh rate is in the hundreds.

That's not what you think it is. Remember (if you even read it) that I said the brain remembers patterns, likes to discard information and fills in the details from previously detected patterns? Yeah, that just happened: your brain saw the mouse start to move, recognized its motion, then stored that and anticipated the motion continuing. But because the mouse crossed more screen space than expected, what the eyeballs picked up did not match what the brain expected, and it happened multiple times. The brain recognized "something changed" and started to pay attention to the mouse again, causing you to notice it. You are not individually noticing frames, you are noticing a pattern not being followed.

Things do not teleport in real life. If a rabbit starts moving in front of you, your brain is going to track it and then start predicting where it's going to be based on previous movement information.

Again, this has already been solved: motion video is 29.97 FPS for NTSC and 25 FPS for PAL. This is the point at which fluid motion happens and our brains stop being able to sense individual images. That correlates to roughly a 33-40 ms image-to-image change rate. Since movies are not interactive, every frame is evenly paced, every change is pre-generated, every pattern is fixed; there is no interactivity, so no chance for the brain to go "WTF". Interactive media is different, which is why we want at least 60 FPS to handle those "WTF" moments, though more doesn't hurt.
I don't know what to tell you. I read your posts, but it's like we're arguing in different languages.

You're arguing about the intricacies of how light is processed by the brain, while simultaneously trying to claim that 30 FPS is sufficient for all motion. Obviously I'm not going to be able to convince you otherwise, for whatever reason. But anyone who wants to try it out for themselves can simply move the mouse on their desktop and see otherwise. And I can prove first-hand that something you said is factually incorrect, which throws your whole thesis into doubt.

If your screen is processing 144 frames, would you even know if the 138th one had a red apple in it for exactly one frame? No, you wouldn't, not at 120 Hz, not at 100 Hz and not at 60 Hz. It's not until we get down to sub-30 Hz that changes of that nature even become noticeable.
I wrote a MonoGame game that displays a blank green screen and once every five seconds throws up an image of an apple into the center of the screen for exactly one frame. On my 75 Hz screen, I was easily able to see the apple and perceive it as an apple, despite it only being visible for 1/75th of a second.
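For anyone who wants to try the same test themselves, a minimal sketch along those lines (the "apple" content name, the five-second interval and the window setup here are placeholders, not my exact code) could look something like this:

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

// Minimal sketch: clear to green every frame, and once every five seconds
// draw an "apple" texture in the center of the screen for exactly one frame.
public class OneFrameFlashGame : Game
{
    private readonly GraphicsDeviceManager _graphics;
    private SpriteBatch _spriteBatch;
    private Texture2D _apple;          // assumed to exist in the Content project as "apple"
    private double _sinceLastFlash;    // seconds since the apple was last shown
    private bool _showAppleThisFrame;

    public OneFrameFlashGame()
    {
        _graphics = new GraphicsDeviceManager(this);
        Content.RootDirectory = "Content";
        // Let the frame rate follow the monitor's refresh rate (vsync on,
        // no fixed 60 Hz timestep), so "one frame" means one refresh.
        IsFixedTimeStep = false;
        _graphics.SynchronizeWithVerticalRetrace = true;
    }

    protected override void LoadContent()
    {
        _spriteBatch = new SpriteBatch(GraphicsDevice);
        _apple = Content.Load<Texture2D>("apple");
    }

    protected override void Update(GameTime gameTime)
    {
        _sinceLastFlash += gameTime.ElapsedGameTime.TotalSeconds;
        _showAppleThisFrame = false;
        if (_sinceLastFlash >= 5.0)
        {
            _showAppleThisFrame = true;   // visible for this single frame only
            _sinceLastFlash = 0.0;
        }
        base.Update(gameTime);
    }

    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.Green);
        if (_showAppleThisFrame)
        {
            _spriteBatch.Begin();
            var position = new Vector2(
                (GraphicsDevice.Viewport.Width - _apple.Width) / 2f,
                (GraphicsDevice.Viewport.Height - _apple.Height) / 2f);
            _spriteBatch.Draw(_apple, position, Color.White);
            _spriteBatch.End();
        }
        base.Draw(gameTime);
    }

    static void Main()
    {
        using var game = new OneFrameFlashGame();
        game.Run();
    }
}
```

With IsFixedTimeStep off and vsync on, one Draw call corresponds to one monitor refresh, so the apple is on screen for a single refresh interval.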
 

Deleted member 2969713

Guest
Anyway, I have nothing to gain from continuing this discussion, so I'm bowing out now before things get too heated.
 
Higher frame rates/display Hz will be most noticeable with small, fast-moving objects, and a mouse cursor is a perfect example. If anything, you're more likely to notice a difference with mouse movement than with any game.
For me the biggest improvement from the high-refresh side was definitely navigation related. The games that I run at >100 FPS are few and far between, as I tend to like fancy graphics, so I try to maintain ~60 FPS lows. Gaming-wise, variable refresh rate has generally had a bigger impact than high refresh. So long as you're able to keep the frame rate high enough to stay out of flicker zones, variable refresh rate is fantastic.
Boiling it all down to answer the question "Is 120 FPS much better than 60 FPS in games?", the answer is generally no. A 120 Hz monitor allows a wider margin of error for when input misses the render window and has to wait, as well as reducing the delta in response time; that has value and is thus better. The second question, "Is it worth it?", is subjective and up to the user, though 100+ Hz screens are pretty cheap nowadays.
The answer is really: it depends on what you're playing and how sensitive to input latency you happen to be. I will say that if one is happy with 60hz then ignorance is bliss as you won't know better first hand.
And everyone seems to forget that consoles were 29.9hz...
Which consoles were those? Consoles all used interlaced output until digital component video connections were a thing so that means ~50/60hz depending on region.
 
Which consoles were those? Consoles all used interlaced output until digital component video connections were a thing so that means ~50/60hz depending on region.

Consoles from before the NES until the past two or three generations used 29.9FPS NTSC signaling. The more recent ones are capable of 60 Hz via HDMI, but most games were locked to 30 FPS until the last few years. Interlacing is not an increase in signaling rate; if anything it's effectively less than progressive scan. Interlacing lets you get a higher resolution with the same bandwidth by only scanning every other line. The first frame scan is lines 1, 3, 5, etc., the second is lines 2, 4, 6, etc. It relied on the fact that the phosphors on TVs and CRTs would continue emitting light for longer than the scan time, though it would often be dimmer.
 
Consoles from before the NES until the past two or three generations used 29.9FPS NTSC signaling. The more recent ones are capable of 60 Hz via HDMI, but most games were locked to 30 FPS until the last few years. Interlacing is not an increase in signaling rate; if anything it's effectively less than progressive scan. Interlacing lets you get a higher resolution with the same bandwidth by only scanning every other line. The first frame scan is lines 1, 3, 5, etc., the second is lines 2, 4, 6, etc. It relied on the fact that the phosphors on TVs and CRTs would continue emitting light for longer than the scan time, though it would often be dimmer.
Except you're completely wrong. Find any evidence supporting your assertion that the NES runs at "29.9FPS".
 
Except you're completely wrong. Find any evidence supporting your assertion that the NES runs at "29.9FPS".


It took me a minute to realize where you were confused with the whole interlaced thing, since that's also used as a monitor mode and in video encoding. Old-school displays used our AC power system for their timing, and thus 60 Hz AC power becomes about 60 Hz of electron gun scanning. NTSC would then write one half-frame per scan, for 30 effective FPS, but at a higher resolution than what was otherwise practical back then. This signaling is what every RF and RCA cable used and is pretty much "standard TV". And while there was component cabling and even SCART, it wasn't until HDTVs with HDMI started appearing that everyone ditched the old analogue signaling standard.
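As a purely illustrative sketch of that odd/even field split (not any console's actual video code), the arithmetic works out like this:

```csharp
using System;
using System.Linq;

// Purely illustrative: split one full picture's scanlines into the two
// interlaced fields described above (odd lines on one scan, even lines on the next).
class InterlaceDemo
{
    static void Main()
    {
        const int linesPerFrame = 480;  // illustrative line count for one full picture
        var field1 = Enumerable.Range(1, linesPerFrame).Where(l => l % 2 == 1).ToArray(); // 1, 3, 5, ...
        var field2 = Enumerable.Range(1, linesPerFrame).Where(l => l % 2 == 0).ToArray(); // 2, 4, 6, ...

        Console.WriteLine($"Field 1: {field1.Length} lines, Field 2: {field2.Length} lines");

        // Two field scans are needed per full picture, so the effective
        // full-frame rate is roughly half the ~60 Hz scan rate.
        const double fieldRateHz = 60.0;
        Console.WriteLine($"~{fieldRateHz} fields/s / 2 fields per frame = ~{fieldRateHz / 2} full frames/s");
    }
}
```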
 

35below0

Respectable
Console games were locked to 30 fps until very recently. Until it became impossible to defend it anymore. It's a surprise it lasted as long as it did.

The NES has nothing to do with this, we're talking PS4 era.

As for the whole perception vs reality thing, even if you know better and your senses are lying to you, your senses are all you've got. Keep one hand in icy cold water and the other in hot water, then put both in room temperature water. One hand will say it feels hot, the other that it's cold. Neither is correct but it doesn't matter so much what the temperature is. What you feel is what matters and this is what the original question is all about:
Is 120fps much better than 60fps (in gaming), or is it an illusion? The answer is that 120fps is superior to 60fps *if* you can notice it, and sometimes you *will* notice it. Sometimes. That may be worth it.

30fps is the *****, for reasons palladin explained. 60fps is vastly superior.
120fps is not such a noticeable jump in quality and comfort. The fact that switching the display from 60 to 144 Hz (I know, I know...) is far more perceptible when using Windows and moving the mouse cursor or scrolling webpages than when playing games at higher frame rates should tell you that there is a limit to how much more comfortable and useful higher frames really are.
Some people may swear by 360 or even higher rates, but at that point we get further and further into illusion territory. If it feels real and it's worth your time and money to play games that way, then a scientific explanation is not going to change anything.

As a sort of general answer that should apply to everybody, a 144 Hz display is nice and not too expensive, but even that improvement over 60 Hz is barely noticeable most of the time in gaming, aside from competitive gaming.
Going further up the refresh rate ladder results in diminishing returns, but if those diminishing returns matter to *you*, they are worth pursuing.
 

CmdrShepard

Prominent
If your screen is processing 144 frames, would you even know if the 138th one had a red apple in it for exactly one frame? No, you wouldn't, not at 120 Hz, not at 100 Hz and not at 60 Hz. It's not until we get down to sub-30 Hz that changes of that nature even become noticeable.
I am sorry but that's nothing more than an assumption on your end.

For example, I can see mains powered LED lightbulb flicker at a PWM frequency much higher than 60Hz which is endemic for cheap lightbulbs with poorly done power supplies / LED drivers.

A lot of people who I ask whether they can see it too answer "no". Then I switch on the 240 FPS slow-motion camera on the iPhone and show it to them, and they can't believe that I can see that high-frequency flicker with the naked eye.

Back when we used CRT monitors I couldn't stand anything below 85Hz refresh rate because of flicker in spite of CRT screen persistence. Monitors and phone OLED screens that used PWM dimming for the backlight were causing me great distress and judging by threads like this I am not alone.

You keep going on about how our brain is composing the image and I do not dispute that -- I for one am well aware that the eye is constantly scanning the object we are looking at and that the optic nerve is even being shut off during rapid eye movements to prevent us from getting nauseous and during that time the brain is filling in the blanks.

That however doesn't mean all of us are incapable of noticing changes in motion fluidity which come from using higher refresh rates.
 
I did some gaming, first with the LG Ultragear software set to reader mode and then another round of Helldivers 2 set to game mode FPS, and it did feel smoother the second time. I have no idea what my FPS was, though I had G-Sync mode on basic, whatever that does. The monitor is 165 Hz, I think.
 

Nyara

Prominent
Our eyes don't see frames or scan lines
Wrong, we do. Light (and thus vision) is literally perceived as light waves enter our eyes; once a whole frequency wave passes through from start to end, we call that a cycle. Hertz are the number of light wave cycles in a second, and the human eye is equipped to perceive around 1000 Hz on average (some more, some less). When changes happen at a faster pace, they flicker out or we only perceive them partially, as there is not enough time to process all the light waves or the full extent of the frequencies.

Anything below this threshold can be, and in practice is, perceived as smoother, to an extent. The brain becomes the main bottleneck, and it can and will ignore received information depending on age, training, genetics, angle (our peripheral vision is usually about twice as sensitive), depth, contrast and patterning.

An average non-teen person will not notice much difference going past 120 Hz in average PC-use situations, at least in their central detailed view. It is also known that diminishing returns set in quickly past 60 Hz.

The cinema industry did detailed vision studies to identify the lowest refresh rate needed to perceive motion as real (anything less feels choppy), landed on ~24 FPS (23.976) and stuck with it. Variable frame rate media adopted 30 FPS to avoid dropping below 24 FPS. This is literally the minimum, chosen for cost-saving reasons.

Input lag is a whole different can of issues, and largely a problem created by LCD displays, budget modern-era trackpads, keyboards and mice, and by wireless. Going from 60 Hz to 120 Hz moves frame time from about 16.7 ms to about 8.3 ms. Considering the average reaction time (visual identification into action) is around 250 ms, and even the most genetically blessed and trained people cannot get below about 100 ms, those ~8 ms gained matter a bit, but not a lot on their own. Cheap LCDs (bad signal response time), older LCDs (all fluorescent-backlit and most early LED ones), wireless or poor-quality wired mice/keyboards, Windows shenanigans and GPU shenanigans can easily add a whopping 100 ms of delay in total, and online gaming adds your ping in ms on top, so things can add up when combined.
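As a rough back-of-the-envelope sketch of those numbers (the individual latency contributions below are illustrative placeholders, not measurements):

```csharp
using System;

// Rough back-of-the-envelope latency sketch; the individual contributions
// in the chain are illustrative placeholders, not measurements.
class LatencyBudget
{
    static double FrameTimeMs(double refreshHz) => 1000.0 / refreshHz;

    static void Main()
    {
        double at60  = FrameTimeMs(60);   // ~16.7 ms per frame
        double at120 = FrameTimeMs(120);  // ~8.3 ms per frame
        Console.WriteLine($"60 Hz:  {at60:F1} ms/frame");
        Console.WriteLine($"120 Hz: {at120:F1} ms/frame");
        Console.WriteLine($"Gain:   {at60 - at120:F1} ms per frame");

        // Hypothetical rest of the chain: display processing + pixel response
        // + input device + network ping, all in milliseconds.
        double otherLagMs = 30 + 10 + 15 + 40;
        Console.WriteLine($"Total at 60 Hz:  {at60 + otherLagMs:F1} ms");
        Console.WriteLine($"Total at 120 Hz: {at120 + otherLagMs:F1} ms");
    }
}
```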

As for 60hz vs 120hz if worth it, kinda personal preference. I have screens of different refresh rates (60, 90, 120, 144), and I'd say that between 60 vs 120, the last one feels "more natural" and "about 30% more smooth" that "suits a bit more nicely", but it really does not matter much unless you become too fixated on the fact (which if you do constantly, like in playing FPS, pick 120hz definitively, then), I prefer visual quality or budget as long as I can keep 60 FPS stable myself.
 

It took me a minute to realize where you were confused with the whole interlaced thing, since that's also used as a monitor mode and in video encoding. Old-school displays used our AC power system for their timing, and thus 60 Hz AC power becomes about 60 Hz of electron gun scanning. NTSC would then write one half-frame per scan, for 30 effective FPS, but at a higher resolution than what was otherwise practical back then. This signaling is what every RF and RCA cable used and is pretty much "standard TV". And while there was component cabling and even SCART, it wasn't until HDTVs with HDMI started appearing that everyone ditched the old analogue signaling standard.
This is all well and good, but it has nothing to do with the fact that consoles have never run at "29.9 FPS" like you're claiming. The older systems always used ~50/60 Hz depending on region, and in the 2D era the vast majority of games ran at 50/60 FPS accordingly. It wasn't until the 3D era, when consoles simply didn't have hardware capable of always running at 60 FPS, that we started getting lower frame rates more often.
 
I read a blog from a console game dev who related a story about a game he was working on that was having performance issues. The story goes that he and his team were under pressure to improve performance to hit a 30 fps target. After doing what they could, they had still failed to reach 30 fps, so they programmed the frame rate counter to show 30 fps so they could show it to the boss.