News: Nvidia Explains Why It Thinks High Frame Rates Matter in Competitive Games

I have a 2080 Ti and still think this is marketing. I chose a low-lag 4K60 IPS over an Alienware 1080p 240Hz TN mainly because no GPU can sustain 240 fps in every game right now. Also, those high refresh rate panels are TN, with inferior picture quality and colors.
I still don't get how there are still no OLED monitors on the PC market more than 12 years after the first OLED TV. At least give us 1440p 120Hz OLED. PC monitors should be more advanced than TVs.

144Hz makes a difference in fast-paced FPS games, but skill still plays a big role. Most people game at 60 fps, yet there are still many noobs. A noob is still a noob even with a 240Hz screen. Those who get better with 240Hz are the pros with fast reaction times.

My next monitor upgrade will be a 4K 120Hz+ OLED, coming from this https://www.rtings.com/monitor/reviews/lg/27uk650-w which is one of the fastest 4K IPS panels I found.
 
This argument some of you have against high framerates is stupid at best. Just because you don't "believe" there's a difference doesn't mean there isn't a difference; there are plenty of people capable of perceiving the difference between 60fps and any other framerate. If YOU don't "want" to buy equipment to support high framerate gaming, that's a "YOU" problem. Arguing that because you're not able to tell the difference, no one else can, is bad logic. Holding on to old, outdated testing based on watching video, without interacting with any kind of interface that physically affects that video, is bad science to support your bad logic.

Yes, there is a point of diminishing returns, and a double-blind, wide-ranging study would be great! Are you going to fund it? Probably not, since you asked for a source, then refused to even review that source because you're too damn lazy to "watch" a video. Flawed testing aside, it still has a point. Some people, obviously not you, benefit from high framerates and can easily perceive the difference, while others either have a hard time perceiving the difference or can't at all.
 
  • Like
Reactions: LB23
This argument some of you have ~~against~~ in favor of high framerates is stupid at best. Just because you don't "believe" there's a difference doesn't mean there isn't a difference,
(claims stated with no scientific, proven basis snipped)


Fixed that for you.

Videos? Yeah, sure. And, I note you entirely avoided even acknowledging the link I posted about the fastest human reaction time record - is it because that doesn't work for the conclusion you want to believe in?

Show me research, with repeatable results.

TL;DR - you made a lot of the same kinds of claims other people have made. Now PROVE those results.

And no, it's NOT lazy to not want to listen to half an hour of people talking about it. People can read far faster than words can be spoken. Not wanting to scan through a video to get data isn't laziness on my part. It's laziness on YOUR part not to provide links to properly documented research to back your claim.
 
  • Like
Reactions: TJ Hooker
At work, so, no. Also, I'm really not at all willing to slog through a half hour of video. Is their analysis written up in detail somewhere such that I (and others) can read it?

If not, if they couldn't be bothered to detail their process in writing, then why should I bother?

"Oh, the YOUTUBE CHANNEL did not write an article. This makes all the facts invalid". Get out of here man!
Also, your reaction-time-record link is invalid for multiple reasons:
  1. Reacting to a single image is not remotely the same as benefiting from more fluid motion and the extra information and increased perceived sharpness that come with it.
  2. It actually supports the thing you are trying to disprove. If you get your image 10 ms sooner, your response comes 10 ms sooner, whatever the human reaction time is.
Reaction time is still just a small part of what makes high refresh rates great.
 
I have a 2080 Ti and still think this is marketing. I chose a low-lag 4K60 IPS over an Alienware 1080p 240Hz TN mainly because no GPU can sustain 240 fps in every game right now. Also, those high refresh rate panels are TN, with inferior picture quality and colors.
I still don't get how there are still no OLED monitors on the PC market more than 12 years after the first OLED TV. At least give us 1440p 120Hz OLED. PC monitors should be more advanced than TVs.

144Hz makes a difference in fast-paced FPS games, but skill still plays a big role. Most people game at 60 fps, yet there are still many noobs. A noob is still a noob even with a 240Hz screen. Those who get better with 240Hz are the pros with fast reaction times.

My next monitor upgrade will be a 4K 120Hz+ OLED, coming from this https://www.rtings.com/monitor/reviews/lg/27uk650-w which is one of the fastest 4K IPS panels I found.

Why not improve skill AND gear? 😉
OLED is not popular for PCs, since it suffers from burn-in. Static elements like the taskbar would burn in really fast. The burn-in problem has gotten less severe over the last year, and you can now get a few OLED monitors :)
 
  • Like
Reactions: Zizo007
"Oh, the YOUTUBE CHANNEL did not write an article. This makes all the facts invalid". Get out of here man!
(nonsense afterward deleted)

The YouTube channel? Linus Tech Tips' YouTube channel, you mean? That is the video we're talking about, right? You ARE aware that they have an actual website, yes? Granted, it's a forum, but if there were something substantial, it would likely be there, no? And perhaps someone who's defending the "higher refresh rates can be seen by the human eye" position would know of it and have read it, if such a post existed.

So, no, I'm not saying that a video is necessarily invalid, but it's not the ideal go-to for something being used as scientific evidence. And, even then, any old article is meaningless in and of itself. What's needed is a write-up of their full testing methodology.

But if there is no write-up, then no, I'm not wasting my time. I'm not going to slog through a lot of chatter because nobody can be bothered to provide solid evidence with a reproducible testing methodology.

I'm not going to be swayed by anecdotal evidence, nor unsubstantiated belief, no matter how intensely that belief is held.
 
Another thing to keep in mind is that display refresh rate is only one factor in total input lag. Going from 120 to 240 Hz may halve your lag from the refresh rate, but relative to total input lag the difference will be much smaller.
https://www.anandtech.com/show/2803/7
This is a 10-year-old article, with the mouse adding 8 ms and the CPU another 20 ms of lag... those days are mostly over.
With a current high-clock CPU, mouse + CPU lag would probably be next to nothing.

Now, the overall difference from everything when running at 240 FPS might be very, very small, but if you are talking about $2-3 million in prize money in Fortnite, for example, then even a 1% difference is enough to make you think about it.
I completely agree that for the average hardcore gamer the difference is not important and more a matter of taste, but if you are after it for the money, any amount of difference adds up.

And here is an article for everyone who cares about the written word:
https://displaylag.com/benq-zowie-xl2546-review-240hz-gaming-monitor/
I measured Overwatch’s input lag using a 1000 FPS camera at both 60hz and 240hz refresh rate on the XL2546. In-game V-Sync was enabled as well for this test, though DyAc was disabled. Using a LED button mapped to strafe, the time between LED flash and on-screen action was measured 10 times, with the average of 10 results being used as our final number. At 1000FPS, 1 frame equals 1 millisecond:

  • Overwatch 60hz Average Input Lag: 76ms
  • Overwatch 240hz Average Input Lag: 24ms
Holy cow, what a difference! Running the game at 240hz results in 52ms of input lag being shaved off when compared to 60hz, which is extremely significant in my opinion.
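If it helps to see how those averages come together, here is a minimal sketch of the math behind that measurement method; the per-trial frame counts below are made-up placeholders, not the reviewer's actual data:

```python
# Hypothetical per-trial frame counts from a 1000 FPS camera: frames between the
# LED lighting up and the strafe appearing on screen (placeholder numbers only,
# NOT the raw data from the XL2546 review).
trials = [22, 25, 23, 26, 24, 23, 25, 24, 22, 26]

CAMERA_FPS = 1000
ms_per_frame = 1000 / CAMERA_FPS          # at 1000 FPS, 1 camera frame = 1 ms

avg_lag_ms = sum(trials) * ms_per_frame / len(trials)
print(f"average input lag: {avg_lag_ms:.0f} ms")  # ~24 ms with these placeholder counts
```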
 
...
Just like "King V" I also would like to see reliable high level research evaluating gaming performance differences over 60fps by experienced high level gamers to see the point of diminishing returns.

I'd like to see a simple competition where high-performing 'champion' gamers are arrayed against mid-tier gamers. The high-performing gamers would be 'gimped' with an array of lower refresh/FPS systems, and the mid-to-low performing gamers given high-performing systems. It would use blind match-ups, with opponents in separate rooms (therefore unable to interact outside the virtual arena), not knowing anything about their systems beforehand, nor even knowing the true purpose of the competition.

Not only would the competition outcomes be informative, but the player reactions (gauged with standardized questionnaires they'd fill out during and after the competitions) would be too.

Of course, such a test would be a bit costly to arrange, and doubtless well beyond most TechTubers' means. But if concerned companies are truly interested in knowing how FPS/refresh rate affects actual competition, they'd find the resources to sponsor one.
 
  • Like
Reactions: Gurg and King_V
This is a 10-year-old article, with the mouse adding 8 ms and the CPU another 20 ms of lag... those days are mostly over.
With a current high-clock CPU, mouse + CPU lag would probably be next to nothing.
Well, from your source: "The fastest console games tend to hover around 60ms of input lag, with many modern engines taking over 100ms before the button is displayed on screen!"

Not sure if they're referring to "modern engines" running on PCs, consoles, or both though.

But if you scroll a bit further down the page in my link, you'll see they have another example with 1 ms mouse time, 2 ms CPU time, etc., for a total of 32 ms response time assuming a 60 Hz display. So in that example, if you were to change your refresh rate to 240 Hz (reducing the lag due to screen refresh by 75%), your overall response time is only reduced by about 40%. Still significant, but not as dramatic as being reduced by 3/4.
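As a rough sanity check on that arithmetic, here's a tiny sketch; treating the display's share as roughly one 60 Hz refresh interval (~16.7 ms) is my own simplifying assumption, not a figure quoted from the article:

```python
# Simple model: total input lag = non-display pipeline lag + one refresh interval.
def total_lag_ms(refresh_hz, other_lag_ms):
    return other_lag_ms + 1000 / refresh_hz

other = 32 - 1000 / 60                      # ~15.3 ms of mouse/CPU/GPU/engine lag in the 32 ms example
lag_60, lag_240 = total_lag_ms(60, other), total_lag_ms(240, other)

print(f"display component cut by {1 - (1000 / 240) / (1000 / 60):.0%}")  # 75%
print(f"total response time cut by {1 - lag_240 / lag_60:.0%}")          # ~39%
```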

I measured Overwatch’s input lag using a 1000 FPS camera at both 60hz and 240hz refresh rate on the XL2546. In-game V-Sync was enabled as well for this test, though DyAc was disabled. Using a LED button mapped to strafe, the time between LED flash and on-screen action was measured 10 times, with the average of 10 results being used as our final number. At 1000FPS, 1 frame equals 1 millisecond:

  • Overwatch 60hz Average Input Lag: 76ms
  • Overwatch 240hz Average Input Lag: 24ms
Holy cow, what a difference! Running the game at 240hz results in 52ms of input lag being shaved off when compared to 60hz, which is extremely significant in my opinion.
I really question the choice to keep Vsync enabled for that test. They're not just comparing 240 Hz vs 60 Hz, they're also comparing 240 FPS vs 60 FPS. Running 240 FPS on a 60 Hz display would still lower average input lag compared to 60 FPS. And anyone concerned about input lag is going to be running with Vsync off anyway.
 
  • Like
Reactions: nikolajj
I really question the choice to keep Vsync enabled for that test. They're not just comparing 240 Hz vs 60 Hz, they're also comparing 240 FPS vs 60 FPS. Running 240 FPS on a 60 Hz display would still lower average input lag compared to 60 FPS. And anyone concerned about input lag is going to be running with Vsync off anyway.
Keeping Vsync on will prevent jitter and screen tearing; both of those can really mess with your targeting.
 
Fair enough; I don't play first-person shooters. I thought I remembered hearing that Vsync off was the preferred option for competitive fast-paced gaming because you want the absolute lowest input lag, but maybe that's not the case. Although enabling Vsync seems like it would increase jitter, if anything, not reduce it.

Regardless, even if you wanted to test with some sort of Vsync on, I'd much rather see a comparison with triple buffering enabled (or Enhanced Sync/Fast Sync, which I believe are AMD's and Nvidia's names for their own flavors of Vsync that implement something similar to triple buffering). That goes especially for cases like this, where the test system can output FPS several times higher than the refresh rate (something of an ideal case for triple buffering), which should reduce input lag significantly.
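To illustrate why that should matter, here is a deliberately simplified sketch of how stale the displayed frame is under each scheme; the model and the numbers are my own rough assumptions, not measurements:

```python
# Very rough model of how "stale" the frame shown at each 60 Hz refresh is,
# i.e. how long ago its input was sampled. Assumed numbers, not measurements.
refresh_hz = 60
render_fps = 240

# Classic double-buffered Vsync: the game is throttled to the refresh rate, so
# the frame being scanned out was started roughly one refresh interval ago.
vsync_staleness_ms = 1000 / refresh_hz              # ~16.7 ms

# Fast Sync / Enhanced Sync style: the game keeps rendering at full speed and
# the newest completed frame is picked at each refresh, so on average the
# displayed frame is only about half a render interval old.
fastsync_staleness_ms = 0.5 * 1000 / render_fps     # ~2.1 ms

print(f"Vsync: ~{vsync_staleness_ms:.1f} ms  |  Fast Sync style: ~{fastsync_staleness_ms:.1f} ms")
```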
 
  • Like
Reactions: nikolajj
The YouTube channel? Linus Tech Tips' YouTube channel, you mean? That is the video we're talking about, right? You ARE aware that they have an actual website, yes? Granted, it's a forum, but if there were something substantial, it would likely be there, no? And perhaps someone who's defending the "higher refresh rates can be seen by the human eye" position would know of it and have read it, if such a post existed.

So, no, I'm not saying that a video is necessarily invalid, but it's not the ideal go-to for something being used as scientific evidence. And, even then, any old article is meaningless in and of itself. What's needed is a write-up of their full testing methodology.

But if there is no write-up, then no, I'm not wasting my time. I'm not going to slog through a lot of chatter because nobody can be bothered to provide solid evidence with a reproducible testing methodology.

I'm not going to be swayed by anecdotal evidence, nor unsubstantiated belief, no matter how intensely that belief is held.

They have a forum, yes, for the viewers to use. They don't do articles themselves.
As a YouTube channel, they earn money based on views. Why on earth would they offer ANY alternative to watching the video?

It is okay if you can't be bothered to look through the evidence. But if you don't, then you can't keep arguing!
 
Maybe you're just not clear on what the scientific process is. Or possibly, wishful thinking means you don't want the clarity - since all you're willing to go by is a single YouTube video as the source of authority.

No thanks - I prefer scientific rigor.
 
Maybe you're just not clear on what the scientific process is. Or possibly, wishful thinking means you don't want the clarity - since all you're willing to go by is a single YouTube video as the source of authority.

No thanks - I prefer scientific rigor.
I know what the scientific process is. And it has nothing to do with what medium you use to present your findings.
It baffles me that you keep criticizing that one source specifically while also refusing to go through it.
 
I think there are a few points here for the skeptics:
1.) Ultra-fast-reflex pro players are going to be more sensitive to latency. Sure, a lot of delusional people will believe themselves to be in this category and throw money at expensive cards and monitors that won't help their game, but c'est la vie.
2.) I can say with certainty that a jump from 60 to 120 fps is noticeable even for non-competitive older players such as myself, so there's nothing wrong with the push for FPS past 60. Being able to see detail in low-light and shadow-covered areas is equally important for reaction time for my eyes, however. I wouldn't want high refresh paired with craptastic color depth, though, so I think HDR will benefit more people than crazy-high FPS.
3.) At some point the law of diminishing returns kicks in even for the most extreme gamers out there. I'm guessing around 160 Hz or so. I just don't see human brains noticing a couple of milliseconds of delay, although I'd be really interested to see someone study it and prove me wrong.

I for one hope gamers start to obsess less over refresh rate and more over color depth and accuracy. I really want an affordable G-Sync HDR QHD monitor to come around.
 
I think there are a few points here for the skeptics:
1.) Ultra-fast-reflex pro players are going to be more sensitive to latency. Sure, a lot of delusional people will believe themselves to be in this category and throw money at expensive cards and monitors that won't help their game, but c'est la vie.
2.) I can say with certainty that a jump from 60 to 120 fps is noticeable even for non-competitive older players such as myself, so there's nothing wrong with the push for FPS past 60. Being able to see detail in low-light and shadow-covered areas is equally important for reaction time for my eyes, however. I wouldn't want high refresh paired with craptastic color depth, though, so I think HDR will benefit more people than crazy-high FPS.
3.) At some point the law of diminishing returns kicks in even for the most extreme gamers out there. I'm guessing around 160 Hz or so. I just don't see human brains noticing a couple of milliseconds of delay, although I'd be really interested to see someone study it and prove me wrong.

I for one hope gamers start to obsess less over refresh rate and more over color depth and accuracy. I really want an affordable G-Sync HDR QHD monitor to come around.
Totally agree!
More frames are better, but above 160 Hz the benefit becomes so small that it makes no sense for most people. The focus should shift to color until it has caught up to refresh rate in terms of diminishing returns. After that, they can improve either one so we can have more for the same price.

4K resolution is also all we need for the foreseeable future.