Input latency is the all-too-frequently missing piece of framegen-enhanced gaming performance analysis

Over a decade on from Inside the Second, it seems the same lessons in GPU performance measurement have to be relearned time and time again. We've known for at least that long that measuring, or even plotting, FPS tells you only a fraction of the story on GPU performance.

Thou shalt plot frametimes, not framerates!
 
Yes, measure the time from a key press or a mouse click to the in-game response showing up on screen. But this isn't only about the rendering pipeline; it's also about the game engine itself (which can definitely make things even worse, beyond just poor GPU/driver lag).
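To make that concrete, here's a minimal Python sketch of the idea, with purely illustrative stage names and numbers (nothing here is measured): end-to-end click-to-photon latency is a sum of engine-side and display-side stages, not just the GPU render time an FPS counter roughly reflects.

```python
# Hypothetical breakdown of click-to-photon latency into pipeline stages.
# Stage names and millisecond values are illustrative, not measured.
stages_ms = {
    "input sampling / OS event": 2.0,   # polling + event delivery
    "game simulation tick":      8.0,   # engine-side: input applied on the next tick
    "render submission queue":   6.0,   # frames buffered ahead of the GPU
    "GPU render":               10.0,   # roughly the part an FPS counter reflects
    "present + display scanout": 7.0,   # compositor, V-sync wait, panel response
}

total = sum(stages_ms.values())
for name, ms in stages_ms.items():
    print(f"{name:28s} {ms:5.1f} ms")
print(f"{'total click-to-photon':28s} {total:5.1f} ms")
# Only one line here is "the GPU"; a slow engine tick or a deep render queue
# can dominate latency even when the framerate looks fine.
```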
 
It's the age-old problem of benchmarks. If sites make charts that only 5% of readers can understand, that's not a great solution. Putting in five charts instead of one doesn't work great either. People (well, most people...) want concise representations of data, not minutia overload. If you only plot frametimes, you still end up with the same problems as only plotting framerates. They're just reciprocals of each other: Frames per second versus seconds per frame. (FPS = 1000 / frametime in ms. Frametime in ms = 1000 / FPS.)
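To make the reciprocal relationship concrete, here's a tiny Python sketch of the two conversions from the parenthetical above (the example values are arbitrary):

```python
def fps_to_frametime_ms(fps: float) -> float:
    """Seconds-per-frame expressed in milliseconds: 1000 / FPS."""
    return 1000.0 / fps

def frametime_ms_to_fps(frametime_ms: float) -> float:
    """Frames-per-second from a per-frame time in ms: 1000 / frametime."""
    return 1000.0 / frametime_ms

# The same frame expressed both ways: a 16.7 ms frame is ~60 FPS,
# and a single 50 ms hitch is "20 FPS" for that one frame.
for ft in (16.7, 33.3, 50.0):
    print(f"{ft:5.1f} ms  ->  {frametime_ms_to_fps(ft):5.1f} FPS")
```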

While we're on the subject, there's potential value in doing logarithmic scales of FPS. Because going from 30 to 60 is a much bigger deal than going from 60 to 120, or from 60 to 90. But again, there's nuance to discuss and more people will likely get confused by logarithmic charts. And what base do you use for these logarithmic charts? Because that can change the appearance of the graphs as well.
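As a quick illustration of the log-scale point (my own sketch, with arbitrary numbers):

```python
import math

pairs = [(30, 60), (60, 90), (60, 120)]
for lo, hi in pairs:
    step_log2 = math.log2(hi) - math.log2(lo)      # distance on a log2 axis
    step_log10 = math.log10(hi) - math.log10(lo)   # same data, base 10
    print(f"{lo:>3} -> {hi:>3} FPS: log2 step = {step_log2:.3f}, "
          f"log10 step = {step_log10:.3f}")
# 30->60 and 60->120 are identical steps on any log axis (a doubling),
# while 60->90 is a smaller step. Changing the base multiplies every step
# by the same constant, so the shape of the chart stays the same even
# though the axis labels look different.
```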

Measuring latency is helpful and good, but it also doesn't tell the full story. It's why all of my reviews for the past couple of years also included a table at the end that showed average FPS, average 1% lows, price, power, and, where reported by FrameView, latency. Like on the 9070 / 9070 XT article, bottom of page 8:
https://www.tomshardware.com/pc-components/gpus/amd-radeon-rx-9070-xt-review/8
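For anyone who wants to reproduce that kind of summary table from a frametime log, here's a rough Python sketch. It assumes you already have a plain list of per-frame times in milliseconds (e.g. exported from FrameView or PresentMon), and it uses one common definition of "1% lows" (the average framerate of the slowest 1% of frames); other outlets compute it slightly differently.

```python
def summarize(frametimes_ms: list[float]) -> dict[str, float]:
    """Average FPS and '1% low' FPS from a list of per-frame times (ms).

    Average FPS uses total frames / total time (not a mean of instantaneous
    FPS values). The 1% low here is the average FPS of the slowest 1% of
    frames; definitions vary between outlets.
    """
    total_ms = sum(frametimes_ms)
    avg_fps = 1000.0 * len(frametimes_ms) / total_ms

    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = 1000.0 * len(worst_1pct) / sum(worst_1pct)

    return {"avg_fps": avg_fps, "1% low fps": low_1pct_fps}

# Illustrative data: mostly ~10 ms frames with a few 40 ms hitches.
sample = [10.0] * 990 + [40.0] * 10
print(summarize(sample))  # high average FPS, much lower 1% lows
```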

(I didn't include that on the 9060 XT review because, frankly, I was no longer a full-time TH editor and didn't want to spend a bunch more time researching and updating prices, generating more charts, etc. 🤷‍♂️ )

It all ends up as somewhat fuzzy math. Some people will want sub-40ms for a "great" experience and are okay with 100 FPS as long as it gets them there. Others will be fine with sub-60ms or even sub-100ms with modest framerates. I will say, as noted in many reviews and articles, that MFG and framegen end up being hard to fully quantify. Alan Wake 2 can feel just fine with a lower base framerate, with MFG smoothing things out, and relatively high latency. Shooters generally need much lower latency to feel decent. Other games can fall in between.

And engines and other code really matter a lot. Unreal Engine 5 seems to have some really questionable stuff going on under the hood, and it's used in a lot of games. I know there was one game I was poking at (A Quiet Place: The Road Ahead) where even a "native" (non-MFG/non-FG, non-upscaled) 40~45 FPS actually felt very sluggish. Like, if I were to just play the game that way without looking at the performance or settings, I would have sworn that framegen was enabled and that the game was rendering at ~20 FPS and doubling that to 40. But it wasn't; the engine and game combination just ended up feeling very sluggish. Enabling framegen actually made the game feel much better! Weird stuff, but that's par for the course with UE5.
 
MFG "might" work on high end, but it shouldn't be used on any low end system.

Just like how DLSS was meant to let lower end systems run games better it was abused and is now expected for a min 60fps on those low systems.

FrameGen already shows us that devs will abuse any tech to cheap out on optimizing games and have "framegen on to get 60fps" as their min requirements.

This will be case with MFG if it gets adopted by both sides. It will make gaming an even more awful experience for many.
 
You don't have to dive into minutia...just give RENDERED frame rates in reviews and don't use FAKE frames in them at all.
 
The problem with "AI Generated Frames" is that in moderation they're great, especially when used with adaptive refresh rates (Freesync/G-Sync), doubly so for lower refresh rate monitors (not as many frames needed to hit it), but in the levels that they push for DLSS4 marketing there's just too many to make sense. They should be used to "fill in the gaps" to make a game smooth, not as a crutch to "make the game playable".
 
I tried the Lossless Scaling FG to 60 fps, just to handle the occasional dips to 45 fps, with a second dGPU on my A750, and it looked really good.

But I was playing Expedition 33 and the variable input lag was killing my ability to dodge and parry.
 
Technically, if you desperately need input latency on the order of 50 ms or lower to hold your own against the game, you already don't have any time to look at the image quality, so it can be freely reduced. If you want to look at something beautiful, you need some time to process it :)
 
The biggest problem with FG (multiplicatively so for MFG) is twofold for me: when it goes wrong it's very bad, and how usable it is can vary on a game-by-game basis.

While I'm still not particularly a fan of upscaling, I do use it periodically to maintain frame rates. In any game I can turn it on and get what I expect from it, with very little deviation. FG, on the other hand, struggles with dynamic situations, and MFG just makes those situations worse. There are also games whose input designs are simply less latency-sensitive, so what wouldn't be acceptable in one game may be in another.

With monitor refresh rates showing no sign of slowing down, FG can be a great technology to match them. I have a 1440p/240 Hz UW display, and not even a 5090 can do that outside of older/lightweight games. Several cards can do 120 fps at that resolution, though, so FG would be a great way to bridge that gap.
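As a back-of-the-envelope sketch of that bridge-the-gap idea (my own arithmetic, not from the post above): pick the smallest frame-generation multiplier that lifts the rendered framerate to roughly the display's refresh rate.

```python
def pick_fg_multiplier(base_fps: float, refresh_hz: float,
                       available=(1, 2, 3, 4)) -> int:
    """Smallest framegen multiplier (1 = off) whose output rate reaches
    the refresh rate, falling back to the largest if none do."""
    for mult in available:
        if base_fps * mult >= refresh_hz:
            return mult
    return available[-1]

# The 1440p/240 Hz example above: ~120 rendered FPS only needs 2x,
# while a 70 FPS base would want 4x (with a much lower rendered rate
# underneath it, which is where latency starts to hurt).
for base in (120, 70, 45):
    m = pick_fg_multiplier(base, 240)
    print(f"{base:>3} FPS base -> {m}x FG -> ~{base * m} FPS output")
```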
 
If you really want to get at the gameplay effects of input latency, I think you should look at the fields of flying qualities and human systems integration (HSI)/human-machine interface (HMI) applied to aviation.

There are some rough, not fully agreed-upon standards there. Regardless of whether we're talking stick-and-rudder inputs to aircraft movement or something plotted on a screen, if a control response is related to safety of flight, some generic maximum lags/minimum refresh rates are:
  1. 50 ms (20 Hz) is good for all; probably required for fighter jets and attack helos
  2. 100 ms (10 Hz) good enough for transport aircraft and some helicopter stuff
  3. 200 ms (5 Hz) questionable, but accepted for some non-critical tasks
  4. 250 ms (4 Hz) a generally agreed upper threshold for anything control related
Some of the most demanding tasks in an aircraft are pointing the nose or tracking a target. I think the gaming analog is tracking targets in FPS games. As lag increases, you increase the chance of seeing pilot-induced oscillations (PIOs). The effect in an FPS would be your reticle oscillating around a target, possibly with increasing error over time.
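Here's a toy Python sketch of that oscillation effect (my own illustration, not anything from the flying-qualities literature): a simple proportional aim-correction loop chasing a fixed target, where the only variable is how stale the feedback is. With little or no delay the error settles; with enough delay, each correction reacts to old information, fights the corrections that came after it, and the error oscillates and grows.

```python
def track_target(delay_ticks: int, gain: float = 0.4, ticks: int = 120) -> float:
    """Aim at a target 100 units away with a proportional correction that
    acts on *stale* error (delay_ticks old). Returns the largest absolute
    error seen in the second half of the run (near 0 means it settled)."""
    target = 100.0
    pos = 0.0
    history = [target - pos] * (delay_ticks + 1)  # buffer of stale error readings
    worst_late_error = 0.0
    for t in range(ticks):
        stale_error = history[0]
        pos += gain * stale_error                 # correction based on old information
        history = history[1:] + [target - pos]
        if t >= ticks // 2:
            worst_late_error = max(worst_late_error, abs(target - pos))
    return worst_late_error

# Each tick is an arbitrary input/display interval; more delay ticks = more lag.
for d in (0, 3, 8):
    print(f"feedback delayed {d} ticks -> worst late error {track_target(d):,.2f}")
# With no delay the error settles to ~0; moderate delay leaves a wobble that
# is slow to die out; large delay makes corrections overshoot each other and
# the error keeps growing -- the pilot-induced-oscillation failure mode.
```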

I'd think the ideal benchmarking would be framerate for a sense of visual smoothness (independent of control concerns), and then total system latency vs. a standard maximum for the type of game. If I were spitballing, I'd go with 50 ms for FPS/driving/flight sims, 100 ms for other fast-paced games (RTS, some adventure), and 200 or 250 ms for others (Civ, turn-based).
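If someone wanted to turn that spitballed rubric into a quick pass/fail check, it might look something like this minimal sketch; the genre buckets and budgets are just the numbers proposed above (using the 250 ms upper end), not an established standard.

```python
# Latency budgets (ms) per game type, taken from the spitballed rubric above.
LATENCY_BUDGET_MS = {
    "fps/driving/flight sim": 50,
    "fast-paced (RTS, some adventure)": 100,
    "slow/turn-based (Civ, etc.)": 250,
}

def rate_latency(measured_ms: float, game_type: str) -> str:
    """Compare a measured end-to-end latency against the genre budget."""
    budget = LATENCY_BUDGET_MS[game_type]
    margin = budget - measured_ms
    verdict = "within budget" if margin >= 0 else "over budget"
    return f"{measured_ms:.0f} ms vs {budget} ms budget: {verdict} ({margin:+.0f} ms)"

# Example: the same 80 ms total system latency reads very differently
# depending on the kind of game it's measured in.
for game_type in LATENCY_BUDGET_MS:
    print(game_type, "->", rate_latency(80.0, game_type))
```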