It's the age-old problem of benchmarks. If sites make charts that only 5% of readers can understand, that's not a great solution. Putting in five charts instead of one doesn't work great either. People (well, most people...) want concise representations of data, not minutiae overload. If you only plot frametimes, you still end up with the same problems as only plotting framerates. They're just reciprocals of each other: frames per second versus seconds per frame. (FPS = 1000 / frametime in ms. Frametime in ms = 1000 / FPS.)
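To make the reciprocal relationship concrete, here's a minimal sketch in Python (the function names are my own, just for illustration):

```python
# FPS and frametime carry the same information; they're reciprocals.
def fps_to_frametime_ms(fps: float) -> float:
    """Milliseconds per frame for a given frames-per-second value."""
    return 1000.0 / fps

def frametime_ms_to_fps(frametime_ms: float) -> float:
    """Frames per second for a given frametime in milliseconds."""
    return 1000.0 / frametime_ms

print(fps_to_frametime_ms(60))        # 60 FPS -> ~16.67 ms per frame
print(frametime_ms_to_fps(8.333))     # ~8.33 ms per frame -> ~120 FPS
```

Converting one way and then back always lands on the same number, which is exactly why a frametime-only chart has the same readability problems as an FPS-only chart.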
While we're on the subject, there's potential value in using logarithmic scales for FPS, because going from 30 to 60 is a much bigger deal than going from 60 to 120, or from 60 to 90. But again, there's nuance to discuss, and more people will likely be confused by logarithmic charts. And what base do you use for these logarithmic charts? That choice can change the appearance of the graphs as well.
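A quick sketch of why a log scale matches intuition here: on a log axis, equal ratios become equal distances, so a 2x jump is the same size no matter where it starts. (Base 2 is just my choice for the example; any base rescales all the distances by the same constant.)

```python
import math

# On a log scale, equal ratios map to equal distances: 30->60 and
# 60->120 are both 2x jumps and span the same distance, while
# 60->90 (a 1.5x jump) is visibly smaller.
for lo, hi in [(30, 60), (60, 120), (60, 90)]:
    print(f"{lo}->{hi}: log2 distance = {math.log2(hi / lo):.3f}")
```

Since changing the base only multiplies every distance by a constant (log_b x = log_2 x / log_2 b), the relative spacing of the bars stays the same; what shifts around is where the tick marks and gridlines land, which is part of what makes the charts look different.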
Measuring latency is helpful and good, but it still doesn't tell the full story. It's why all of my reviews for the past couple of years also included a table at the end that showed average FPS, average 1% lows, price, power, and — where reported by FrameView — latency. Like on the 9070 / 9070 XT article, bottom of page 8:
https://www.tomshardware.com/pc-components/gpus/amd-radeon-rx-9070-xt-review/8
(I didn't include that on the 9060 XT review because, frankly, I was no longer a full-time TH editor and didn't want to spend a bunch more time researching and updating prices, generating more charts, etc. 🤷♂️ )
It all ends up as somewhat fuzzy math. Some people will want sub-40ms for a "great" experience and are okay with 100 FPS as long as it gets them there. Others will be fine with sub-60ms or even sub-100ms with modest framerates. I will say, as noted in many reviews and articles, that MFG and framegen end up being hard to fully quantify. Alan Wake 2 can feel just fine with a lower base framerate, with MFG smoothing things out, and relatively high latency. Shooters generally need much lower latency to feel decent. Other games can fall in between.
And engines and other code matter a lot. Unreal Engine 5 seems to have some really questionable stuff going on under the hood, and it's used in a lot of games. I know there was one game I was poking at (A Quiet Place: The Road Ahead) where even a "native" (non-MFG/non-FG, non-upscaled) 40~45 FPS actually felt very sluggish. Like, if I were to just play the game that way without looking at the performance or settings, I would have sworn that framegen was enabled and that the game was rendering at ~20 FPS and doubling that to 40. But it wasn't; the engine and game combination just ended up feeling very sluggish. Enabling framegen actually made the game feel much better! Weird stuff, but that's par for the course with UE5.