Just a thing to mention about the "latency" debacle, and something I've also mentioned a thousand times before: human reaction times average around 250 ms. The best recorded reaction times are in the 150 ms range (fighter pilots and the like).
Let that sink in for a moment.
After that, consider (as some have mentioned) network latency and server ticks (CS runs at 64 Hz, roughly 15.6 ms per tick, IIRC; see Ticks Explained). Stack all of that up and your local latency, once it's under 30 ms (for inputs alone), becomes completely moot, placebo territory. At that point what starts counting is thinking ahead (strategy, in short), which is something no amount of hardware advantage can buy.
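To put rough numbers on that stack-up, here's a back-of-envelope sketch (the 64 Hz tick rate, the ~250/150 ms reaction times, and the 30 ms input threshold are from the post; everything else is just arithmetic):

```python
# Back-of-envelope latency budget: server tick interval vs. human reaction time.
TICK_RATE_HZ = 64                        # CS tick rate mentioned above
tick_interval_ms = 1000 / TICK_RATE_HZ   # time between server ticks
print(f"Tick interval: {tick_interval_ms:.1f} ms")

avg_reaction_ms = 250    # average human reaction time
best_reaction_ms = 150   # best recorded (fighter pilots and the like)
local_input_ms = 30      # local input-latency threshold from the post

# Local input lag as a fraction of the slowest link in the chain (the human):
print(f"30 ms of input lag is {local_input_ms / avg_reaction_ms:.0%} "
      f"of an average reaction, {local_input_ms / best_reaction_ms:.0%} of the best")
```

Even against the best recorded reaction time, 30 ms is a small slice of the total; against the server tick it's only about two ticks' worth.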
So what is the realistic "FPS" that guarantees you're there? Well, the panel starts to matter more once you're over 120 FPS, which is a frame time of just 8.3 ms per rendered frame; 240 FPS is 4.17 ms. Do you really believe it matters once you're down to single digits of delay per frame?
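The FPS-to-frame-time conversion is just the reciprocal; a quick table (the 120 and 240 FPS figures are the ones from the post, the rest are common monitor refresh rates added for context):

```python
# Frame time in milliseconds for a given frame rate: frame_time_ms = 1000 / fps
for fps in (60, 120, 144, 240, 360):
    print(f"{fps:>3} FPS -> {1000 / fps:5.2f} ms per frame")
```

Past 120 FPS every doubling only shaves a few milliseconds off an already single-digit frame time.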
Well, after all that, the nuance comes down to two things: pace (1% lows) and consistency (0.1% lows). As long as your frame-time variance (1%, or 0.1% if you want to be mega strict) isn't too far from the average, your latency won't suffer. Any spike caused by the GPU will put you well above any network, input, or server-tick delay, so for competitive games, if we all want to be super pedantic, we should never talk about averages and always talk 1% or 0.1% lows. Everything I said before this "nuance" section still applies if you swap averages for 1% or 0.1% lows.
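To make the 1% / 0.1% low point concrete, here's one common way those numbers get computed from a frame-time capture (a sketch; benchmarking tools differ in exact method, this uses the "average of the worst N% of frames" convention, and the capture data is synthetic):

```python
import random

def percent_low_fps(frame_times_ms, pct=1.0):
    """1% (or 0.1%) low FPS: average of the worst pct% of frame
    times (the longest frames), expressed back as FPS."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * pct / 100))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000 / avg_worst_ms

# Synthetic capture: mostly ~4.2 ms frames (~240 FPS) plus occasional stutters.
random.seed(0)
frames = [4.2 + random.random() * 0.4 for _ in range(10_000)]
frames[::500] = [25.0] * len(frames[::500])  # inject 20 spikes of 25 ms

avg_fps = 1000 / (sum(frames) / len(frames))
print(f"Average FPS : {avg_fps:.0f}")
print(f"1% low FPS  : {percent_low_fps(frames, 1.0):.0f}")
print(f"0.1% low FPS: {percent_low_fps(frames, 0.1):.0f}")
```

A handful of 25 ms spikes barely moves the average, but they dominate the lows, which is exactly why the lows, not the average, are what your hands actually feel.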
Regards.