Let's not get ahead of ourselves.
At least not until we get one of these:
"Neither of those examples particularly matches the one I cited."

I know my examples didn't match yours; that was the point. If the improved performance from high fps/refresh only relates to smoothness of motion when you pan/flick your mouse, etc. (as you claim), then logically the examples I described should not have seen an improvement from high fps/refresh. But they did.
It's pretty easy to understand if you just try to visualize it. In the bit of eSports I've watched, it's pretty common for players to flick their mouse around to see what's going on around them. The smoother the pan (i.e. the more frames in it), the easier it is for your eyes to track and for your brain to spot items of concern.
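To put rough numbers on "more frames in the pan", here is a minimal sketch assuming a hypothetical 180-degree flick that takes 150 ms; both figures are made up for illustration and aren't taken from anyone in this thread.

```python
# Hypothetical numbers for illustration only (not from the thread):
# a 180-degree flick of the mouse that takes 150 ms, sampled at several frame rates.
FLICK_DEGREES = 180.0
FLICK_SECONDS = 0.150

for fps in (30, 60, 144, 240):
    frames_in_pan = FLICK_SECONDS * fps                 # frames rendered during the flick
    degrees_per_frame = FLICK_DEGREES / frames_in_pan   # how far the view jumps each frame
    print(f"{fps:>3} fps: ~{frames_in_pan:5.1f} frames in the pan, "
          f"~{degrees_per_frame:5.1f} degrees of rotation per frame")
```

With those made-up figures, the view jumps roughly 40 degrees between frames at 30 fps but only about 5 degrees at 240 fps, which is the "easier for your eyes to track" effect described above.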
I know what jitter is, but it seemed the way you described it was a bit different from the usual definition. I realize now, though, that I may have misinterpreted you.
"Cute, but your conceit is that casual gamers care about framerates as much as serious gamers, and have the same wherewithal and drive to pursue them.

"As I'm sure you know, the reason casual gamers are set apart is that we presume they fit @Phaaze88's description and are simply gaming for a bit of fun. To the extent they have a competitive streak, we presume the games they play are older, or otherwise targeted towards lower-end hardware, or aren't terribly sensitive to lowish framerates.

"Now, what I find interesting about the past few years is that 4k monitors became affordable rather quickly, and have gotten significant uptake by 'normies' with otherwise unremarkable hardware. As a result, even many 'casual' gamers probably won't get away with integrated graphics."

I'm actually not really sure how this relates to what I said.
"If the improved performance from high fps/refresh only relates to smoothness of motion when you pan/flick your mouse, etc. (as you claim), then logically the examples I described should not have seen an improvement from high fps/refresh. But they did."

Because I didn't say it's the "only" consequence, the presence or absence of other benefits is irrelevant to my point.
"Jitter and input lag are linked. Every ms between when a frame is finished being rendered and when it is displayed contributes equally to jitter and input lag."

Not true. Jitter is simply the error between when a sample is expected vs. when it is delivered. If all we're concerned about is smooth motion, then input lag doesn't even enter the equation.
"If you increase the refresh rate/fps and reduce jitter, you will have also reduced input lag."

Two dependent variables can be influenced by the same independent variable without having any dependency relationship between them.
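To make the jitter vs. input-lag distinction above concrete, here is a minimal toy model; it is entirely my own construction, not anything measured for this article. Jitter is taken as the spread of the error between when frames are expected on an even cadence and when they are actually delivered, while input lag is modeled as a separate render-to-display buffering delay, so each can be moved without touching the other.

```python
import statistics

# Toy model (my own illustration, not data from the article or the thread).
# "Jitter" = spread (std dev) of the error between each frame's expected time on
# an even cadence and when it is actually delivered; a constant offset therefore
# counts as latency, not jitter.  "Input lag" is modeled separately as a fixed
# render-to-display buffering delay.  Both are driven by the frame stream, yet
# neither is computed from the other.

def jitter_ms(display_times, fps):
    expected = [(i + 1) / fps for i in range(len(display_times))]
    errors = [d - e for d, e in zip(display_times, expected)]
    return statistics.pstdev(errors) * 1000

def scenario(fps, n, frame_time, display_delay):
    """Display timestamps = cumulative render times + a fixed buffering delay."""
    t, times = 0.0, []
    for i in range(n):
        t += frame_time(i, fps)            # when this frame finished rendering
        times.append(t + display_delay)    # when it actually reached the screen
    return times

steady = lambda i, fps: 1 / fps                              # perfectly paced renderer
lumpy = lambda i, fps: (1.5 if i % 2 == 0 else 0.5) / fps    # same average fps, uneven pacing

cases = [
    ("144 fps, steady, no buffering",   144, scenario(144, 500, steady, 0.0),     0.0),
    ("144 fps, steady, 1 frame buffer", 144, scenario(144, 500, steady, 1 / 144), 1 / 144),
    ("144 fps, uneven pacing",          144, scenario(144, 500, lumpy,  0.0),     0.0),
    (" 60 fps, uneven pacing",           60, scenario(60,  500, lumpy,  0.0),     0.0),
]

for name, fps, times, delay in cases:
    print(f"{name:33s} jitter ~{jitter_ms(times, fps):4.2f} ms,"
          f" buffering lag ~{delay * 1000:4.2f} ms")
```

In this toy model, adding a frame of buffering raises lag by roughly 6.9 ms while adding no jitter, uneven pacing at the same average fps adds jitter while leaving the buffering delay alone, and dropping from 144 to 60 fps makes the same relative unevenness worse; raising the frame rate tends to help both numbers, but they remain distinct quantities.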
"I'm actually not really sure how this relates to what I said."

In the last paragraph, I went off on a tangent (about casual gaming vs. hardware requirements, in the age of cheap 4k displays). Perhaps that's what threw you off.
"The situation being described is incredibly niche. If someone is running an i7-8086K at 4.9 GHz and a GTX 1080, they would not be CPU bottlenecked in the vast majority of cases, even in modern games (except maybe if you're playing at 720p or something silly)."

Right. Which is why it's so ridiculous that this article only tested gaming performance at 1080p... with an RTX 2080 Ti!
Bear in mind that we tested with an Nvidia GeForce RTX 2080 Ti at 1920x1080 to alleviate graphics-imposed bottlenecks. Differences between our test subjects shrink at higher resolutions.
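For what it's worth, the usual reasoning behind pairing a flagship GPU with 1080p can be sketched as below. The frame rates are made-up placeholders, not benchmark results: delivered fps is roughly capped by the slower of what the CPU and GPU can each sustain, and only the GPU side falls as resolution rises.

```python
# Rough sketch of the low-resolution CPU-testing logic, with invented numbers.
# Delivered fps is approximately min(CPU rate, GPU rate); the GPU rate drops as
# resolution rises, while the CPU rate is largely resolution-independent.

cpu_fps = {"CPU A": 180, "CPU B": 150}             # what each CPU could feed, any resolution (hypothetical)
gpu_fps = {"1080p": 200, "1440p": 130, "4K": 70}   # what a flagship-class GPU can render (hypothetical)

for res, gpu_rate in gpu_fps.items():
    print(f"--- {res} ---")
    for cpu, cpu_rate in cpu_fps.items():
        delivered = min(cpu_rate, gpu_rate)        # the slower component sets the pace
        print(f"  {cpu}: ~{delivered} fps")
```

With these placeholder numbers the two CPUs separate at 1080p but produce identical results at 1440p and 4K, which is the "differences shrink at higher resolutions" effect the review describes; whether that CPU-bound scenario matters to a given buyer is exactly what the comment above is disputing.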
I would still rather get the 3700X than the 10600K
Same for me; I would go for the 3700X anytime.
"Pay more for less? That would be an insanely stupid move. Do you even game?"

Haven't you heard?!?! Now that Intel made the dual-core CPU, multithreaded gaming is right around the corner. /s
Yikes! To use the Intel-biased userbenchmark of all things... and to make it worse, I know that CSGO result is wrong, as the Source Engine appears to favor Ryzen.
Benchmarking sites use the CSGO replay feature, which records all the inputs of all the players and lets you render the results afterwards. It's the equivalent of Cinebench or something and has nothing to do with actual gameplay; the same goes for PUBG, LoL, Fortnite and probably many more. Most of the time, if they show videos of it side by side, you can see that they are the exact same, frame for frame.
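As an aside on how those recorded replay runs usually get turned into headline figures, here is a sketch. The input format (a plain text file with one frame time in milliseconds per line, named frametimes.txt) is an assumption for illustration, not the output of any particular capture tool, and the "1% low" below follows one common convention rather than a standard.

```python
# Sketch of reducing a recorded benchmark pass (e.g. a demo/replay playback)
# to the usual headline numbers.  Input: a hypothetical plain-text log with
# one frame time in milliseconds per line.

def summarize(frame_times_ms):
    n = len(frame_times_ms)
    total_s = sum(frame_times_ms) / 1000
    avg_fps = n / total_s
    # One common "1% low" convention: average fps over the slowest 1% of frames.
    worst = sorted(frame_times_ms, reverse=True)[: max(1, n // 100)]
    one_percent_low_fps = 1000 / (sum(worst) / len(worst))
    return avg_fps, one_percent_low_fps

if __name__ == "__main__":
    with open("frametimes.txt") as f:   # hypothetical capture log
        times = [float(line) for line in f if line.strip()]
    avg, low = summarize(times)
    print(f"average: {avg:.1f} fps, 1% low: {low:.1f} fps")
```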