And everyone seems to forget that consoles were ~29.97 Hz (interlaced NTSC)...
The standard recommendation for computers during CRT days was to keep things at 72 Hz or higher, with 85 Hz preferred. 60 Hz resulted in an awful flicker due to the way the electron gun scanned the phosphors. And on tube TVs, Hz wasn't really an issue until slow liquid crystal displays came along.
> I haven't found an option to limit fps in either MSI Afterburner or Nvidia CP. In Nvidia CP you can only turn VSync on or off. Windows 10 and 11's versions of Nvidia CP probably have that option, but I haven't found a way yet of making my games work properly on either.

Nvidia CP has an fps limiter built in, but I'm not sure whether the Windows 7 driver version includes it. If it does, it'll be under Manage 3D Settings. RivaTuner and MSI Afterburner can limit fps too if it doesn't.
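For what it's worth, whichever tool provides the cap, an fps limiter is conceptually simple: pad each frame out to a fixed time budget. A toy sketch in Python (`render_frame` is just a stand-in for the game's real work, and real limiters use higher-precision waits than `time.sleep`):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def render_frame():
    pass  # stand-in for the game's actual simulation/render work

def run_capped(frames):
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()
        # Sleep off whatever remains of this frame's time budget.
        remaining = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if remaining > 0:
            time.sleep(remaining)
    return time.perf_counter() - start

elapsed = run_capped(30)  # 30 frames should take roughly 0.5 s at a 60 fps cap
```

The point of capping like this is that the GPU stops racing ahead of the display, which is also why limiters tend to lower temperatures and, often, input lag.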
Which games don't run on 10? I don't have issues like that with a variety of old games. Maybe you need to install the DX runtimes. https://www.microsoft.com/en-au/download/details.aspx?id=35
> And on tube TVs, Hz wasn't really an issue until slow liquid crystal displays came along.
A screen update is a screen update is a screen update.
I'm just pointing out that most of the FPS-bro stuff is in people's heads.
Our eyes don't see frames or scan lines; in fact, most of what you think you see is actually your brain filling in details based on what was previously seen.
Motion is noticeably smoother as you get above 85 fps, though. L4D, for example, at 144 fps is a totally different experience compared to 60; it's just a lot smoother. There's also more to playing games with your eyesight: peripheral vision is more on point. Probably why CS guys are the way they are.
> So while double buffering is probably something we want to keep, triple buffering is something everyone should avoid at all costs.

That's interesting. Due to temperature issues, I use vsync with triple buffering on, because the Nvidia control panel says "triple buffering improves performance when vsync is on". The only game I've noticed is quite laggy right now is Crysis 2, the only FPS game I really like. This lag, apparently, only affects mouse input, and only in FPS games. I also play ETS2 a lot with the mouse and so far I haven't noticed any mouse input lag in it.

I have a couple of racing simulators which I also play with my mouse and vsync on, and I haven't really noticed mouse input lag in them either. It seems to me that only FPS games are really affected by mouse input lag at lower fps.

I will try using vsync coupled with double buffering instead of triple buffering and see if it improves mouse input lag.

PS: At least on Win7, Nvidia CP only allows triple buffering to be either on or off. I am turning it off and will see if it improves mouse input lag in Crysis 2.

> The only game I've noticed is quite laggy right now is Crysis 2, the only FPS game I really like. This lag, apparently, only affects mouse input, and only in FPS games.

When I read that, it makes me scratch my head. It must be your settings. If there was any game that surprised me by playing like butter, it's Crysis 2, both the original and Remastered, on my old GTX 970 and GTX 690. Your GTX 1060 is on par with those old cards, if not better.
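For anyone wondering why triple buffering might feel worse for mouse aim, the back-of-envelope arithmetic is simple: every completed frame waiting behind vsync is one more refresh interval between sampling your input and showing the result. A rough sketch (worst case, assuming a fully queued swap chain; real drivers and render-ahead settings vary):

```python
def queued_latency_ms(refresh_hz, queued_frames):
    """Worst-case extra latency from completed frames waiting on vsync."""
    return queued_frames * 1000.0 / refresh_hz

double = queued_latency_ms(60, 1)  # double buffering: one frame can wait
triple = queued_latency_ms(60, 2)  # triple buffering: two frames can wait
print(round(double, 1), round(triple, 1))  # 16.7 33.3
```

An extra ~17 ms of worst-case queueing at 60 Hz is plausibly why the lag shows up in mouse-aimed FPS games and not in ETS2 or wheel-style racing input.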
> When I read that, it makes me scratch my head. It must be your settings.

I have all Crysis 2 graphical settings on Ultra, and in Nvidia CP my settings are mostly focused on graphics quality instead of performance.
> I have all Crysis 2 graphical settings on Ultra.

IDK, maybe you have more eye candy turned on in the Nvidia control panel and/or the game's settings. The game looks beautiful even with lower demands. Bump it down just one notch and it will play better and, funnily enough, look just as good.
> My secondary screen is still 60 Hz, and even just moving the mouse in Windows is obviously different between the primary (144 Hz) and the secondary.

Higher frame rates/display Hz will be most noticeable with small, fast-moving objects, and a mouse cursor is a perfect example. If anything, you're more likely to notice a difference with mouse movement than with any game.
> I haven't found an option to limit fps in either MSI Afterburner or Nvidia CP. In Nvidia CP you can only turn VSync on or off. Windows 10 and 11's versions of Nvidia CP probably have that option, but I haven't found a way yet of making my games work properly on either.

NVIDIA's control panel looks like it hasn't been redesigned in 20 years, so I would be very surprised if the Windows 7 version did not have a frame rate limiter. Under "Manage 3D Settings" > "Global Settings", is there no "Max Frame Rate" option?
> I bought a gaming monitor when my 60 Hz one died. I can't say that I have noticed a difference; perhaps I am too old now.

I like the idea of having a high refresh rate monitor, but I'm worried that if I eventually got one I'd not even notice much of a difference in games, like in your experience.
> At 60 Hz your screen refresh time is 16.6 ms; congrats, you're already below the brain's ability to distinguish individual signals. At 84 Hz you're at 11.9 ms, at 100 Hz we're getting 10 ms, and at 144 Hz it starts getting really silly at 6.9 ms. The appearance of fluid motion happens at around 41–50 ms.
Errrrrrrrrrrrrrrrrrrrr... huh? Maybe everyone's brains are different, but I can very clearly see the mouse cursor skipping as it moves on my 75 Hz monitor. Meaning I'm perceiving individual signals above 60 Hz. And the fact that the skipping is very noticeable means I'd likely be able to perceive the skipping at vastly higher frame rates, too. The appearance of fluid motion is very relative to the speed of movement of any individual object when it comes to frame rate.
I guess by individual signals you're referring to the point where a sequence of images stops being perceived as such and starts being perceived as movement. However, as above, this would seem to depend on the speed of movement. The mouse cursor looks like it's moving on my screen if I move it slowly, but once I move it too fast, the illusion of movement is broken as my brain perceives the individual images of the cursor as it jumps from one location on the screen to the next.
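Incidentally, the millisecond figures being thrown around are just the refresh period, 1000/Hz, so they're easy to check:

```python
def frame_time_ms(hz):
    # Time each refresh stays on screen before the next one replaces it.
    return 1000.0 / hz

for hz in (60, 75, 84, 100, 144):
    print(hz, round(frame_time_ms(hz), 1))
# 60 16.7, 75 13.3, 84 11.9, 100 10.0, 144 6.9
```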
> Human brains do not process in frames but in motion, contrast and change; this processing is different based on which part of the eye is doing the receiving.

I never said they did. I was addressing your claim that 16.6 ms is below the brain's ability to distinguish individual signals.
> Moving a mouse across the screen is more about how the OS handles it and not the refresh rate.

How so? The mouse position is updated in sync with the refresh rate. When moved, it's in a different place with each display refresh.
> In actuality you're not even seeing the mouse; instead your brain is just filling in what it expects the mouse cursor to do.

I'm not sure what you're getting at here. I'm certainly seeing the mouse cursor for any definition of "seeing" that is useful.
> If you actually want to understand what is going on then reread the long post; otherwise I'll just lump you in with the FPS-Bro category.

That's fine, though I'm not trying to argue that you can't enjoy video games unless you're playing at 240+ FPS. I think 60 FPS is reasonable and it's all diminishing returns the higher you go.
> That's fine, though I'm not trying to argue that you can't enjoy video games unless you're playing at 240+ FPS. I think 60 FPS is reasonable and it's all diminishing returns the higher you go.

I think palladin wrote a very good explanation, and it directly answers your question. Even if you disagree, it's worth your time to read the post, especially the double buffering and triple buffering side effects.
Also, I apologize if I seemed too combative or argumentative.
In my example, the screen displays a new frame 75 times a second. When moving the mouse, the cursor is in a different place each frame: 75 distinct signals a second. Moving the mouse sufficiently fast in a circle, I can perceive each individual location distinctly.
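The circle test has simple arithmetic behind it: divide cursor speed by refresh rate and you get how far apart the successive cursor images land (3000 px/s here is just a rough guess at a fast flick):

```python
def cursor_gap_px(speed_px_per_s, refresh_hz):
    """Distance between consecutive cursor positions on screen."""
    return speed_px_per_s / refresh_hz

for hz in (60, 75, 144):
    print(hz, round(cursor_gap_px(3000, hz), 1))
# 60 50.0, 75 40.0, 144 20.8
```

At 75 Hz a fast flick leaves the cursor images ~40 px apart, far wider than the cursor itself, which is why the motion illusion breaks into visibly distinct images.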
Again your brain does not see frames. You are not seeing 75 mouse images every second.
That's not what he is saying. I respect your insights, but you're getting ahead of yourself here. He is saying he is seeing a heightened level of motion, not individual frames; myself as well. When I joked about CS guys, I wasn't talking about being able to distinguish individual frames either, but about the rest of what comes with a higher refresh rate: it allows for faster motion and it is smoother, in my experience. Do you play games, or have you experienced high refresh rates in games? I mean, if you haven't, then what's the point of debating this?
> I think palladin wrote a very good explanation and it directly answers your question. Even if you disagree, it's worth your time to read the post. Esp. the double buffering and triple buffering side effects.

I didn't read all of the initial post I responded to because there was no point: I reject its initial premise, which is that greater-than-60 Hz frame rates do not increase perceived visual motion smoothness and that it instead comes down to input lag. I do think the post is undoubtedly worthwhile for explaining aspects of input lag.
> You definitely notice the mouse cursor skipping, but it isn't solely to do with refresh rates. Also, moving the mouse across the static desktop is far from a gaming situation, but I digress.

I agree about the mouse movement not being comparable to gaming, and said as much in my initial post.
> Again, your brain does not see frames. You are not seeing 75 mouse images every second.

My brain does not process visual information in frames like a camera, but that does not mean it is not able to perceive discrete frames of movement.
This is the problem every FPS bro has: their understanding of images is digital, but humans are analogue. Every pixel on the screen is sending out photons; those photons strike the back of the retina and activate photoreceptors that emit a signal down the nerves into the brain, where those signals are put together. That signal is analogue, and each type of cell emits a different signal for color and intensity. During processing, much of this information is discarded as the brain stitches patterns together, and that is what you "see". This is how optical illusions and hidden images work: they take advantage of how the brain chooses what to process and what to discard. Furthermore, the brain will frequently fill in gaps by inserting past patterns into what you see as a shortcut.
To illustrate, take a white mouse cursor on a dark background. As you move it across the screen, the pixels go from emitting low-intensity photons to high-intensity photons, and the cells in the eye go from sending low-amplitude signals to high-amplitude signals. The brain recognizes the significant change in amplitude as signaling "something changed" and focuses on processing that information instead of reusing previously processed information. Now make it a dark blue mouse on an almost-dark-blue background: the change in photon intensity is far smaller, the retinal cells' change in amplitude is smaller, and the brain starts frequently discarding that information; you lose track of the cursor and find it "hard to see". This is something doctors are dealing with now, as they have successfully built a bionic eye and restored vision to blind people.
In regards to video gaming, since that was the whole point of this tangent, higher FPS will only help if the frame-to-frame difference is high contrast; otherwise it gets washed away. The brain doesn't receive frames from the eyeballs; a continuous stream of information is sent over the optic nerve to be processed. Plus, object recognition happens much more slowly than pattern-change recognition. It's fascinating to study, but it is very different from how we display information on a screen.
> That's not what he is saying. I respect your insights, but you're getting ahead of yourself here. He is saying he is seeing a heightened level of motion, not individual frames.

Well, I was actually saying that I did perceive individual frames once movement was fast enough, though not that my eyes have a frame rate like a camera. The point being that higher frame rates benefit faster-moving objects more.
> If your screen is displaying 144 frames, would you even know if the 138th one had a red apple in it for exactly one frame?

I'm almost certain that you would, if, as you said before, the contrast were high enough. If you were staring at an entirely green screen and the 138th frame had a red apple in it, there's no way it would be missed. You probably wouldn't be able to see it long enough to tell that it was an apple, but you'd perceive the change: a red flash. If I had a 144 Hz screen, I could code a little game that would let me test this, but sadly I don't. Mine only goes up to 75 Hz.
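Absent a 144 Hz screen, the test can at least be simulated to see what's being asked of the eye: one red frame in a second of green, on screen for about 7 ms (the actual drawing is omitted here, and `FLASH_FRAME` is an arbitrary choice matching the example):

```python
REFRESH_HZ = 144
FLASH_FRAME = 138  # the one frame that shows the red apple

def frame_color(i):
    # Every frame is green except the single flash frame.
    return "red" if i == FLASH_FRAME else "green"

second = [frame_color(i) for i in range(REFRESH_HZ)]  # one second of frames
flash_ms = 1000.0 / REFRESH_HZ                        # on-screen time of the flash
print(second.count("red"), round(flash_ms, 1))  # 1 6.9
```

Whether a ~7 ms high-contrast flash is reliably seen is exactly the empirical question; the claim above is that for red-on-green it would be.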
> Boiling it all down to answer the question of "Is 120 FPS much better than 60 FPS in games?", the answer is generally no.

I agree. But I don't agree that the only difference with higher frame rates is input lag.