What is happening when my HDTV receives more than 60fps?

RipGroove

Honorable
Jan 12, 2013
I use a Sony W705B as my PC monitor, which I believe runs at 60Hz in 'Game' mode, so I assume all it can really display is 60fps. I usually use vsync in most games for a steady 60fps, but say for example I play Battlefield 3 with vsync off and my PC outputs 170-200fps, what exactly is happening? By pumping too many frames into my TV is it having a negative effect, like are frames happening that I'm not actually seeing?
 
This is what is meant by screen tearing. With V-sync off, the monitor starts drawing whatever is in the frame buffer at the moment of scan-out. When that frame is replaced in the buffer mid-refresh, the monitor starts drawing the new one from that point down, leaving part of the previous frame on screen until the next complete refresh cycle.

V-sync forces the card to hold a completed frame all the way through the monitor's refresh cycle. G-Sync and FreeSync eliminate tearing from the opposite direction: instead of the display dictating the timing, the GPU controls when each refresh cycle starts, and at what rate.

If you are feeding the monitor three times its refresh rate, you should see roughly three distinct frames per refresh, with splits (tear lines) somewhere on the screen. Since it is all happening within a 60Hz cycle, many people will not notice these irregularities.
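That ratio can be put into numbers with a quick Python sketch (a toy model, assuming a perfectly steady render rate; the function name is mine, purely illustrative):

```python
# Toy model: with vsync off, every buffer swap that lands mid-scanout
# leaves one tear line. At a steady render rate, the number of swaps
# per 60 Hz refresh is simply fps / 60.

REFRESH_HZ = 60

def swaps_per_refresh(render_fps, refresh_hz=REFRESH_HZ):
    """Average buffer swaps (potential tear lines) per refresh cycle."""
    return render_fps / refresh_hz

for fps in (60, 120, 180):
    n = swaps_per_refresh(fps)
    print(f"{fps} fps into {REFRESH_HZ} Hz: ~{n:.0f} new frames per refresh")
```

At 180fps roughly three new frames land inside every refresh, so the screen can show pieces of three different frames with a tear line between each.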

Since you are playing an FPS, running uncapped can actually slightly reduce your response times, and in particular your input lag, so it is not a bad idea. However, you are putting a lot of extra stress on your GPU for a minimal difference.
 
That's odd, because in some games (BF3 or BF4 for example) I actually get screen tearing with vsync ON, which is why I don't have it enabled for those games. In other games, like SQUAD, I definitely need vsync ON, as without it I get tearing.
 
If you have v-sync on and your GPU can't output at least 60FPS at that exact moment, you get something called runt frames. Basically the display takes what is currently in the frame buffer and shows it; that may not be a completely rendered frame, in which case the previous frame gets repeated instead.

Every game engine interacts a little differently with GPUs and drivers.
 
I make sure I always get 60fps at absolutely all times in all games I play, so not having enough frames for vsync is never a problem.

Even if a game dips to 59fps for just one second I can tell right away. I guess that's a trait of using a TV, maybe? There's a game I play that's still in alpha, and luckily I can almost always maintain 60fps, but other players are happily playing it at 25-40fps and apparently it's still smooth for them. Probably the only difference between me and them (other than PC spec) is that I'm using a TV and they are more than likely using monitors.

 
TVs do a lot more image processing than monitors in general, and their response times are usually much higher than monitors'. If your TV offers a game mode you should have that on, as it strips out a lot of the extra work a TV does to make broadcasts look smooth.

 
Yes I can; I blind test it and get it every time. If I turn the FPS counter off on my main monitor and keep the Afterburner graphs on a second monitor, every time I think the game on my main monitor has dipped to 59, I then look over at my second monitor, check the Afterburner FPS graph, and see that it did dip to 59 for a second. I get it every single time. As I said though, this may be a lot easier to detect on a TV than on a monitor, which is why I see it so easily. Anything below 60fps is actually unplayable for me.
 
reinstall your graphics drivers. make sure you're not using a beta driver. connect using a different HDMI port on the TV if possible. reset the Nvidia control panel settings if reinstalling isn't an option.

to clarify, realise that 59 fps is a frame time of 0.01695 seconds and 60 fps is 0.01667 seconds. no human can tell the difference between the two. something else is wrong here, bud.
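The frame times in question work out like this (a quick sanity check, numbers only):

```python
# Frame times for 59 fps vs 60 fps, in milliseconds.
ft_59 = 1000 / 59   # ~16.95 ms per frame
ft_60 = 1000 / 60   # ~16.67 ms per frame

print(f"59 fps: {ft_59:.2f} ms per frame")
print(f"60 fps: {ft_60:.2f} ms per frame")
print(f"difference: {ft_59 - ft_60:.3f} ms")   # under a third of a millisecond
```

Worth noting: with vsync on, a dip below 60fps doesn't arrive as a smooth 59fps cadence but as a repeated frame, i.e. one ~33ms frame instead of two ~16.7ms ones, a far bigger hiccup than 0.28ms, and that may be what's actually being noticed.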
 
OK, well it must be extremely coincidental that I can see it every single time and then look at the graph to confirm it. My GPU drivers are just fine, as is my whole PC. I've been able to tell when a game drops below the refresh rate for as long as I've been PC gaming (years and years), on all of my previous PCs hooked up to all of my previous TVs. Why would I make it up? It frustrates the hell out of me, because it means I absolutely have to make sure every game I play NEVER drops below the refresh rate. Just to add, I can actually see this while watching movies on any TV too: I see a lot of jerky movement that is supposed to be smooth. Whenever a camera slowly pans across the screen, I see it as a jerky movement due to it only being 24fps (or whatever it is they film movies in). It's not a gift I'd lie about, it's a curse that drives me up the wall.
 
playing back a movie is a completely different issue. there are no variable rates there. the word you're looking for is telecine, which is the 3:2 pulldown, and yes, it does cause pans to look bad. i can personally never go back to watching movies on 60 hz displays, but that's a different matter.

no, you can't tell the difference between 59 and 60 fps, that's not possible. the game you play doesn't use vsync properly; contact the game devs for more information.
 
what do you mean? please back up claims, preferably with science, rather than trying to prove me wrong, which you can't do. if you're having issues with vsync, contact the game devs; that's the best way to solve this. vsync syncs the monitor to the gpu, meaning the monitor won't request a frame at the same time the gpu is copying its frame from the back buffer to the frame buffer. why would you ask for help/advice if you don't want to believe what's being said?

no worries, bud. i understand that it's frustrating, but if a game is not using vsync properly, then that's on the devs. the refresh rate is static, the response time is static. settings could also be wrong in the control panel, but there's not much to go on.

i may sound negative, and i apologize; it's not my intention to be rude or anything like that. however, you can test this by enabling adaptive vsync instead. if it doesn't stutter, that eliminates your original concern of the tv being the problem.
 
I don't have an issue with vsync. You seem to have the wrong end of the stick here. All I asked was "does piping more frames than a display can handle have a negative effect?" Then I mentioned that in all games, on all the PCs I've ever had, I can 100% tell when a game dips below the display's refresh rate. I also said you may be somewhat correct: if I were put in front of two identical TVs playing the same thing, one at a constant 59fps and one at a constant 60fps, then no, I may not be able to see the difference. BUT I can 100% tell when a game goes from a constant 60fps down to 59fps for a second or two. That is 100% fact, and I can do it with no aid from an FPS counter. If I think the game on my main monitor has dipped to 59fps and THEN look at the graph on my second monitor, I'll always see that I was right. I see it every time, fact.

Please excuse me while I go set up some 'science' experiments in order to prove something to you.

'Unsubscribed' as this is pointless.
 
your question was answered by eximo. i followed up with "you increase the frequency of screen tearing." the higher you go above the refresh rate, the more often you'll see tearing, because you're increasing the chance that the monitor wants a frame at the same time the gpu is copying its frame from the back buffer to the frame buffer. that's about it.
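That chance can be sketched with a small Monte Carlo simulation (a toy model: steady render interval, random starting phase per trial, and an assumed ~95% of each refresh spent actively scanning out, the rest being vblank; the function name and the 95% figure are illustrative assumptions, not measurements):

```python
import random

# Count buffer swaps that land during active scanout, where a swap shows
# up as a tear line. The expected tears per second grows with render fps.

def mean_tears_per_second(render_fps, refresh_hz=60, active=0.95,
                          seconds=5.0, trials=200, seed=42):
    rng = random.Random(seed)
    refresh = 1.0 / refresh_hz
    interval = 1.0 / render_fps
    total = 0
    for _ in range(trials):
        t = rng.random() * interval        # random phase for this trial
        tears = 0
        while t < seconds:
            phase = (t % refresh) / refresh
            if phase < active:             # swap landed mid-scanout
                tears += 1
            t += interval
        total += tears
    return total / trials / seconds

for fps in (60, 120, 200):
    print(f"{fps} fps -> ~{mean_tears_per_second(fps):.0f} tears/sec")
```

The simulated tear rate scales with the render rate, which is the point above: the further past the refresh rate you push, the more often a swap lands mid-scanout. (It also shows that even at exactly 60fps, vsync off can still tear, just once per refresh.)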