Question FreeSync enabled, yet screen tearing?

nbartolo7

Commendable
Sep 4, 2017
Hi everyone,

I have a FreeSync display and an AMD Radeon card. I activated FreeSync on the display, and the Radeon Adrenalin software recognized that I did so; it's checked/activated there as well. However, my frame rate is not capped. Isn't that what FreeSync is supposed to do? Also, I could swear I got some tearing in one of my last games.

Any ideas? Thanks
 
No. Capping frames is what V-Sync does; FreeSync minimizes tearing while leaving the frame rate uncapped. Also, most monitors have a FreeSync range in which it works; dip below that range and you can get some bad tearing. The lower bound is usually somewhere around 50-70 FPS on 144 Hz monitors.
 

nbartolo7

Commendable
Sep 4, 2017
Alright, thanks for the info.
 

QwerkyPengwen

Dignified
Ambassador
To clarify further and help you understand the difference:
V-Sync isn't just for frame capping. It removes tearing by forcing the GPU to slow down and output only complete frames in sync with the monitor's refresh rate. This inherently caps the frame rate to the monitor's refresh rate, or to an even division of it (e.g., dropping from 60 to 30 frames per second).
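That "even division" behavior can be sketched with a simplified model. This is purely illustrative (it assumes classic double-buffered V-Sync; the function name and the model are mine, not anything a real driver exposes):

```python
import math

def vsync_effective_fps(render_fps: float, refresh_hz: int) -> float:
    """Simplified model of classic double-buffered V-Sync:
    the display only shows complete frames, so the effective rate
    snaps down to the nearest even division of the refresh rate."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)  # capped at the refresh rate
    # each frame occupies at least ceil(refresh/fps) refresh intervals
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals
```

Under this model, a game rendering 50 FPS on a 60 Hz panel ends up at an effective 30 FPS, which is exactly the harsh step-down described here.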

FreeSync is a form of "adaptive sync": instead, extra hardware inside the monitor makes the monitor "adapt" its refresh rate to whatever the GPU is outputting. However, as stated, it only works within a range that has a minimum and a maximum.

Going below the minimum that the monitor is rated for will result in screen tearing, but going over the maximum can do the same, just less noticeably.

FreeSync and G-Sync monitors usually have minimums of about 50 FPS, though some go lower. The minimum is rarely above 60, since 60 Hz is the default refresh rate of most monitors and TVs; it's usually set a bit below that so adaptive sync still works when you're averaging around 60 FPS.
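The range behavior above boils down to a simple window check. A minimal sketch (the 50-144 window here is a placeholder; the real values depend on your specific panel's rated VRR range):

```python
def in_vrr_range(fps: float, vrr_min: int = 50, vrr_max: int = 144) -> bool:
    """Adaptive sync only engages while the frame rate stays inside
    the monitor's rated range; outside it, tearing can return.
    The default bounds are illustrative placeholders."""
    return vrr_min <= fps <= vrr_max
```

So at 30 FPS on a hypothetical 50-144 panel, FreeSync would drop out and tearing could reappear, which matches the symptom in the original question.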

The reason adaptive sync exists, and why it's superior to older software sync methods like V-Sync, is this:
With software sync like V-Sync, the GPU is forced to hold up and wait.
That waiting introduces delays, as well as the discarding of some frames, which adds input latency.
Discarded frames caused problems of their own, which is why double and triple buffering came into existence: the GPU holds onto one or two extra completed frames while waiting to send one to the monitor, so it can discard a frame that has waited too long and still have another one ready. The end result is that everything is synchronized, so you get no screen tearing, because only full, complete frames are ever displayed.

(Keep in mind, this assumes you're running a frame rate higher than the monitor's refresh rate. On a normal 60 Hz display, for example, you have to be running more than 60 FPS for V-Sync to remove tearing. If you drop below 60 FPS, V-Sync drops you to the next sync level, which can be 45 or 30 FPS in that moment, until the frame rate climbs back up. This is how you get those harsh frame-rate drops of old that caused lag and stuttering.)

With adaptive sync, because the display adapts itself to the GPU's frame rate, no frames are discarded. Frames go out as soon as they're ready, just as without V-Sync on a normal monitor. But unlike a normal monitor, because the display is adapting, it stays synced with those frames, so tearing is reduced or eliminated entirely. And because the GPU isn't forced to wait before sending frames, you get far less input latency. There is still a tiny amount, since the monitor and the GPU have to communicate briefly to stay in sync.

And with adaptive sync doing its thing, you won't get the frame-rate step-down I just mentioned in the parentheses above.

If you're running frame rates higher than your display's refresh rate and want to cap them so FreeSync can do its job, there are a few methods.

Method 1: use the in-game frame rate limiter.
If the game has no built-in limiter, move on to method 2.

Method 2: play the game in borderless window mode.
Borderless window mode will, 99% of the time, limit the FPS to the refresh rate of the desktop, which is the refresh rate of the monitor.

Method 3: use third-party software to limit the game's frame rate.
RivaTuner Statistics Server, which you can get standalone or bundled with MSI Afterburner, has a built-in frame rate limiter you can apply per game.
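At its core, method 3 is just frame pacing: insert a wait after each frame so frames never come faster than the target. A minimal sketch of the idea (real limiters like RTSS hook the game's frame-presentation call rather than sleeping in a loop; `render_frame` here is a stand-in for the game's work):

```python
import time

def run_frame_limited(render_frame, target_fps: int, num_frames: int) -> float:
    """Pace calls to render_frame so they run no faster than target_fps.
    Returns the measured elapsed time in seconds. Illustrative only."""
    frame_time = 1.0 / target_fps
    start = time.perf_counter()
    next_deadline = start
    for _ in range(num_frames):
        render_frame()  # placeholder for the game's real rendering work
        next_deadline += frame_time
        delay = next_deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)  # wait out the rest of this frame's slot
    return time.perf_counter() - start
```

Even with an instant `render_frame`, 20 frames at a 100 FPS cap take at least ~0.2 seconds, which is the whole point of the limiter.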
 

poorbugger

Reputable
Nov 28, 2015
Well, you have to:
  1. Go into the Radeon Adrenalin software and enable Enhanced Sync.
  2. Then disable V-Sync in-game.
  3. Use RivaTuner, which comes with MSI Afterburner, to set a frame rate limit. This step depends on your monitor's refresh rate: if your monitor is 60 Hz, set the limit to 58 FPS; if it's 75 Hz, set it to 73 FPS.
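The rule of thumb in step 3 is just "refresh rate minus a couple of FPS", to keep the GPU from ever outrunning the FreeSync window. As a trivial helper (hypothetical, purely for illustration):

```python
def suggested_cap(refresh_hz: int) -> int:
    """Rule of thumb from the steps above: cap 2 FPS below the
    refresh rate so the frame rate stays inside the sync range."""
    return refresh_hz - 2
```

This reproduces the 60 Hz -> 58 FPS and 75 Hz -> 73 FPS examples given in step 3.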
 
