Question: Does G-Sync/FreeSync matter as long as you don't go over the refresh rate?

Mar 30, 2019
Hey guys,

I recently learned the causes of screen tearing and stutter in games. As I understand it, tearing occurs when the frame rate goes over the refresh rate, and stutter happens when it drops under the refresh rate with V-Sync on. G-Sync and FreeSync are supposed to eliminate both.

BUT what if you have a GPU with exactly the right amount of horsepower, so that it doesn't feed the display more frames than the refresh rate but still stays at 60 fps or above? Then technically wouldn't you not need adaptive sync technology like FreeSync/G-Sync? Your frame rate shouldn't fluctuate that much between 60 and 120, assuming all your components are balanced and you're getting the minimal fluctuation possible.

For example, I have an RTX 2080 Ti and I easily get 140 fps with V-Sync off in a game like Far Cry 5 on Ultra at 1080p. So if I have a 120Hz monitor, the fps exceeds 120 and I'll have tearing. And if I enable V-Sync I'll get stutter.

So what if, say, I get a GPU that's a little less powerful, one that can still do Ultra but with slightly fewer frames, say 115 max instead of 140? Since it doesn't go over 120, I shouldn't have tearing and thus don't need V-Sync, so I won't have stutter either. Or do I still need adaptive sync technology regardless?

I suppose to sum up my question: is G-Sync/FreeSync still needed even if you don't go over your refresh rate? It was designed to prevent tearing and stutter, but even if you don't go over your refresh rate, does it just make overall gameplay smoother?

I ordered the Samsung RU8000 TV, one of the first TVs with FreeSync (currently all FreeSync TVs are Samsung), and it's 120Hz. Currently no TV goes over 120Hz, excluding BFGDs.

So I'm nervous about how my RTX card will perform on it, and I even ordered a Radeon VII to compare.

Only 12 FreeSync displays have been certified by Nvidia as G-Sync Compatible, and I doubt my new TV is one of them.

Also, playing in 4K is out of the question, since HDMI 2.0 (and obviously the TV only has HDMI) only supports 4K @ 60Hz, and most games running at 4K on an RTX 2080 Ti will easily exceed 60 fps. Everyone tells me to just get a computer monitor, but I just love the immersion of a TV. Also, I love using my gamepad. If FreeSync and G-Sync are not necessary as long as I stay under the refresh rate, I may even consider a projector.

I saw reviews of the Radeon VII and it seems to play most games at 1080p with a 60 fps minimum. So if it can max out all my games at 1080p and FreeSync really works, I may end up replacing my RTX 2080 Ti. I suppose the major lesson here is that more power isn't always better. It's all about stability, depending on what display you're using.
 
First, G-Sync/FreeSync only operate when your framerate is less than your monitor's maximum refresh rate. If your framerate goes higher, G-Sync/FreeSync will turn off.

Secondly, tearing occurs when your framerate is not synchronized with your monitor. It can happen when your framerate is lower than your monitor's refresh rate, when it is higher, or, in theory, even when it is exactly the same. I don't know why people say "tearing only happens when you go above the refresh rate"; that is a myth.
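To illustrate that point, here is a minimal sketch (illustrative numbers only, assuming V-Sync off, an idealized 60 Hz scanout, and a perfectly constant 50 fps GPU): buffer swaps still land in the middle of scanouts, so tear lines still appear even though the frame rate never exceeds the refresh rate.

REFRESH_HZ = 60                    # monitor refresh rate
FPS = 50                           # GPU frame rate, deliberately *below* the refresh rate
scanout = 1.0 / REFRESH_HZ
frame_time = 1.0 / FPS

# times at which the GPU swaps buffers during one second, with no sync at all
swap_times = [n * frame_time for n in range(1, FPS)]

for n in range(REFRESH_HZ):        # one second of refreshes
    start, end = n * scanout, (n + 1) * scanout
    swaps = [t for t in swap_times if start < t < end]
    if swaps:
        # a swap landed mid-scanout -> a visible tear line on this refresh
        pos = (swaps[0] - start) / scanout
        print(f"refresh {n:2d}: tear at roughly {pos:.0%} of the screen height")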
 
Mar 30, 2019
Interesting, I thought it was the opposite. I thought G-Sync and FreeSync were like V-Sync but better. And V-Sync is supposed to turn on when the max refresh is exceeded, and turn off when below it.

I saw a video Linus did about a FreeSync TV, and how even though it's enabled with an AMD card, you still have to stay within a fairly small range of frame rates for it to work. I think he said for 4K the range is less than 20 frames wide, while for 1080p it's wider. Anything outside of the range will cause stutter and tearing.

Since I have an RTX 2080 Ti, I'm pretty sure I'm going to get really bad tearing, since my frames will definitely exceed 120 fps. I'll let you know what the test results are.
 

TJ Hooker

Titan
Ambassador
In theory you could do 4K/120Hz with HDMI 2.0 with 4:2:0 chroma subsampling.

Otherwise, if you're gaming at 1080p you could try using fast sync. It should have less input lag and stuttering than regular vsync while still eliminating tearing. Or game at 1440p, which makes it more likely that you'd stay in the variable refresh rate range.
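For rough context, the bandwidth arithmetic behind that looks something like the sketch below. It ignores blanking intervals and treats HDMI 2.0's usable rate as roughly 14.4 Gbit/s after encoding overhead; the figures are approximations, not spec quotes.

HDMI20_DATA_RATE = 14.4e9          # approx. usable bits/s (18 Gbit/s minus 8b/10b overhead)
width, height, hz = 3840, 2160, 120

formats = {"4:4:4 (8-bit)": 24,    # bits per pixel, full chroma
           "4:2:0 (8-bit)": 12}    # chroma shared across 4 pixels

for name, bpp in formats.items():
    rate = width * height * hz * bpp
    verdict = "fits" if rate <= HDMI20_DATA_RATE else "does not fit"
    print(f"{name}: {rate / 1e9:.1f} Gbit/s -> {verdict}")

That is why 4K/120 over HDMI 2.0 is only "in theory": even 4:2:0 leaves little margin once real signal timings (blanking) are added back in.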
 
Mar 30, 2019
With my current Samsung TV, 1440p only allows 30Hz, and I don't know how to change that. Also, I thought 4K @ 120 wasn't possible at all with HDMI 2.0? I've no idea about color settings. 4:2:0 refers to colors, correct? I don't see anything on my TV that says chroma. My current model is the MU6490, but the one I'm ordering is the RU8000.

Another interesting point: Samsung says on their website that the RU8000 is the first FreeSync TV, yet their entire QLED lineup, along with some premium UHDs from 2017 and 2018, has FreeSync.

Yeah, I tried Fast Sync with my TV via the Nvidia Control Panel; it doesn't work well. I ordered the FreeSync TV a while back but kept pushing the delivery date until I was able to get a Radeon VII, which I picked up last night. So I should be able to report test results on Tuesday the 9th.
 
FreeSync/GSync are variable refresh rate technologies. The monitor refreshes itself only when a new frame is delivered from the GPU. Frame rate consistency is MUCH better now than it was 5+ years ago, but there's still variance. That probably will never completely go away. Even if you got that perfect combination of resolution/ GPU power/ in-game settings to hit 60fps average, the reality is that the frame times are bouncing around above and below that point all the time.

Different panels have different ranges of refresh rates. 40-75Hz, 48-144Hz, 30-144Hz, etc etc etc. So that needs to be taken into consideration.

Different resolutions may incur different refresh rate ranges due to bandwidth restrictions.

AMD's driver software has a built-in ability to cap frame rates by lowering the frequency of the GPU to save power ("Frame Rate Target Control" and "Chill"). I think there are third-party apps for Nvidia GPUs that do the same. There's no point in generating frames faster than your monitor can display them. Traditional VSync is an abstract software implementation of this: it lets the GPU crank out as many frames as it wants to, but throws all the excess ones away.
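Conceptually, a frame-rate cap is just the render loop refusing to start a new frame before a fixed time budget has elapsed. Here is a rough sketch of the idea only, not how FRTC or Chill are actually implemented; render() is a placeholder, not a real API.

import time

TARGET_FPS = 117                   # e.g. a few fps under a 120 Hz panel
FRAME_BUDGET = 1.0 / TARGET_FPS

def render():
    pass                           # placeholder for the game's actual rendering work

while True:
    start = time.perf_counter()
    render()
    # sleep off whatever is left of the budget, so frames are never produced
    # faster than the cap, instead of rendering flat out and discarding extras
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)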
 

TJ Hooker

Titan
Ambassador
Traditional VSync is an abstract software implementation of this: it lets the GPU crank out as many frames as it wants to, but throws all the excess ones away.
This is only the case if triple buffering is enabled, which I believe is more or less what Fast Sync does. Otherwise the GPU has to sit and wait for the refresh, which is one of the reasons why normal Vsync can result in noticeable input lag and/or stuttering.
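A toy timing comparison of the two schemes described above (illustrative numbers, assuming a 60 Hz display and a GPU that could otherwise render a frame every 10 ms):

REFRESH = 1 / 60                   # ~16.7 ms between refreshes
RENDER = 1 / 100                   # 10 ms to render one frame

# Double buffering + V-Sync: after finishing a frame the GPU waits for the next
# refresh before starting another, so at most one new frame is shown per refresh.
# (If RENDER were larger than REFRESH, refreshes would be skipped and the rate
# would drop to 30, 20, ... -- the classic V-Sync stutter.)
frames_shown = 0
gpu_ready_at = 0.0
for n in range(60):                     # one second of refreshes
    refresh_time = n * REFRESH
    if gpu_ready_at <= refresh_time:    # a new frame was finished in time
        frames_shown += 1
        gpu_ready_at = refresh_time + RENDER   # GPU only now starts the next frame
print("double buffered + V-Sync:", frames_shown, "new frames shown in one second")

# Triple buffering / Fast Sync: the GPU never blocks. It keeps rendering into a
# spare buffer and each refresh scans out the newest completed frame; the rest
# are simply dropped.
rendered = round(1.0 / RENDER)          # ~100 frames produced
displayed = round(1.0 / REFRESH)        # only ~60 can ever be shown
print("triple buffered / Fast Sync:", rendered, "rendered,", displayed, "shown")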
 

Wendigo

Distinguished
Nov 26, 2002
When your FPS is below the refresh rate, with V-Sync disabled, you still get microstuttering. Some frames have to be displayed twice to make up for the missing ones. It's not as obvious as the stuttering from V-Sync, but it's quite noticeable when you're used to playing on a variable sync monitor and then go back to a fixed sync one.

FreeSync/G-Sync work best when the FPS stays inside the variable sync range. Inside this range, the FPS and refresh will be perfectly synchronized, without any stuttering or tearing.
 

TJ Hooker

Titan
Ambassador
When your FPS is below the refresh rate, with V-Sync disabled, you still get microstuttering. Some frames have to be displayed twice to make up for the missing ones. It's not as obvious as the stuttering from V-Sync, but it's quite noticeable when you're used to playing on a variable sync monitor and then go back to a fixed sync one.
Huh? If you have V-Sync disabled and you go below your monitor's refresh rate (or above it), you get tearing, not stuttering. The only reason it would repeat an entire frame is if your FPS were half the refresh rate or less. It might seem stuttery if you're getting super low fps, but it's still displaying frames as fast as the GPU can render them, so not even variable refresh rate would really help in that case.
 

Wendigo

Distinguished
Nov 26, 2002
Huh? If you have V-Sync disabled and you go below your monitor's refresh rate (or above it), you get tearing, not stuttering. The only reason it would repeat an entire frame is if your FPS were half the refresh rate or less. It might seem stuttery if you're getting super low fps, but it's still displaying frames as fast as the GPU can render them, so not even variable refresh rate would really help in that case.
When below the refresh rate, it will still display some frames twice (though not all or most of them, as with V-Sync on). It's the only way the monitor can fill in the missing frames if the FPS is lower than the refresh rate. Sure, it's not the same stuttering, nor as obvious as with V-Sync on, where the monitor can jump between two widely different values. But it can still be obvious when you look for it, or when you've become used to not seeing it by gaming on a VRR monitor. It definitely does not feel as smooth as when the FPS and refresh rate are synchronized. It's also not as much of an issue on fast refresh rate monitors.

For example, let's say we have a 60Hz monitor. It has to display a frame every 1/60th of a second, no way around it. Now, as suggested in a post above, what would happen if the FPS were slightly below this value, say a constant 50 FPS to keep things simple? That means a new frame is available every 1/50th of a second. The card makes frame #1 and the monitor displays it; after 1/60th of a second the monitor has to refresh, but frame #2 isn't ready yet since it needs 1/50th, so the monitor has no choice but to display frame #1 a second time. After that, things will be fine for the next 4-5 frames, but fall out of pace again around frame #6, and the cycle begins again. In the end, what you get is one frame displayed for 1/30th of a second, then the next few for 1/60th each, then another one for 1/30th, and so on, for an average of 50 FPS. That is different from the stuttering of V-Sync on, where every frame would have been displayed for 1/30th of a second (two refreshes of 1/60th each), making the game feel smooth but also very sluggish. Or very stuttery if the FPS floats around the sync speed, with a bunch of frames displayed for 1/30th for maybe a few seconds, then jumping to 1/60th for another few seconds, then back again to 1/30th... thus making life miserable.
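A minimal sketch of that timing (a perfectly constant 50 fps source on a fixed 60 Hz display with no sync, using integer math so the counts are exact):

REFRESH_HZ, FPS = 60, 50

shown_for = {}                          # frame number -> refreshes it stayed on screen
for n in range(REFRESH_HZ):             # one second of refreshes
    # newest frame completed by the time refresh n starts (refresh n is at n/60 s,
    # frame k is finished at k/50 s) -> floor(n * 50 / 60)
    newest = (n * FPS) // REFRESH_HZ
    shown_for[newest] = shown_for.get(newest, 0) + 1

repeated = [f for f, count in shown_for.items() if count > 1]
print(f"{len(shown_for)} distinct frames over {REFRESH_HZ} refreshes")
print(f"frames held for two refreshes (1/30 s): {repeated}")

Roughly every fifth frame ends up held for two refreshes, which is exactly the intermittent 1/30th-of-a-second hitch described above.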

On the other hand, a VRR monitor will simply display each frame for 1/50th of a second. It doesn't have to make 50 frames fit into 60 time slots... It's also less of an issue the higher the refresh rate is.

Tearing happens when the monitor displays a new frame at the same time the card is switching frame buffers. It's a random event that can happen any time the card and monitor aren't synchronized. But it's more of a problem above the refresh rate, because the probability of this occurrence grows the further the FPS goes above it. VRR solves this when inside its sync range.
 
Mar 30, 2019
So guys, I just received my Samsung RU8000 two days ago. I rigorously tested my EVGA RTX 2080 Ti and ASUS Radeon VII using a Belkin HDMI 2.1 cable. This TV has FreeSync over HDMI and is 120Hz capable.


Here are the results with Game Mode enabled and Auto Motion Plus disabled. I tested with FreeSync on and off, in both Basic and Ultimate FreeSync modes. All tests were at 1080p, at both 60 and 120Hz.

Nvidia:
Performs BETTER than the AMD Radeon VII. Samsung advertises this TV on their website as "the world's first FreeSync TV", yet an Nvidia card stutters and tears less on it.
Games still stutter with the RTX card but are mostly playable, so no difference from my previous TV.
The windmill test performs well, with no tearing, when V-Sync is enabled.

AMD:
Severe tearing and stuttering in nearly every game, no matter what settings are used in the TV, the game, or the drivers. The stuttering is, to say the least, ten times worse than with my previous GTX 1080 Ti on my old TV and my current 2080 Ti.
The windmill test performs well, with no tearing, when V-Sync is enabled.


My frames in Far Cry 5 were 88-130 uncapped with the AMD card at 1080p max settings, 120Hz. With the RTX they're about 112-160. So the AMD card is definitely capable of maxing out almost all games, but the FreeSync-over-HDMI implementation is horrific. I thought I might've made an error of judgment, thinking maybe it was only meant for consoles, since the Xbox has FreeSync. But on AMD's website, when they talk about FreeSync over HDMI, they even say notebooks with Radeon GPUs support it, which suggests PCs can do it too.

Geez, I hate it when you do your homework and make sure you have all the right stuff and it just doesn't work. Glad I can still return these. I think I'm done trying to game on the big screen; I'm just settling for an ultrawide G-Sync monitor.
 

TJ Hooker

Titan
Ambassador
For example, let's say we have a 60Hz monitor. It has to display a frame every 1/60th of a second, no way around it. Now, as suggested in a post above, what would happen if the FPS were slightly below this value, say a constant 50 FPS to keep things simple? That means a new frame is available every 1/50th of a second. The card makes frame #1 and the monitor displays it; after 1/60th of a second the monitor has to refresh, but frame #2 isn't ready yet since it needs 1/50th, so the monitor has no choice but to display frame #1 a second time.
Not quite how it works. With Vsync off, the monitor would start refreshing the display with frame 1 repeated, but part way through the graphics card would finish frame 2 and swap buffers, such that the rest of the image displayed on the monitor comes from frame 2. The line where frame 1 and 2 meet is where screen tearing happens. So frame 1 would be displayed once in its entirety, then the top ~20% (in theory) of frame 1 would be repeated with the bottom 80% of the display coming from frame 2.
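The ~20% figure falls out of the same idealized numbers (constant 50 fps source, 60 Hz scanout sweeping top to bottom at a constant rate, no blanking interval):

refresh = 1 / 60        # the second scanout starts 1/60 s after the first one
frame2_done = 1 / 50    # frame 2 is finished 1/50 s after frame 1 started displaying

# fraction of the second scanout already drawn (still from frame 1) when frame 2 lands
tear_position = (frame2_done - refresh) / refresh
print(f"tear line roughly {tear_position:.0%} of the way down the screen")
# -> ~20%: the top fifth repeats frame 1, the rest of that refresh comes from frame 2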