Why can't Adaptive Sync (FreeSync) monitors display 29 FPS and lower?

bloc97

Distinguished
Sep 12, 2010
As the title asks, I simply can't understand why all FreeSync monitors are capped at a minimum of 30 Hz. Anything below that causes tearing unless VSync is enabled.

I know that the controllers can't refresh below 30 Hz due to the nature of LCDs, but older display technologies also worked around this kind of problem by displaying a frame multiple times: CRTs and plasma screens would refresh at high rates (600+ Hz in the case of plasma).

I think you know what I'm getting at now.
A really (I mean really) simple solution to work around the minimum refresh rate of most controllers is to display each frame two times, no? (I give an example in the following paragraph.)

Say the FPS dips below 30 and is now 18: couldn't they make the controller refresh the screen at 18 x 2 = 36 Hz and display each frame 2 times?
Below 15 FPS, where doubling is no longer enough, they can multiply by 4 instead: 8 x 4 = 32 Hz, displaying each frame 4 times. Etc.
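For what it's worth, the repeat-count logic described above is trivial to write down. Here's a rough Python sketch; the 30-144 Hz window and the function name are my own assumptions for illustration, not anything a real driver exposes:

```python
# Assumed panel variable-refresh window (typical FreeSync values)
PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 144

def pick_multiplier(fps: float) -> int:
    """Smallest repeat count n such that fps * n lands inside the
    panel's variable-refresh window."""
    n = 1
    while fps * n < PANEL_MIN_HZ:
        n += 1
    if fps * n > PANEL_MAX_HZ:
        raise ValueError("fps too low to fit in the refresh window")
    return n

print(pick_multiplier(18))  # 2  -> panel runs at 36 Hz, frame shown twice
print(pick_multiplier(8))   # 4  -> panel runs at 32 Hz, frame shown 4 times
print(pick_multiplier(60))  # 1  -> no repetition needed
```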


This doesn't even need to be done in hardware; I bet a small driver patch could add this feature.

I know that most will argue that anything below 30 FPS isn't playable, but that's not the point here. Framerates above 15 are definitely playable without stutter. What most people consider unplayable is the input lag caused by low framerates or VSync.


Can someone add to this or, if that's the case, clarify what's wrong in my reasoning?

Note: I really want this thread to be on point, and not be a discussion about Gsync vs Freesync, or some other **** about fanboys.
 
The problem with this solution is that it creates a sync problem where you are never sure the next frame will be available. Take a single frame that gets created in 35.71 ms (i.e. 28 FPS): your solution would have it displayed twice, but what happens when the next frame is back up to your normal 60 FPS rate? Since the repeated frame is still being scanned out, the next frame either has to start scanning mid-draw, which causes a tear, or it has to wait to display, which causes lag, thus defeating the whole purpose of variable refresh rate tech.

It might be better if, when a frame is drawn under the minimum refresh rate and the next frame isn't available when the first scan-out finishes, the same frame is displayed again for the minimum time possible (6.94 ms on a 144 Hz panel). That minimizes how long a newly finished frame might have to wait, but it still poses the problem of what to do now that your frame rate and monitor are out of sync. Which again defeats the whole purpose.

You could, however, do that with known-frame-rate material like video recorded at 24 FPS. Since you know that every single frame of the source is meant to be on screen for 41.667 ms, you can just display it for two 48 Hz scans (20.833 ms each).
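The fixed-cadence arithmetic above is easy to verify; a quick sketch (the variable names are just for illustration):

```python
# 24 FPS film has a known, constant frame time, so it maps cleanly
# onto a doubled refresh rate: two panel scans per source frame.
FILM_FPS = 24
frame_ms = 1000 / FILM_FPS   # 41.667 ms each source frame stays on screen
scan_hz = FILM_FPS * 2       # run the panel at a fixed 48 Hz
scan_ms = 1000 / scan_hz     # 20.833 ms per scan, two scans per frame

print(round(frame_ms, 3), scan_hz, round(scan_ms, 3))
```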
 

I see what you mean, but still, you can sync right after the FPS jumps back to 60.

Here's an example where Frame 1 takes longer than 1/30 s to render, and Frame 2 is within the refresh window (< 1/30 s).
Frame 1 is displayed twice on the screen, while Frame 2 will only be displayed once.
|-----Frame 1--------------------------------------|-----Frame 2-------------|
|-----Sync 1--------------|-----Sync 2-----------|-----Sync 3---------------| etc...

Or maybe I didn't quite understand your explanation?
 
I want to re-explain my solution in simpler terms. It's like telling the GPU to fake the FPS count: when the FPS drops to 18, it tells the Adaptive-Sync monitor that it's rendering at 36 FPS, and just outputs each rendered frame twice.
 

That only happens when the monitor's refresh rate is fixed.

Quick example

Vsync off -- causes tearing
|----------Frame 1-----------|--Frame 2--|--Frame 3-----|
|--Sync 1--|--Sync 2|--Sync 3--|--Sync 4--|--Sync 5---|

Vsync on -- Added time causes stutter and input lag
|----------Frame 1----------->>>|--Frame 2-->>>>>>|--Frame 3----->>>>|
|--Sync 1--|--Sync 2|--Sync 3--|--Sync 4--|--Sync 5---|--Sync 6--|--Sync 7--|