As the title asks, I simply can't understand why all Freesync monitors have a minimum refresh floor around 30 Hz. Anything below that causes tearing unless vsync is enabled.
I know that the controllers can't refresh below 30 Hz due to the nature of LCDs, but displays in the past also sidestepped this problem by showing a frame multiple times. CRTs and Plasma screens would refresh at very high rates (600+ Hz).
I think you know what I'm getting at now.
A really (I mean really) simple solution to the minimum refresh rate of most controllers is to display each frame two (or more) times, no? (I give an example in the following paragraphs.)
Say, if the FPS dips below 30 and is now 18, couldn't they make the controller refresh the screen at 18 x 2 = 36 Hz and display each frame 2 times?
Below 15 FPS, doubling no longer reaches 30 Hz, so they can multiply by 3 (e.g. 12 x 3 = 36 Hz, each frame shown 3 times); below 10 FPS, multiply by 4 (8 x 4 = 32 Hz, each frame shown 4 times). Etc.
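To make the idea concrete, here's a minimal sketch of the multiplier-selection logic I'm describing. This is purely illustrative: `frame_multiplier`, `min_hz`, and `max_hz` are hypothetical names, not any actual driver API, and I'm assuming a typical 30–144 Hz panel range.

```python
import math

def frame_multiplier(fps, min_hz=30, max_hz=144):
    """Pick the smallest integer k so that scanning each frame out k times
    (i.e. refreshing at fps * k Hz) lands inside the panel's supported
    refresh range [min_hz, max_hz]. Hypothetical sketch, not a real API."""
    if fps >= min_hz:
        return 1  # frame rate is already inside the adaptive-sync range
    k = math.ceil(min_hz / fps)  # smallest k with fps * k >= min_hz
    if fps * k > max_hz:
        raise ValueError("frame rate too low for this refresh range")
    return k

# Examples from the post:
print(frame_multiplier(18))  # 2  -> 18 x 2 = 36 Hz
print(frame_multiplier(12))  # 3  -> 12 x 3 = 36 Hz
print(frame_multiplier(8))   # 4  ->  8 x 4 = 32 Hz
```

The controller would then simply repeat each frame `k` times while the refresh rate stays in range, so the panel never has to hold a frame longer than it physically can.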
This doesn't even need to be done in hardware; I bet a small driver patch could add this feature.
I know that most will argue that anything below 30 FPS isn't playable, but that's not the point here. Frame rates above 15 FPS are definitely playable without stutter. What most people consider unplayable is the input lag caused by low frame rates or VSync.
Can someone add to this, or clarify what's wrong in my reasoning, if that's the case?
Note: I really want this thread to be on point, and not be a discussion about Gsync vs Freesync, or some other **** about fanboys.