AMD's FreeSync Finally Available; New Tech Details, Too

I was under the impression G-Sync completely replaced having to use V-Sync, because the screen matches the frame rate of the software. If that's the case, why is there an option to have V-Sync on and off with FreeSync . . . is that just used outside the monitor's sync speed range or something?
 

Aver88

The reviews of FreeSync monitors already available on the Internet suggest it is as good as G-Sync.
 

Robert Ostrowski

V-Sync is for when your GPU is putting out new frames faster than your monitor can display them, e.g. if a monitor has a 60 Hz refresh and your GPU is putting out 80 fps. FreeSync allows your monitor to adjust to match a slower frame rate.
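A quick back-of-the-envelope sketch (illustrative Python with made-up numbers, not any real driver API) shows why that 80 fps / 60 Hz combination tears with V-Sync off:

    # At 80 fps the GPU finishes a frame every 12.5 ms, but a 60 Hz panel
    # only starts a new scanout every ~16.7 ms, so most frames arrive
    # mid-scanout and get spliced in: a visible tear line.
    refresh_hz, render_fps = 60, 80        # illustrative numbers
    scanout_ms = 1000 / refresh_hz         # ~16.7 ms per panel refresh
    frame_ms = 1000 / render_fps           # 12.5 ms per rendered frame

    for n in range(1, 5):
        t = n * frame_ms                      # moment frame n is ready
        pos = (t % scanout_ms) / scanout_ms   # how far down the screen it lands
        print(f"frame {n}: ready at {t:.1f} ms, {pos:.0%} into the current scanout")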
 

Larry Litmanen

Man, this PC gaming is getting crazy expensive. And to make things worse, you spend all that money but games are still lame. Trine 3 is the only game I am looking forward to in 2015.
 


G-Sync does pretty much the exact same thing; it essentially works like V-Sync at the monitor's frame rate cap... So FreeSync just adds the option to behave as if V-Sync were off at the cap and let it tear instead?
 

Bondfc11

No one really knows how well FreeSync works yet; no detailed review of a wide-range monitor exists as of this morning. That LG 48-75 Hz review is almost useless, not only because of the narrow range, but also because there is no verification of how FreeSync is actually working, how much lag there is, etc.

It will take a few weeks before we see really good, in-depth, independent lab testing of how FreeSync works on a true 120-144 Hz gaming panel. How will it do with a 30-144 Hz range? We don't know yet, but we will see!
 

N.Broekhuijsen


G-Sync does pretty much the exact same thing; it essentially works like V-Sync at the monitor's frame rate cap... So FreeSync just adds the option to behave as if V-Sync were off at the cap and let it tear instead?

Exactly. Within the monitor's specified supported refresh rate range, FreeSync adjusts on a per-frame basis. When the frame rate goes outside that range, you can opt to use V-Sync to avoid tearing, or disable V-Sync for slightly lower latency. The latter is only really intended for competitive gamers, though.
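For anyone who wants that in-range/out-of-range behavior spelled out, here is a minimal sketch in Python. The 40-144 Hz window, the function name, and the vsync_fallback flag are all made up for illustration; this is not AMD's or Nvidia's actual implementation:

    # Hypothetical VRR window; real monitors vary (e.g. that LG panel's 48-75 Hz).
    MIN_HZ, MAX_HZ = 40, 144
    MIN_INTERVAL = 1.0 / MAX_HZ   # fastest legal time between refreshes
    MAX_INTERVAL = 1.0 / MIN_HZ   # slowest legal time between refreshes

    def next_refresh(frame_ready, last_refresh, vsync_fallback=True):
        """Return when the panel refreshes with the newly finished frame."""
        elapsed = frame_ready - last_refresh
        if MIN_INTERVAL <= elapsed <= MAX_INTERVAL:
            # Inside the supported range: refresh the moment the frame is ready.
            return frame_ready
        if elapsed < MIN_INTERVAL:
            # Frames are coming faster than the panel's max refresh rate.
            if vsync_fallback:
                return last_refresh + MIN_INTERVAL  # V-Sync on: wait, no tearing
            return frame_ready                      # V-Sync off: tear, lower latency
        # Frame took too long: the panel must refresh itself with the old frame.
        return last_refresh + MAX_INTERVAL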
 


Alright, that's clear. Do both still work only in full-screen mode?


 

Larry Litmanen

V-Sync is for when your GPU is putting out new frames faster than your monitor can display them, e.g. if a monitor has a 60 Hz refresh and your GPU is putting out 80 fps. FreeSync allows your monitor to adjust to match a slower frame rate.

So we need a more expensive card to solve a problem that was created by cards becoming too powerful?

Can't you just turn the settings up to slow down the frame rate, or just get a cheaper card than you have, lol?
 
Does this work with any monitor or just certain ones? If it only works with a limited number of monitors, well... useless.

It only works on certain combinations of video cards and monitors (i.e., both the video card and the monitor must support FreeSync/G-Sync). Also, G-Sync and FreeSync are competing standards and aren't compatible... so it's kind of a mess right now. Kind of like Blu-ray vs. HD DVD: it's probably best to stay on the sidelines until a winner is declared.
 

Fokissed

I was under the impression G-Sync completely replaced having to use V-Sync, because the screen matches the frame rate of the software. If that's the case, why is there an option to have V-Sync on and off with FreeSync . . . is that just used outside the monitor's sync speed range or something?
There are three things that can happen when the FPS is higher than the refresh rate:

1. The latest frame can "tear" into the currently displayed frame (normal V-Sync-off operation).
2. The latest frame can be displayed immediately after the last displayed frame has finished showing on the screen (keeping the monitor refreshing at max speed).
3. The latest frame can be skipped in favor of the next frame (which is drawn as soon as it is finished).

Option 3 will effectively run at a lower-than-max refresh rate, but frames are displayed as soon as they are finished, whereas option 2 displays a slightly stale frame.
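A toy simulation makes the trade-off between options 2 and 3 concrete. It assumes a 60 Hz panel and a GPU rendering a steady 90 fps; the policy names and the whole setup are invented for illustration:

    REFRESH_HZ, RENDER_FPS = 60, 90   # panel refresh rate, GPU frame rate

    def displayed_frames(policy, n_refreshes=6):
        shown = []
        for k in range(1, n_refreshes + 1):
            # Frames the GPU has completed by the k-th refresh (integer math).
            finished = (RENDER_FPS * k) // REFRESH_HZ
            if policy == "queue":        # option 2: oldest not-yet-shown frame
                nxt = shown[-1] + 1 if shown else 1
                shown.append(min(nxt, finished))
            else:                        # option 3: newest finished frame, skipping stale ones
                shown.append(finished)
        return shown

    print(displayed_frames("queue"))  # [1, 2, 3, 4, 5, 6] -> smooth, but a frame behind
    print(displayed_frames("skip"))   # [1, 3, 4, 6, 7, 9] -> freshest, frames 2, 5, 8 never shown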
 


In order to skip the frame, wouldn't you have to have a buffer that lets you start working on the third frame while waiting on the monitor? I thought the whole point of this was to eliminate double and triple buffering, so that the frame on the screen is always the most recent frame rather than a frame behind what the game is doing.

The more I read about it, the more I realize it might not even be the solution for me in either case. I mostly play borderless fullscreen in order to keep other data on my second monitor and allow switching back and forth between things... it seems like neither of these is going to help that situation at all.
 

dsgffd



In order to skip the frame, wouldn't you have to have a buffer that lets you start working on the third frame while waiting on the monitor? I thought the whole point of this was to eliminate double and triple buffering, so that the frame on the screen is always the most recent frame rather than a frame behind what the game is doing.

The more I read about it, the more I realize it might not even be the solution for me in either case. I mostly play borderless fullscreen in order to keep other data on my second monitor and allow switching back and forth between things... it seems like neither of these is going to help that situation at all.

G-Sync limits the frame rate to your monitor's refresh rate and pushes each frame from the GPU to the monitor when it is ready; presumably FreeSync uses the same method.

Adaptive refresh rates are mostly useful when you can't get enough frames per second to match your monitor's refresh rate.
 


Thanks for that; it's a much better write-up, a sort of pre-review. It looks like I still need to wait for a monitor... I need roughly 24-96 Hz support, 24-25" at 1440p/1600p, with a small bezel. I can dream at least; I doubt it will ever come to be.

You'd think the film/movie industry would pick up on this, so they could produce their 24/30/48/60 fps content all they like without the TV having to do gimmicky things to it.




 

Fokissed



Normal rendering is double buffered. The front buffer is the one that the screen is currently displaying, while the back buffer is in the process of being rendered. As soon as the back buffer is finished, the buffers are flipped, usually causing tearing.

V-Sync has three modes:

1. Double buffered: once the back buffer is completed, the GPU waits until the front buffer is completely on the screen before flipping the buffers.
2. Triple buffered: like double buffering, except there are two back buffers, and the GPU only waits if both back buffers are finished.
3. Triple buffered*: the GPU renders continuously, alternating between the two back buffers, and the front buffer flips to whichever back buffer finished most recently.

Mode 1 has a lot of stutter, mode 2 has high input lag, and mode 3 has neither, but the GPU always works as fast as it can, potentially skipping a previously rendered frame because a newer one is available.

There will always have to be at least two buffers, because displaying a buffer while it's being rendered is ugly and flickery: the frame could be caught at any point in the rendering process, e.g. clearing the buffer, rendering objects in Z-order, post-processing...
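To make that third mode concrete, here is a bare-bones sketch of the idea, with invented class and method names rather than any real graphics API: the GPU ping-pongs between two back buffers and never waits, while each vblank presents whichever back buffer finished most recently:

    class TripleBufferStar:
        """Sketch of V-Sync mode 3: the GPU never waits, the display always
        presents the newest finished frame, and stale frames get overwritten."""
        def __init__(self):
            self.buffers = [bytearray(1) for _ in range(3)]  # 1 front + 2 back
            self.front = 0           # index being scanned out by the panel
            self.rendering = 1       # index the GPU is drawing into
            self.latest_done = None  # index of the newest completed frame

        def frame_finished(self):
            # Publish the finished frame, then keep rendering into whichever
            # buffer is neither on screen nor just published. A published but
            # never-presented frame gets reclaimed here: that is the "skip".
            prev = self.rendering
            free = {0, 1, 2} - {self.front, prev}
            self.latest_done, self.rendering = prev, free.pop()

        def vblank(self):
            # At each refresh, flip to the newest completed frame, if any.
            if self.latest_done is not None:
                self.front, self.latest_done = self.latest_done, None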
 

PaulBags

It seems a few people don't understand the basic premise. I read up on it when I heard about G-Sync, and I imagine the same applies to FreeSync. Basically, up until G-Sync/FreeSync, monitors had a fixed clock rate, and your GPU could either break the cycle and get tearing (V-Sync off) or wait for the clock cycle to push a frame (V-Sync on). If the GPU has to be slave to the monitor's clock rate and can't quite keep up, the frame rate snaps down to the next whole divisor of the refresh rate: if the GPU is just under 60 fps, it'll output at 30; just under 30, and it drops to 20; and so on. That's why most gamers, especially budget gamers, play with V-Sync off.
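That divisor arithmetic is easy to check. Here is a little sketch assuming classic double-buffered V-Sync on a 60 Hz panel (the helper name is made up for illustration):

    import math

    def effective_vsync_fps(render_fps, refresh_hz=60):
        # A frame that misses a refresh waits for the next one, so the GPU
        # ends up delivering one frame every ceil(refresh/render) refreshes.
        intervals = math.ceil(refresh_hz / render_fps)
        return refresh_hz / intervals

    print(effective_vsync_fps(59))  # 30.0 -> just under 60 fps snaps down to 30
    print(effective_vsync_fps(29))  # 20.0 -> just under 30 snaps to 20
    print(effective_vsync_fps(19))  # 15.0 -> and just under 20 snaps to 15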

Now enter G-Sync/FreeSync. Basically, the monitor is now the slave to the GPU's clock, allowing tear-free gaming at any fps. Because traditional displays run on their own clock and don't know anything about being asked to adjust it, it takes a specially designed panel controller chip to let the monitor mirror the GPU's frame rate; that's why not just any monitor will work.

This is also why monitors are unlikely to support both: it would take a hybrid controller chip that could understand both the Nvidia and AMD standards.
 