AMD Radeon FreeSync 2: HDR And Low Latency

Guest
If you are on a monitor with 144+ Hz, FreeSync and G-Sync are useless.
 
Three questions:

1. Did I misread something, or is there anything in there that addresses where FreeSync is now with regard to the low end of the refresh range?

2. With all this coordination required between monitor manufacturers, AMD, and game developers (mostly the manufacturers) plus the increased panel specs, will FreeSync continue to be "free"? It would seem that improving panel technology to address the described issues would have a cost associated with it.

3. Finally, any sign of MBR (motion blur reduction) coming to FreeSync? This is the big separation between the two technologies in both performance and cost. With the top cards shooting for 60 fps at 2160p, performance at 1440p bottoms out at 80 fps in all but a few games, where many find using ULMB to be a better experience than G-Sync. Right now, it's up to the monitor manufacturers whether or not to add the necessary hardware for this feature, resulting in a hodgepodge of different-quality solutions. A FreeSync 3 where MBR is provided would truly allow AMD to go head to head with nVidia on this feature, but I think we would have to say goodbye to the "Free" part.
 


You would benefit from some reading as this is certainly not the case.
http://www.tftcentral.co.uk/articles/variable_refresh.htm

1. You are playing Witcher 3 at 62-70 fps on your 1070; why wouldn't you benefit from G-Sync?

When the frame rate of the game and refresh rate of the monitor are different (with VSync OFF), things become unsynchronised. This lack of synchronisation, coupled with the nature of monitor refreshes (typically from top to bottom), causes the monitor to display a different frame towards the top of the screen vs. the bottom. This results in distinctive ‘tearing’ on the monitor that really bothers some users. Even on a 120Hz or 144Hz monitor, where some users incorrectly claim that there is no tearing, the tearing is still there. It is generally less noticeable but it is definitely still there. Tearing can become particularly noticeable during faster horizontal motion (e.g. turning, panning, strafing), especially at lower refresh rates.
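A quick toy model of that mechanism, for the curious (a sketch only; it assumes instantaneous buffer flips and a uniform top-to-bottom scanout, with illustrative numbers):

```python
# Toy model of tearing with VSync OFF: the monitor scans out top-to-bottom
# every 1/Hz seconds while the GPU flips to a new frame every 1/fps seconds.
# A flip that lands mid-scanout splits the screen at a "tear line".

def tear_lines(fps: float, hz: float, seconds: float = 0.1):
    """Fractional screen heights (0 = top, 1 = bottom) where tears appear."""
    refresh = 1.0 / hz
    frame = 1.0 / fps
    tears = []
    t = frame
    while t < seconds:
        # How far down the current scanout the buffer flip lands.
        tears.append(round((t % refresh) / refresh, 3))
        t += frame
    return tears

# 70 fps on a 144 Hz panel still tears; each tear is just on screen for less
# time than on a 60 Hz panel, which is why it is less noticeable, not gone.
print(tear_lines(70, 144))
print(tear_lines(70, 60))
```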

The solution to this tearing problem for many years has been the ‘VSync ON’ option, which essentially forces the GPU to hold a frame until the monitor is ready to display it, i.e. until it has finished displaying the previous frame. It also locks the frame rate to a maximum equal to the monitor’s refresh rate. Whilst this eliminates tearing, it also increases lag, as there is an inherent delay before frames are sent to the monitor. On a 120Hz monitor the lag penalty is half that of a 60Hz monitor, and on a 144Hz monitor it is even lower. It is still there, though, and some users feel it disconnects them from game play somewhat. When the frame rate drops below the refresh rate of the monitor, this disconnected feeling increases to a level that will bother a large number of users. Some frames will be processed by the GPU more slowly than the monitor is able to display them. In other words, the monitor is ready to move onto a new frame before the GPU is ready to send it. So instead of displaying a new frame, the monitor displays the previous frame again, resulting in stutter. Stuttering can be a major problem when using the VSync ON option to reduce tearing.
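The "half the lag penalty" claim is easy to sanity-check with back-of-the-envelope arithmetic (worst case only: a finished frame waits at most one full refresh interval):

```python
# Worst-case extra wait under VSync ON: a completed frame can sit in the
# buffer for up to one full refresh interval before scanout begins.
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz: up to {1000 / hz:.1f} ms of added lag")
# 60 Hz: 16.7 ms; 120 Hz: 8.3 ms (half); 144 Hz: 6.9 ms.
```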

During VSync ON operation, there can also sometimes be a sudden slowdown in frame rates when the GPU has to work harder. This creates situations where the frame rate suddenly halves, such as 60 frames per second slowing down to 30 frames per second. During VSync ON, if your graphics card is not running flat-out, these frame rate transitions can be very jarring. These sudden changes in frame rate create sudden changes in lag, and this can disrupt game play, especially in first-person shooters.
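That sudden halving falls out of a simple double-buffered VSync model: a frame that misses a refresh waits for the next one, so the displayed rate quantizes to refresh / n for a whole number n. A sketch (an idealization; real games miss refreshes unevenly):

```python
import math

def displayed_fps(render_fps: float, hz: float) -> float:
    """Displayed rate under idealized double-buffered VSync."""
    # Each frame occupies a whole number of refresh intervals.
    return hz / math.ceil(hz / render_fps)

print(displayed_fps(59, 60))   # 30.0 -- just missing 60 fps halves the rate
print(displayed_fps(59, 144))  # 48.0 -- higher refresh gives smaller steps
```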

[With G-Sync / FreeSync] By doing this, the monitor refresh rate is perfectly synchronised with the GPU. You don’t get the screen tearing or visual latency of having VSync disabled, nor do you get the stuttering or input lag associated with using VSync. You get the benefit of higher frame rates from VSync OFF, but without the tearing, and without the lag and stuttering caused if you switch to VSync ON.

2. You are playing Witcher 3 at 80-95 fps on your 1080; why wouldn't you benefit from switching from G-Sync to ULMB?

G-sync modules also support a native blur reduction mode dubbed ULMB (Ultra Low Motion Blur). This allows the user to opt for a strobe backlight system if they want, in order to reduce perceived motion blur in gaming.

Suggest reading the whole article for an accurate description of what G-Sync / FreeSync actually does.
 
Guest
As I said, FreeSync and G-Sync are useless if you have a 144+ Hz monitor. I have SLI 1080s and G-Sync makes no difference. As monitors get better and we get to see 4K at 144 Hz, these technologies will be obsolete.

In other words, I play games at a 144 Hz refresh rate on Ultra settings at 1440p with VSync OFF, and gaming is just perfect with no screen tearing.

With monitors with low refresh rates, FreeSync and G-Sync do help; otherwise they are a complete waste. I am still trying to understand what the actual purpose of FreeSync and G-Sync is, as I have never seen any benefit from them.

People can write whatever they want; there are many other factors when it comes to gameplay.

The current games I play: Deus Ex: Mankind Divided, Call of Duty: Infinite Warfare, Far Cry Primal, Rise of the Tomb Raider. Since I have SLI 1080s, I run games on Windows 7, as Windows 10 is a horrible platform for gaming in general.
 

dstarr3


People with high-Hz monitors are the people who need Free/G-Sync the most. If you have a 60 Hz monitor, holding a steady 60 FPS is a pretty easy task for most computers, so refresh sync isn't so useful, because there aren't going to be a lot of fluctuations below that 60 Hz. However, holding a steady 144 FPS is a lot more challenging for computers. That's 2.4 times the FPS. And framerate drops below that threshold are going to be a lot more frequent on even the most robust computers. So, absolutely, refresh sync is going to be very useful if you need to crank out that many frames per second, every single second.
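The per-frame time budget makes the point concrete (simple arithmetic, nothing more):

```python
# Time budget per frame: holding 144 fps leaves far less headroom than 60 fps.
for fps in (60, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")
# 60 fps -> 16.67 ms; 144 fps -> 6.94 ms. That's 2.4x the frames per second,
# so dips below the refresh rate are far more likely on a 144 Hz panel.
```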
 


1. Blame your eyes and/or a lack of understanding of the technology. If you are running SLI'd 1080s at 144 Hz, you are not running 4K, which won't be capable of 144 Hz until DisplayPort 1.4 arrives. Again, an understanding of what G-Sync offers would have you using ULMB instead of G-Sync with twin 1080s. You might want to try using the technology as recommended and switch to ULMB above 75 fps.

It should be noted that the real benefits of variable refresh rate technologies really come into play when viewing lower frame rate content, around 40 - 75fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of FreeSync (and G-sync) are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using FreeSync decrease, and it may instead be better to use a Blur Reduction feature if it is provided. On FreeSync screens this is not an integrated feature however, so would need to be provided separately by the display manufacturer.

And just because you have 144 Hz, and the GFX card performance to keep games above the G-Sync range, doesn't make that statement true. Many people out there have 144/165 Hz monitors but not twin 1080s, and they are operating in the 30-75 fps range in the more demanding games... and here, having G-Sync clearly does make a difference.

2. Now if you want to talk about "makes little difference", we can talk about SLI on the 10xx platform. SLI'd 970s were a proverbial "no brainer" over the 1080, with an average scaling of 70% at 1080p in TPU's gaming test suite and 96 to >100% in the most demanding games. For whatever reason, scaling on the 10xx series is now 18% at 1080p and 33% at 1440p (see the quick scaling arithmetic after this list). Several reasons have been put forth as to why:

a) Devs choosing their priorities, optimizing games for DX12 before spending time on SLI enhancement.

b) CPU performance improvements over the last 5 generations amount to about 1/3 of the performance improvement from the 10xx series alone.

c) With no competition from AMD against their SLI-capable cards, improving SLI performance would accomplish only one thing... increased sales of 1070s and a corresponding decrease in 1080 sales, the latter being where nVidia makes the most money.
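For anyone wanting to check the scaling figures themselves, the arithmetic is just (SLI fps / single-card fps - 1) x 100. A sketch using the rounded 106 and 182 fps figures quoted later in this thread (illustrative numbers, not a benchmark):

```python
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """Percent scaling from adding a second card."""
    return (sli_fps / single_fps - 1) * 100

# BF4 at 1440p, one 1080 (~106 fps) vs. 1080 SLI (~182 fps):
print(f"{sli_scaling(106, 182):.0f}% scaling")  # ~72%
```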

I think you have to actually use the technology "as intended and recommended" before you complain that it has no impact. Again, G-Sync's impact is between 30 and 75 fps; FreeSync's impact is between 40 and 75 fps. Once past that threshold, the impact of this technology starts to wane... this is not a secret, and it in no way makes the technology obsolete. If we ignore MBR technology, G-Sync greatly improves the experience at 1080p w/ a $200 GFX card, and at 1440p w/ a $400 GFX card. Yes, you can certainly throw $1,200 at the problem to get a satisfactory experience at 1440p w/o using G-Sync, but a 1070 w/ G-Sync leaves $800 in your pocket and works great. If you "overbuilt" anticipating what games might be like 3 years from now, then nVidia gives you the opportunity to switch to ULMB, having given you the necessary hardware module as part of the "G-Sync package".

144 Hz allows use of the 100 Hz ULMB setting
165 Hz allows use of the 120 Hz ULMB setting

My son is driving a 1440p XB270HU with twin 970s... he uses ULMB in most games but switches to G-Sync when the game can't maintain >70 fps. When I visited him, I played (65-75 fps) using both ULMB and G-Sync and found the experience extraordinary either way.
 
Guest


As I said, it makes no difference. Investing money into an expensive 144 Hz monitor and having a $150+ card is rather stupid. As far as SLI scaling on the 1080 goes, what you said is just incorrect. SLI in DX12 does not work, as DX12 is a pile of crap, overhyped by MS and in fact doing nothing. I do gaming on Windows 7 because SLI scales there. I don't know what tests you were reading, since everyone is using Win10 to test SLI, but in Windows 7, SLI on the 1080 scales ~70%, in some games 90%, like Battlefield. Newer titles like Watch Dogs 2 scale over 70%.

Windows 10 is broken at so many levels to start with, but that is for some other discussion. My experience with VSync off and G-Sync off is extraordinary as well. As I said, it makes no difference. In fact, the fewer frames you push, the less sense it makes, as tearing does not happen on something running at 40 FPS where the monitor refresh rate is 144 Hz. You can say whatever you want... I am telling you my experience. You get more benefit from running a game on an NVMe M.2 drive with 2,000 MB/s reads than from FreeSync and G-Sync.

My second rig has Crossfire R9 390Xs, and I really thought DX12 would benefit that setup under Windows 10, but not much... not worth going through the hassles and bugs of Win10. Again, FreeSync on that one is just useless. I tried it and got worse FPS.

You remind me of those people who wanted to argue with me that Windows 7 cannot boot from an NVMe Samsung 950 Pro drive under UEFI and that it doesn't support 10-core CPUs.

The first thing I did after getting a 10-core CPU and 1080 SLI, along with the NVMe Samsung 950 Pro, was to install Windows 7 on the M.2 drive and boot from it. The funny thing is that even the argument of Win10 booting faster than Win7 is out of the way...

Don't believe everything you read.
 

computerguy72
Wow, Freak777 is oh so ill-informed; it's kinda embarrassing to see. There is so much technically wrong with that post, and I'm glad some have posted to address it. Tech like FreeSync helps in numerous ways, *even* when your GPU is maintaining a very high frame rate. Add in FreeSync 2, and scenes with bright brights and dark areas together become feasible with little or no degradation.
 
Guest


I am very well informed and really know this stuff. As I said, FreeSync and G-Sync are absolutely useless; I found no use for them, and they made no difference for me on SLI Nvidia 1080s and Crossfire R9 390Xs. So I still wonder: what is all that about? In fact, I got a FreeSync Asus 144 Hz monitor for my Nvidia SLI setup, since G-Sync is just a major rip-off, doing nothing in my experience and costing much more.

What did you mean by bright brights and dark areas? HDR?

You know what is embarrassing? When a number of people claim one thing by strictly reading into what some company says, and here I am disproving it in a couple of hours, like booting Windows 7 from NVMe M.2 with booting as fast as Windows 10. People don't like those things and give negative comments, because the entire bull s. about Win10 booting faster than Windows 7 is dead in the water. I am just giving an example here.

 

yyk71200


Do you know that Nvidia does not support FreeSync?
 

dstarr3


You might want to consider the possibility that you are absolutely wrong about everything you think you know about this technology.
 
Guest


I know. Why should I pay more for the G-Sync version of the same monitor, which is much more expensive, when I won't be using the G-Sync feature?
 


Repeating it doesn't make it real. 1) You are not even using the technology as recommended. 2) You don't seem to have an understanding of which to use in which situation and, as a result, your arguments are just plain wrong. 3) You don't seem to have an awareness of ULMB and never even turned it on, so how can you speak from "experience"?

Don't believe everything you read.

I don't... and just because you repeated it 3 times while offering no documentation or support doesn't make it any more true.

Investing money into an expensive 144 Hz monitor and having a $150+ card is rather stupid.

No mention was made of $150 cards; and 144 Hz is outdated tech and not all that expensive... I have a 3-year-old 144 Hz Asus here in the office that cost me $209... and I am running it with MBR.

I don't know what tests you were reading

Again, we see the source of the problem... not reading; the source was stated in the post.

18% @ 1080p across a 17-game test suite
33% @ 1440p across the same 17-game test suite
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/20.html

I don't know what tests you were reading, since everyone is using Win10 to test SLI

We don't use Win 10; we have yet to do a Win 10 build for anyone.

In Windows 7, SLI on the 1080 scales ~70%, in some games 90%, like Battlefield

Yes, I said that... but it's games like Tomb Raider that actually get into the 90s... on 9xx-series cards, not 10xx.

Battlefield 4 = 71%
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/7.html
Battlefield 3 = 32%
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/6.html

But this statement only reinforces the failure to use the technology correctly. G-Sync and FreeSync are intended for games running at 40 fps (30 for nVidia) to 75 fps. Now read that again. Now let's look at what you are doing:

BF3 w/ 1080 SLI = 198.5 fps... G-Sync / FreeSync has no applicability here. This is where, if you have nVidia G-Sync, you turn off G-Sync and use ULMB via the hardware module installed in the monitor. Read that again and try to get an understanding of the technology involved.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/6.html

BF4 w/ 1080 SLI = 182.0 fps... same story: G-Sync / FreeSync has no applicability here either; turn off G-Sync and use ULMB.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/7.html

You paid a decent cost premium for that G-Sync monitor, more than you would for FreeSync, because G-Sync addresses the full spectrum of potential experience: the 30-75 fps range with G-Sync and >75 fps with ULMB. That cost premium got you an expensive hardware module that you have never turned on.

The ranting is akin to screaming at the car salesman that you got stuck in the snow with your Jeep Rubicon and that the 4WD technology "makes no difference". In order for it to make a difference, you have to put it in the appropriate mode for the terrain in question. Regular 4WD is great for paved roads or light snow, but when you are in deep snow and "stuck", you have one wheel spinning and the other three doing squat. Until you put it in "low locked hub" mode, that will continue to happen. Put it in the correct mode and all four wheels will spin and you will get out; once out and back on more drivable ground, you take it out of "low locked hub" mode and put it back in regular 4WD (really AWD) mode. Don't blame the manufacturer because you elected not to use the available capabilities of the technology.

Similarly, use the appropriate monitor technology for the situation. I again suggest that you go back and do more reading, specifically the linked TFT Central article, to learn when and how to use the technology you have in front of you. Going back to SLI: if Battlefield is your yardstick, I can hardly see the value of the investment. I just don't see the value of spending $600 to go from 106 fps to 182 fps... especially when the monitor can't reach that number, and with one card I have the option of running that 144 Hz monitor at 100 Hz in ULMB mode. I can do that with one card just as well as with two.
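To sum up the advice in this post as a rule of thumb (a sketch only; the 30/75 fps thresholds are the ones used above, and a FreeSync setup would substitute 40 fps and the manufacturer's own blur-reduction mode):

```python
def pick_mode(avg_fps: float, floor: int = 30, ceiling: int = 75) -> str:
    """Which monitor technology to enable at a given average frame rate."""
    if avg_fps < floor:
        return "lower your settings (below the variable-refresh window)"
    if avg_fps <= ceiling:
        return "G-Sync (variable refresh)"
    return "ULMB (strobed backlight; G-Sync off)"

for fps in (25, 60, 120):
    print(fps, "->", pick_mode(fps))
```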
 
Guest


Whatever. You guys keep paying extra for G-Sync monitors, and use G-Sync or FreeSync with AMD cards. I know what I know, and I don't care about it.
 


Personally, I think it was a poor choice by AMD to call their variable refresh tech "FreeSync", but they chose the word "free" to mock Nvidia's solution at the time. It was "free" because there was no need to buy a new monitor (just upgrade your monitor's firmware) and no need to buy a new graphics card, whereas with G-Sync you needed Nvidia's latest architecture at the time (Kepler), while AMD claimed they had the necessary tech for FreeSync inside their GPUs for three generations already.

But in the end that "firmware upgrade" never became a reality, and from what we saw later on, panel makers still needed to produce new scalers compatible with the new VESA Adaptive-Sync spec. And to use FreeSync in games you still need AMD GCN 1.1 and above. What happened to that three-generation claim from early on?

 
Guest


You had bad 1080 scaling because your CPU is choking it. Move to 4K and you will see SLI scaling go up to ~90%.
Every game is a reason for SLI 1080s, not only Battlefield. Again, I am not planning to go into a deep discussion here. Tomb Raider on 1080 SLI here hits 100 FPS.

I have SLI 1080s and I don't care about ULMB and G-Sync, because I have a perfect gaming experience on, guess what, a FreeSync Asus 27" 144 Hz screen. The reason I bought the FreeSync version is that it was cheaper; in other words, why should I pay extra for something I will never use? I know what ULMB is; I tried it and don't care about it. It is supposed to give you CRT-like motion clarity but reduces brightness... in my book, useless. I don't see any problem with motion clarity on the current monitor. Btw, everything below 60 FPS is unplayable in my book, but that's just me.
 

uglyduckling81
freak777power said:
- "I am very well informed and really know the thing. As I said Free Sync and GSync is absolutely useless and found no use of it and it made no difference for me"
- "In fact i got free sync Asus Monitor 144Hz for my Nvidia SLI setup since GSync is just a major rip off doing nothing in my experience and costing much more."

I'm sorry, but I laughed pretty hard when I read these comments in the same paragraph.
FreeSync doesn't work at all with Nvidia GPUs, bro. That's why you didn't notice anything in-game.
Not trying to be rude, but you really should go and read about what you're talking about before saying these silly things.
Thanks for the laugh, though; that was a fun read.
 
Guest


I know that the FreeSync feature on a FreeSync monitor does not work with Nvidia. Sorry, there is nothing to laugh at, but you lack the IQ to understand it. I tested AMD R9 390X Crossfire with FreeSync and an Nvidia SLI setup with a G-Sync monitor. I returned the G-Sync monitor and got a FreeSync monitor because it was cheaper. So I have two FreeSync monitors: one for the R9 390X Crossfire setup, where I don't use it anyway, and the other for the Nvidia SLI setup, where it cannot be used.

R9 390X Crossfire with FreeSync enabled in a game like Tomb Raider is just a f. horrible experience.
 

MrBonk
"But AMD knows it has a mindshare battle to fight against a competitor notorious for tightly controlled, proprietary IP"

Or because it's a superior product?

Personally, give me a monitor that can sync 20-60 Hz without issues or extra input lag and I'm a happy camper.
 

tanjo
@FREAK777POWER
Setting ULMB aside, G-Sync/FreeSync isn't needed if your GPU can maintain an FPS greater than your display's refresh rate, which is pretty much *YOUR* situation. Them being useless is just your subjective opinion. In case you're wondering: if your FPS drops below your display's refresh rate (like 72-143 FPS on a 144 Hz panel), it will be displayed at half your refresh rate (72 Hz) due to the unsynced timings of the GPU and the monitor. The GPU keeps churning out max frames, but the monitor only changes what is displayed every other refresh.

Also, if the monitor you bought is the ASUS MG279Q, its FreeSync range only goes up to 90 Hz, so you're stuck there if you activated it.

@UGLYDUCKLING81
If F7P likes that monitor, then there's nothing wrong with buying the cheaper version (FreeSync vs. G-Sync), since the sync feature is not the reason for buying it. If there were a version with no FreeSync/G-Sync (which is unlikely), that's probably the one he'd buy.
 