AMD FreeSync Versus Nvidia G-Sync: Readers Choose



AMD most certainly did know about it... maybe the folks at the test didn't, but AMD had been corresponding with various review sites, promising a fix, for some time. I didn't want to quote the whole thing above, but since the article was rather long, perhaps peeps didn't want to scroll through it....


http://www.tftcentral.co.uk/reviews/content/benq_xl2730z.htm#freesync

From original review dated 24/4/15 [That's April 24, 2015 for folks on our side of the pond]

From a monitor point of view the use of FreeSync creates a problem at the moment on the XL2730Z. The issue is that the AMA setting does nothing when you connect the screen over DisplayPort to a FreeSync system. This applies whether you are actually using FreeSync or not; you don't even need to have the option ticked in the graphics card settings for the problem to occur. The setting appears to be in the off state, and changing it to High or Premium in the menu makes no difference to real-world response times or performance. As a result, response times are fairly slow at ~8.5ms G2G and there is a more noticeable blur to the moving image. See the more detailed response time tests in the previous sections for more information, but needless to say this is not the optimum AMA (response time) setting on this screen. For some reason, the combination of FreeSync support and this display disables the AMA function.

This only happens when you are using a FreeSync enabled graphics card, FreeSync capable drivers and the DisplayPort interface. If you switch to DVI or any other interface (which don't support the FreeSync feature) even from the same graphics card/driver then AMA behaves as it should again. If you use DisplayPort but revert to an older non-FreeSync enabled driver package then AMA works as it should. If you use a non-FreeSync supporting AMD card, or a card from NVIDIA/Intel then AMA functions as it should. It's only when all 3 things are combined that the problem seems to occur. Obviously if you eliminate one of them to make AMA work properly, you lose the advantage of FreeSync dynamic refresh rate control. The only exception is if you enable the Blur Reduction mode, where the AMA function then works properly regardless of your system configuration.

Having spoken to BenQ about it, the issue is a known bug which apparently affects all FreeSync monitors at the moment. The AMD FreeSync command disturbs the response time (AMA) function, causing it to switch off. It's something which will require an update from AMD to their driver behaviour, which they are currently working on. It will also require a firmware update for the screen itself to correct the problem. Both are being worked on and we will aim to update this review when it is fixed, hopefully within a couple of weeks. Assuming that fixes the issue, the performance when using a FreeSync system should be much better than now, as you can move from AMA Off to the better AMA High setting. At the moment, if you use the FreeSync function, or even just have a FreeSync enabled system in place, the response times are slower than they should be by a fair amount, and so you will experience a moderate amount of blur. If you need to, you can always switch to DVI or another interface other than DisplayPort to benefit from the AMA setting (but lose FreeSync).

It is unclear at the moment what would be required to update an existing XL2730Z model, and what would be required in terms of new firmware. We will update this review section when we know more.


Update 1/6/15 [That's June 01, 2015 for folks on our side of the pond]

BenQ have confirmed that the FreeSync/AMA issue has now been fixed. A driver update from AMD is already available and should be downloaded from their website. In addition BenQ will be releasing a firmware update for the monitor itself to fix this issue. Current stocks in distribution are being recalled and updated with retailers so future purchases should already carry this new firmware. This is expected to apply for stock purchased AFTER 1st July, as V002 firmware screens should be shipped by BenQ to distributors in late June.

For those who already have an XL2730Z, you can, if you want to, return it to BenQ for them to carry out the firmware update. This only applies if the user is experiencing issues with the performance of the screen. There is no simple way for the end user to update the firmware themselves and it is not encouraged. Users should contact BenQ support through their relevant country website for more information on how to return their screen for the update.

This only applies in Europe and we do not have any information about how this update will be handled in other countries unfortunately. We would suggest contacting BenQ support for additional help if you need more information, now that a V002 firmware is in circulation. You should be able to identify the firmware version you have by accessing the factory OSD menu (hold menu button while powering the screen on, then press menu). The Firmware version (F/W) should start with V001 or V002 and then a date. You are looking for V002 for the updated firmware.

So, in Europe, you were presumably OK after July 1st, at least with BenQ... in the Western Hemisphere, who knows.

Late response here, but this is a key point so it's worth responding to. What you're suggesting is that bias affected the results. It is absolutely NOT bias as long as the "reasons" the respondents guessed which system was which were related to what Tom's set out to test in the first place.

Here's a simple example: You're running a blind trial of a drug with one group taking the drug and another (control) group taking a placebo. The drug works so those in the treatment group get better and those in the control group don't. If you ask those people to guess which group they're in, a good percentage of them are likely to guess correctly. Does that mean their results are biased and should be removed from the sample? Of course not, they're basing their guess on the very thing you set out to test in the first place.
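To make that concrete, here is a minimal, hypothetical simulation (the group sizes and improvement rates are invented for illustration): participants who guess their group purely from whether they improved will mostly guess right, and that is the treatment effect itself, not bias.

```python
import random

random.seed(42)

def run_trial(n_per_group=50, treatment_improve_rate=0.8, placebo_improve_rate=0.2):
    """Hypothetical blind trial: the drug genuinely works, and each participant
    guesses their group purely from whether they improved (the measured outcome)."""
    correct = 0
    for group, improve_rate in (("treatment", treatment_improve_rate),
                                ("placebo", placebo_improve_rate)):
        for _ in range(n_per_group):
            improved = random.random() < improve_rate
            guess = "treatment" if improved else "placebo"
            correct += (guess == group)
    return correct / (2 * n_per_group)

print(f"Participants guessing their own group correctly: {run_trial():.0%}")
# In expectation about 80% guess right, driven entirely by the treatment effect.
# Throwing those people out of the sample would discard exactly the signal
# the trial was designed to measure.
```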

Given Nvidia's proprietary hardware and (generally speaking) higher cost of entry, if there is a noticeable difference between FreeSync and G-Sync, most people would be likely to guess that the better experience comes from G-Sync. So as long as the "reasons" people guessed the G-Sync/FreeSync rigs were based on their subjective gaming experience - which is the very thing the whole day was set up to test in the first place - then this is NOT bias and is exactly what we'd expect if G-Sync offers a noticeably better experience. The fact that the majority of people didn't think they knew which system was which actually suggests that FreeSync comes pretty close to matching the G-Sync experience.

Now we have the one person who was able to tell based on heat - which was obviously nothing to do with the experience/smoothness (or whatever we want to call it). This does open the door to bias. But for this to have significantly affected the results, there would have needed to be a decent number of Nvidia fans (say 4 or more, nearly 10% of attendees) figuring out which rig was which through some non-G-Sync/FreeSync-related tell, not discussing or raising the problem with anyone, and then voting with their green-tinted glasses on. I concede that this is technically possible, but it would be pretty poor form for a significant minority to take a great opportunity (which heaps of us would have loved to be part of!) and mess with the results like that. Plus... that many Nvidia fans picking up on a tell-tale issue without an AMD fan, TH staff or AMD rep noticing the problem? It's pretty implausible IMHO.

TL;DR -> If respondents guessed which system was which based on their subjective gaming experience, it's absolutely not bias.
 

jdwii

Splendid
Is it really about the fans at all? Who cares who makes what; all that matters is the product itself. If most monitors with FreeSync worked at all refresh rates then it would just be the same as G-Sync, but the thing is Nvidia knew they had to use hardware to actually make it right, which is why G-Sync costs more. Not only that, but when it comes to "gaming monitors" there is an added fee, because most people who buy a monitor aren't doing it for gaming.


So this raises a question: how come all the comments here seem to think Tom's Hardware performed a biased review when the people who tested the systems didn't know what they were running? Besides the Alex Jones tin-foil-hat nuts, what logical reason do they have to think this?
 
The thing is, how do you even compare the two FreeSync experiences?

G-Sync comes with ULMB; you can't strobe the monitor without hardware, which is where the cost comes in, and, obviously, both G-Sync monitors had the "hardware". OTOH...

The BenQ was equipped with Freesync + Motion Blur reduction hardware (strobing) provided by BenQ. The MBR technology **does** in fact come at a cost premium.
The Asus was equipped with Freesync but no Motion Blur reduction hardware

So, though the MB reduction tech in the BenQ has some issues, one would expect it to fare much better in a G-Sync comparison than the Asus model would. So the test had three monitors with Motion Blur Reduction and one without.

http://www.tftcentral.co.uk/reviews/asus_mg279q.htm

No matter how fast the refresh rate and pixel response times are, you cannot eliminate the perceived motion blur without other methods like a blur reduction backlight. Unfortunately that is not provided here on the MG279Q....

G-Sync = Frame Pacing + Ultra Low Motion Blur Hardware
Freesync = Frame Pacing ONLY.
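As a rough, hypothetical sketch of what "frame pacing" buys you (this is not how either vendor's hardware actually works; the fixed 60 Hz refresh and the frame times below are invented for illustration), here is the difference between snapping frames to a fixed refresh and refreshing when the frame is ready:

```python
# Rough sketch: fixed-refresh v-sync vs adaptive refresh ("frame pacing").
FIXED_REFRESH_MS = 1000 / 60            # 16.67 ms between refresh ticks
frame_times_ms = [14, 18, 15, 22, 16]   # hypothetical GPU render times per frame

def present_fixed_vsync(frame_times):
    """Double-buffered v-sync (simplified): a finished frame waits for the next
    60 Hz tick, and the following frame starts rendering after that tick."""
    now, presents = 0.0, []
    for ft in frame_times:
        done = now + ft
        ticks = -(-done // FIXED_REFRESH_MS)    # ceiling to the next refresh tick
        now = ticks * FIXED_REFRESH_MS
        presents.append(now)
    return presents

def present_adaptive(frame_times, min_interval_ms=1000 / 144):
    """Adaptive sync (simplified): the panel refreshes as soon as a frame is
    ready, no sooner than its minimum refresh interval."""
    now, presents = 0.0, []
    for ft in frame_times:
        ready = now + ft
        now = max(ready, (presents[-1] + min_interval_ms) if presents else ready)
        presents.append(now)
    return presents

print("fixed 60 Hz v-sync presents (ms):", [round(t, 1) for t in present_fixed_vsync(frame_times_ms)])
print("adaptive sync presents (ms):     ", [round(t, 1) for t in present_adaptive(frame_times_ms)])
# The fixed-refresh intervals bounce between 16.7 ms and 33.3 ms (visible stutter),
# while the adaptive intervals track the actual render times.
```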

Essentially, the automotive equivalent would be testing SUVs where

G-Sync = SUV with a dash switch to go from 2WD to 4WD when warranted by road conditions
FreeSync = SUV with 2WD only

Both do just about equally well on asphalt in good weather, but 4WD has the obvious advantage when you are in bad weather or off road.

- Nvidia's solution gives you G-Sync for frame rates from 30-70 fps and ULMB for 60+ fps, allowing you to switch between them to match conditions.
- AMD's solution gives you FreeSync, which is a great addition up to 70 fps or so, but beyond that it adds little. Asus did not provide any blur reduction technology, electing to spend their cost premium on IPS instead... BenQ went the other way and provided rudimentary blur reduction but no IPS. With each manufacturer coming up with their own means of blur reduction on FreeSync models, the quality of the technology and experience will vary.

Now it could be argued that the testers didn't enable ULMB in any of the tests, so it was a fair comparison. Again, using the automotive analogy, I'd argue that would be the same as testing those two SUVs (one 2WD and one 4WD) for all-terrain performance without ever enabling 4WD.
 

ramon zarat

Distinguished
Both AMD and Nvidia are taking us all for fools, guinea pigs and beta testers.

Those syncing technologies are immature, unreliable, costly, nonstandard and will soon give way to much more advanced and integrated solutions that are a REAL improvement instead of putting band-aids on bullet wounds. By then, when AMD and Nvidia drop support, your "G-Sync" or "FreeSync" monitor will not only be obsolete, but useless too!

Let's face it: DPI/PPI has stayed at the same depressingly low figure for decades, the refresh rate has been stuck at 60Hz forever, LCD panel color-to-color response time is still atrociously slow, and of course the contrast ratio is still very poor compared to plasma and especially OLED, and there is this huge issue of screen tearing, etc, etc, etc... the list of problems goes on and on.

How about a 500 DPI, 600Hz, sub-0.5ms color-to-color response, ultra-high-contrast OLED, no-tearing-no-matter-what PC monitor? Anything less is a pure waste of my time.
 
I must be living in an alternate plane of existence

1. I have not seen any documented evidence that syncing technologies are immature or unreliable. Yes, FreeSync had some birthing issues, but these were resolved as of June 1st. As for non-standard... isn't Maxwell non-standard? Isn't Tahiti non-standard? Broadwell? Skylake? Each vendor has a patented product which, by definition, is non-standard. Didn't Nehalem, triple-channel RAM, SATA, IDE, and AGP get "obsoleted" by newer technology? This is not unique; it's the normal progress of technology. We used to get TV over the airwaves, and to go on the internet I had to stick my phone in a cradle. We were charged by the minute, so we downloaded all of our e-mails / forum messages, logged off... then typed our responses, logged in again, sent everything back and logged off. What you are describing is how things have been since the 1980s.
2. In what world has ppi stayed the same? (See the quick calculation after this list.)

1080p on a 24" screen has a ppi of 91.8
1440p on a 27" screen has a ppi of 108.8
2160p on a 27" screen has a ppi of 163.2

3. I have not used a 60Hz monitor in a gaming build in over 4 years.

4. The human eye, for most folks with 20/20 vision, can distinguish individual pixels at about 96 dpi, and this is what Windows is actually based on in order to make screen text (ppi) readable and equivalent to paper text (dpi) at typical viewing distances.
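As a quick sanity check of the ppi figures in point 2 (a minimal sketch; the resolutions and panel sizes are just the examples quoted above):

```python
import math

# ppi = diagonal resolution in pixels / diagonal panel size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, d in ((1920, 1080, 24), (2560, 1440, 27), (3840, 2160, 27)):
    print(f'{w}x{h} on a {d}" panel: {ppi(w, h, d):.1f} ppi')
# -> 91.8, 108.8 and 163.2 ppi respectively, matching the figures above.
```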


 

Zinic

Reputable
Great experiment. Sounds like you learned a bit from conducting these events. Next time, make sure to send me an invite!
 

jdwii

Splendid


You must be living on the same planet as me :) Anyway, I love my XB270HU (G-Sync, IPS, 4ms, 1440p), and I made the jump from a TN, 6ms, 1080p 32-inch TV. WOW, the difference. Sure, it cost me $800, but to me it was worth every cent; I like it more than my PC.

It's not even just for gaming; even moving windows around feels so smooth vs 60Hz.
 

nikolajj

Honorable
It seems with these hardware combinations and even through the mess of the problems with the tests that NVidia's G-Sync provides an obviously better user experience. It's really unfortunate about the costs and other drawbacks (like only full screen).

G-Sync will now (or soon, if not already) work in windowed mode :) It will then refresh based on the window in focus.
I would really like to see this tech applied to larger panels like a TV. It would be interesting considering films being shot at 24, 29.97, 30, 48 and 60FPS from different sources all being able to be displayed at their native frame rate with no judder or frame pacing tools (like 120hz frame interpolation) that change the image.
 

nikolajj

Honorable
Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 which is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.

It seems that you know nothing.
While higher frame rates are better, they will never completely remove tearing, and in fast-paced games the difference is very noticeable, even on a 144Hz display.
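To put a rough number on that (a minimal simulation with invented figures; real scanout timing is more complicated): without any sync, a buffer swap can land anywhere during the panel's scanout, so a higher refresh rate reduces how often a tear is on screen but never gets it to zero.

```python
import random

random.seed(1)

def tearing_rate(refresh_hz, fps, refreshes=10_000):
    """Fraction of refreshes that contain at least one unsynced buffer swap."""
    scanout_ms = 1000 / refresh_hz   # simplification: scanout fills the whole refresh interval
    frame_ms = 1000 / fps            # the game finishes a new frame every frame_ms
    torn = 0
    next_swap = random.uniform(0, frame_ms)   # random phase between game and display
    for i in range(refreshes):
        start = i * scanout_ms
        end = start + scanout_ms
        while next_swap < start:              # skip swaps that happened before this scanout
            next_swap += frame_ms
        if next_swap < end:                   # a swap lands mid-scanout -> visible tear line
            torn += 1
    return torn / refreshes

for hz in (60, 144):
    print(f"{hz} Hz panel, 90 fps unsynced: a tear in ~{tearing_rate(hz, 90):.0%} of refreshes")
# 144 Hz shows torn refreshes less often than 60 Hz, but tearing never disappears
# without some form of sync.
```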
 
... Intel seems to be jumping on the FreeSync bandwagon. I guess the match has been decided.
I look at that as a "Good News / Bad News" scenario.

If it leads to nice motherboards with DP-out and decent panels with appropriate refresh rates, that's great.

If the panels are so-so and the motherboard selections limited, or feature-constrained by OEMs, it's not so great (think AMD laptops).

 
Yes, I agree... that does not sound like a good thing... at least not for AMD. Here are the top ten GPUs hitting Steam servers:

Intel HD Graphics 4000 - 6.32%
NVIDIA GeForce GTX 970 - 3.85%
NVIDIA GeForce GTX 760 - 3.26%
Intel HD Graphics 4400 - 2.94%
NVIDIA GeForce GTX 660 - 2.50%
Intel HD Graphics 2500 - 2.15%
NVIDIA GeForce GTX 650 - 2.11%
NVIDIA GeForce GTX 750 Ti - 2.10%
NVIDIA GeForce GTX 770 - 2.05%
NVIDIA GeForce GT 620M - 1.97%

It is noted that AMD's presence has become so small that the survey no longer reports individual cards but lists them as an entire series. This used to be just the R7s and R9s, but now Steam has started reporting the older series this way also. If the 7900 series were a single card, it would be in 6th place. With Intel's solutions gathering speed, holding three of the top ten spots, FreeSync could have been one of the things that pushed users at the low end toward a discrete card rather than integrated graphics; Intel's adoption of it would mean one less advantage for going that route.

And, again... there's still the motion blur issue:

Nvidia solution = G-Sync for < 60 fps / ULMB technology for 60+ fps
AMD solution = FreeSync, that's it... no Motion Blur Reduction technology, leaving monitor manufacturers to provide non-standardized solutions on their own.



 

VAnton

Reputable
I'm sure many of the regular readers of this site know this background information already, but since I just happened to create a video on graphics sync issues as part of my "How Software Works" series, I thought I should link to it for anyone interested: https://youtu.be/m3Fg9olLraI.

For myself, I'm intrigued by the technology because A) I'm really bugged by tearing, but B) I don't buy bleeding-edge hardware, so I'm usually losing a lot of frame rate by turning v-sync on. Unless the price of G-Sync/FreeSync comes down a lot, though, it still seems like the best bet for me is to just get more powerful hardware, push the frame rate above my monitor's 60 Hz refresh rate, and v-sync.
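That frame-rate loss is the classic double-buffered v-sync quantization; here's a minimal sketch of it (a simplified model, not how any particular driver actually behaves):

```python
# Simplified double-buffered v-sync: the effective frame rate snaps down to
# refresh/1, refresh/2, refresh/3, ... whichever step the GPU can sustain.
def vsynced_fps(raw_fps, refresh_hz=60):
    frame_ms = 1000 / raw_fps
    refresh_ms = 1000 / refresh_hz
    intervals_needed = -(-frame_ms // refresh_ms)   # ceiling: whole refreshes per frame
    return refresh_hz / intervals_needed

for raw in (75, 59, 45, 29):
    print(f"GPU good for {raw} fps -> {vsynced_fps(raw):.0f} fps with v-sync at 60 Hz")
# A card that just misses 60 fps gets pinned at 30 fps, which is the frame-rate
# loss that adaptive sync (G-Sync/FreeSync) is designed to avoid.
```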
 
July 2015 Steam Hardware Survey


GPU % of Users Change

Intel HD Graphics 4000 6.15% -0.32%
NVIDIA GeForce GTX 970 3.92% +0.47%
NVIDIA GeForce GTX 760 3.29% +0.03%
Intel HD Graphics 4400 2.85% -0.03%
NVIDIA GeForce GTX 660 2.53% 0.00%
AMD Radeon HD 7900 Series 2.41% +0.04%
NVIDIA GeForce GTX 650 2.17% -0.05%
NVIDIA GeForce GTX 750 Ti 2.14% +0.16%
Intel HD Graphics 2500 2.09% -0.02%
NVIDIA GeForce GTX 770 2.06% -0.03%
 

seeingeyegod

Distinguished
For all the hot air from biased individuals replying to my initial comment, I didn't have to blow dry my hair.

Now, I would like to point out that I was enjoying 144fps on 144hz, vsynced, nearly 20 years ago. I don't care about variable refreshes, or any other gimmick. I want true fluidity, and not additional technology to make up for a lack of gpu horsepower.
You guys can go enjoy your pseudo smoothness, but know that it's exactly that: pseudo smoothness.

There were CRTs that had 144Hz refresh? I only remember them going up to 110 or so. That must have been some monitor. Your comment is funny because there is nothing stopping you from playing 20-year-old games at 1000fps if that is what you enjoy. No one is forcing you to upgrade from ColecoVision.
 