So I guess you guys just don't mind ghosting, etc.?
http://hexus.net/tech/reviews/monitors/82189-acer-xg270hu-freesync-monitor/?page=4
"From our results it seems that the magnitude of the ghosting problem isn't as severe as observed on some other FreeSync panels"
This means it's STILL THERE, right? I mean, you can SEE it... LOL. Let me know when rev 2 of these monitors fixes all the issues. Until then, this isn't GSYNC.
"In short we don't feel the ghosting on the Acer XG270HU is significant enough to impact the gaming experience."
Well, that's subjective though, right? I want NONE... So rev 2, or as they say:
"That said, AMD could benefit from enforcing stricter standards on monitor vendors to ensure that panels with significant ghosting do not drag the value of the FreeSync package down."
Until they FORCE components that fix these issues, I'll pass and wait for something better, or for G-Sync if nothing better exists by the time I buy.
On dropping under 40fps:
"However, the issue did creep into focus when tackling tougher games such as Assassin's Creed Unity, Crysis 3 and Metro Last Light."
More:
"The most significant issue pertains the noticeable transition between FreeSync and non-FreeSync zones, particularly if that's a frequent occurrence. In that instance gamers should consider lowering game settings to avoid dropping out of the FreeSync range and hope that AMD presents a driver fix for the issue in the near future. Nvidia's G-Sync deals with this specific issue by implementing a frame-duplication algorithm for low framerates. The Nvidia driver duplicates frames by varying magnitudes, two times for 19 to 38fps and three times for 14 to 18fps for example, to ensure the refresh rate of the panel stays around or above 40 and thus, cleverly, the gaming experience stays smooth."
So NV has advantages I can't currently live without, though they still haven't put out the monitor I really want yet anyway (but that's my issue with the monitor, not with G-Sync performance). I also refuse to turn things down to hide issues here.
Tomshardware seems to hate WAY too much on anything proprietary, and covers for the issues of the crap that isn't. I.e., OpenCL is lovely; CUDA, what's that? Compare them perf-wise? Are you high? Why would we do that and let you know CUDA kills OpenCL? FreeSync is lovely... oops, forgot to mention AMD needs to fix its drivers, oh, and that ghosting thing. Jeez...
One more point: over the life of the unit, how much will the extra watts cost you to RATE MATCH with AMD vs. an NV G-Sync solution if, like me, you hate turning things down? You can easily rack up $100 in electricity on the high end of gaming cards with AMD vs. NV over, say, 3-5 years of card ownership (god forbid you have kids gaming on it too). TCO means something (rough math on this below).

It's like arguing Xbox One/PS4 aren't much more than a Shield TV, but the reality is those $60 games blow up the cost of a console in short order, and GRID can give higher quality, especially as games evolve and NV upgrades its servers repeatedly over time. Given the appropriate connection, GRID could bury a console over time, since consoles can't speed up to add more effects etc. on AAA titles, right? Whatever. The extra watts issue might go away with AMD's Fiji stuff, but we're talking about today.
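To put a rough number on that "extra watts" point: here's the back-of-envelope math. Every input (150W load delta, 3 hours a day, 4 years, $0.15/kWh) is an assumption I picked for illustration, so plug in your own numbers:

```python
# Back-of-envelope check of the "extra watts" TCO claim. All inputs below are
# assumptions for illustration (power delta, hours played, electricity price,
# years of ownership), not measured figures.

def extra_electricity_cost(watt_delta=150, hours_per_day=3, years=4,
                           price_per_kwh=0.15):
    """Dollar cost of the extra power draw over the ownership period."""
    kwh = watt_delta / 1000 * hours_per_day * 365 * years
    return kwh * price_per_kwh

# ~150W more under load, 3h of gaming a day, 4 years, $0.15/kWh
print(f"~${extra_electricity_cost():.0f} extra over the period")   # ~$99
```

At those numbers you land right around $100 over the card's life, before you even count the kids gaming on it.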
Also, backing unclevesper's comment, the Hexus article states 40Hz as well, hence the problem I mention at 40 above. So yeah, Tomshardware found LACKING (a lot lately: reviewing Shield without the required connection speeds, this review, etc.). They mention issues in these games:
"However, the issue did creep into focus when tackling tougher games such as Assassin's Creed Unity, Crysis 3 and Metro Last Light."
Again, in those tougher games (and it only gets worse as games evolve), how much do the extra watts cost over time to keep you clear of the issues Hexus etc. found? How much will you turn down over time to stay in that range, like Hexus suggests? As games get more taxing, you'll end up doing it more and more often as more problem games pop up. Meh...