Asus MG278Q 27-inch QHD FreeSync Gaming Monitor Review

I am flummoxed as to why, even at this stage, displays are being churned out with HDMI 1.4 and DisplayPort 1.2. They should have been HDMI 2.0 with HDCP 2.2 and DP 1.3.
 
From what I've seen in reviews so far, the TN model is better anyway. But I would think ASUS fixed all the issues that people were reporting with the IPS model...
 
The last time I tried an ASUS 1440p panel was with their PB278Q (60Hz IPS). The first one I got had terrible backlight bleed on the left side and two dead pixels right in the middle. Couldn't live with that, so I returned it and got another. The second one was much better on backlight bleed (good enough for a typical PLS/IPS anyway) but had four dead pixels, two of which were close together on the center-right side and the other two in different spots, but still in the general viewing area. Again, I couldn't live with it and returned it for my money back.

I hope their quality control has improved, because for a $500+ monitor, any dead pixels and manufacturing tolerance defects are unacceptable. I paid a little more for a Dell U2713HM and have been happy ever since. I'll be in the market for a 1440p G-Sync monitor next year as an SLI 970 owner and would not rule out ASUS if they have improved their quality control. One thing I am not clear on is whether you can select custom FreeSync or G-Sync frequencies to better match your GPU's power, beyond the factory monitor Hz settings (90Hz, 120Hz, 144Hz).
 
At speeds below 40fps, you'll need to turn on V-Sync to prevent tearing, though by that point stutter is the bigger problem. It's better to either reduce resolution or turn down the detail level to keep frame rates above 40.

Uh, what about turning on LFC? LFC will work on monitors with a good variable refresh range such as this Asus unit. I'd like to see that tested for those cases where you dip in frame rate occasionally.

One thing I am not clear on is whether you can select custom FreeSync or G-Sync frequencies to better match your GPU's power, beyond the factory monitor Hz settings (90Hz, 120Hz, 144Hz).

Wait, what? As long as you're within the variable refresh rate range, you're good to go. If you want to save power and reduce the framerate on a low-demand (old) game, something like FRTC should work if there's no in-game cap.
 


No, what I'm talking about are complaints (and these were from G-Sync users) that they couldn't set a custom refresh rate, something like 100Hz or 110Hz, in the Nvidia control panel on a G-Sync monitor to better match their GPU power FPS and cap it. Maybe something's changed, or they didn't know what they were talking about (or doing).

I don't have one, so I can't comment. I overclock my 1440p monitors to 75Hz (Dell) and 90Hz (Crossover) and cap frames accordingly, but I've just never been clear on what that means for a G-Sync monitor that advertises 120Hz/144Hz capability.
 
I'm not sure I fully understand your concern but, if I may, I'll give it a try.
As a user of the Asus PG278Q (with G-Sync) for a year now, I can tell you this much:
G-Sync, much like FreeSync, works within a frame rate range that depends on the monitor, not on the adaptive sync technology behind it; in my case that's 30-144Hz. Within that range, the refresh rate is variable and follows how many FPS your GPU can push.
This is where the similarities between the two stop, because outside of that range the two technologies behave differently. Below the minimum of the range (30 FPS in my case), the G-Sync module automatically displays the same frame twice, making the refresh rate double the actual frame rate and the gameplay feel smoother. At the other end, the G-Sync module automatically caps your frame rate at the maximum refresh rate of your monitor (144Hz in my case).
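
If it helps, here's a rough Python sketch of the behaviour I just described. It's my own mental model, not the actual G-Sync module logic, and it assumes a 30-144Hz window like my monitor's:

```python
# Rough mental model of variable refresh behaviour across the range
# (not the actual G-Sync firmware logic), assuming a 30-144Hz window.

def effective_refresh(fps, vrr_min=30, vrr_max=144):
    """Return (panel_refresh_hz, times_each_frame_is_shown)."""
    if fps > vrr_max:
        # Above the window the module simply caps at the panel maximum.
        return vrr_max, 1
    if fps >= vrr_min:
        # Inside the window the panel refresh follows the GPU frame rate 1:1.
        return fps, 1
    # Below the window each frame is repeated until the resulting refresh
    # lands back inside the window (the "same frame twice" behaviour).
    repeats = 2
    while fps * repeats < vrr_min:
        repeats += 1
    return fps * repeats, repeats

for fps in (200, 100, 40, 25, 12):
    hz, repeats = effective_refresh(fps)
    print(f"{fps:>3} fps -> panel refreshes at {hz} Hz ({repeats}x per frame)")
```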

That said, and having nothing to do with these adaptive sync technologies, Radeons do have a Frame Rate Target Control feature in the Catalyst control center (or whatever it's called nowadays) for power-saving reasons, a feature that you don't have as an Nvidia user.

Now, regarding your concern: a custom refresh rate simply defeats the purpose of having an adaptive sync technology, and, outside of power-saving reasons, I fail to see how a custom refresh rate target would help, since G-Sync (and FreeSync, for that matter) already caps the refresh rate of your monitor "to better match their GPU power FPS".
If you prefer a custom refresh rate, you can choose to simply disable G-Sync and set your (G-Sync-enabled) monitor to a fixed refresh rate (in my case I have the following options: 24, 60, 85, 100, 120, 144Hz).
I hope that was helpful.
 
At speeds below 40fps, you'll need to turn on V-Sync to prevent tearing, though by that point stutter is the bigger problem.

This technically isn't true any more if you are using the Crimson driver and have a panel whose maximum refresh rate is at least 2.5 times its minimum (e.g. 144Hz panels).

AMD refers to this new tech as Low Frame Rate Compensation (LFC), and it effectively does the same as Nvidia's solution (although by different means) by duplicating frames to maintain the refresh rate above a minimum refresh value (such as 40Hz). I've been playing around with it on my 390X and my Acer XG270HU and it's been working great, no stutter or hitching, just the usual expected loss in fluidity from going that low in the first place.
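
For anyone wondering whether a given panel qualifies, here's a quick sketch of that 2.5x rule as I understand it; the ranges below are examples (I'm taking 40Hz as the minimum for this Asus from the review's 40fps figure, so treat that as an assumption):

```python
# Quick check of the LFC requirement as I understand it: the panel's maximum
# refresh must be at least 2.5x its minimum. Ranges below are examples only.

def supports_lfc(vrr_min_hz, vrr_max_hz, required_ratio=2.5):
    return vrr_max_hz / vrr_min_hz >= required_ratio

examples = {
    "40-144Hz panel (e.g. this Asus, per the review)": (40, 144),  # 3.6x
    "48-75Hz panel": (48, 75),                                     # ~1.56x
}
for name, (lo, hi) in examples.items():
    status = "LFC available" if supports_lfc(lo, hi) else "no LFC"
    print(f"{name}: {status}")
```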
 
Tearing happens when FPS is higher than Hz (multiple frames per refresh appear as horizontal tears).
Stutter/judder happens when FPS is lower than Hz (multiple refreshes per frame appear as double-vision judder).

To sync the two:
Freesync/G-sync adjusts a monitor's hz to match fps.
V-sync adjusts fps to match a monitor's hz.
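
To put rough numbers on it, here's a toy illustration, assuming a fixed 60Hz panel with neither V-sync nor adaptive sync enabled (figures are only for illustration):

```python
# Toy illustration of the fps-vs-Hz relationships above, assuming a fixed
# 60Hz panel with no sync of any kind enabled.

def describe(fps, hz=60):
    if fps > hz:
        return f"{fps / hz:.2f} frames per refresh -> tearing"
    if fps < hz:
        return f"{hz / fps:.2f} refreshes per frame -> stutter/judder"
    return "1 frame per refresh -> in step"

for fps in (90, 60, 45):
    print(f"{fps} fps on a 60Hz panel: {describe(fps)}")
```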
 
Threw up a little when I heard TN. There's little to no excuse anymore not to own an IPS monitor, since latency is as low as 4ms, and I bet 99.9% of you guys can't tell the difference.

Did you finish the article? It's right there in the conclusion. ----->

" Color, grayscale and gamma accuracy are similar enough that it will be hard to tell the two apart in a side-by-side comparison. The all-important input lag and response tests are so close that even the quickest gaming hands won't be able to discern between them. In fact, the only tests where the more-expensive screen won decidedly were contrast and viewing angles."
 
I see a lot of confusion above:
1) AMD's new Crimson drivers can sort out the problem of dropping out of asynchronous mode by refreshing the same frame again, though for this to work properly the panel's maximum refresh rate must be at least 2.5x the minimum of its asynchronous range.

e.g. a 30Hz to 75Hz asynchronous range qualifies

2) Some FreeSync (not G-Sync) monitors don't support asynchronous mode across their entire refresh range (e.g. 30Hz to 75Hz supported, but 75Hz to 100Hz not).

*I really wish they'd required FULL range support for FreeSync or none at all, because this gets horribly confusing really quickly.

3) Frame rate cap?
Someone said that's not needed for asynchronous mode? Actually, it's one of the most important features to have enabled so you can STAY within this mode. Imagine toggling in and out of asynchronous mode.

It would be best just to set the highest cap that lets you stay in asynchronous mode (see the sketch at the end of this post).

4) TN vs IPS?
Yes, TN has gotten better, but to some people IPS isn't just slightly better. Unfortunately you'd really have to see the panel in its intended environment to judge whether it's good enough.

TN helps reduce the price, especially when comparing TN/FreeSync with IPS/G-Sync. The bottom line is that TN panels are pretty decent now, and everything is a trade-off at some point, whether it's motion blur, black levels, contrast or whatever.

5) "I am flummoxed why even at this stage, the displays are being churned out with HDMI 1.4 and Displayport 1.2 standards. They should have been HDMI 2 with HDCP 2.2 and DP1.3."

What's the mystery here?
The inputs only need to support the specs of the device. Adding an HDMI 2.0 input would just add to the cost and serve no purpose.
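
Going back to point 3, here's the rough arithmetic I had in mind for picking a cap; the 3fps margin is just a common rule of thumb, not an official figure:

```python
# Rough arithmetic for point 3: cap a few fps below the top of the variable
# refresh window so you never spill out of asynchronous mode. The 3fps margin
# is a rule of thumb, not an official figure.

def suggested_frame_cap(vrr_max_hz, margin_fps=3):
    return vrr_max_hz - margin_fps

for vrr_max in (75, 144):
    print(f"{vrr_max}Hz panel -> cap around {suggested_frame_cap(vrr_max)} fps")
```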
 