BenQ XL2720Z Monitor Review: A 27-Inch, 144 Hz Gaming Display


There are two modes:
Blur Reduction OFF -- It's completely PWM-free, but you have to live with motion blur
Blur Reduction ON -- It's one strobe per refresh, to reduce persistence, for less motion blur

Also, detecting flicker at high frequencies (above ~100 Hz, up to the 10 kHz range) isn't done directly but via stroboscopic side effects (e.g. the wagonwheel effect, the phantom array effect). Humans can't directly see flicker at ultra-high frequencies, but they can be affected by the stroboscopic effects. There are people who actually detect kilohertz flicker via wagonwheel-type side effects (e.g. spinning wheels look stationary), and sometimes it is the wagonwheel effect itself that induces headaches, rather than the flicker. Even indoor scenery panning past sideways can produce stepping effects (phantom array effect) for some people, so something as simple as turning your head while walking under a 500 Hz squarewave-flickering LED light source can be instantly headache-inducing. See the scientific lighting study paper at http://www.lrc.rpi.edu/programs/solidstate/assist/pdf/AR-Flicker.pdf .... Page 6 of this paper has the graph, including people who detected that something was flickering at 10 kHz by observing (or being bothered by) the stroboscopic/wagonwheel side effects caused by the light source. Most old fluorescent light ballasts (120 Hz flicker) have now been replaced by 20,000 Hz ballasts to cover the outliers, which reduces headaches by a huge amount.
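If it helps to see the arithmetic, here is a minimal sketch (my own illustrative numbers, not from the paper) of the temporal aliasing behind the wagonwheel effect, i.e. why a fast-spinning wheel can look frozen or slowly drifting under a high-frequency strobe even when the flicker itself is invisible:

```python
# Illustrative sketch of wagonwheel-style temporal aliasing (numbers are
# made up for illustration, not taken from the lighting paper): a spoked
# wheel lit by a fixed-rate strobe appears to move at the spoke-passing
# frequency folded into the +/- (strobe/2) band, like any aliased signal.

def apparent_spoke_rate(rotations_per_sec, spokes, strobe_hz):
    """Perceived spoke-passing rate (Hz) under a strobed light.

    Positive = apparent forward motion, negative = apparent reverse,
    zero = the wheel looks stationary.
    """
    spoke_hz = rotations_per_sec * spokes          # true spoke-passing frequency
    # Fold into the Nyquist band of the strobe (classic aliasing formula).
    return (spoke_hz + strobe_hz / 2) % strobe_hz - strobe_hz / 2

# A 10-spoke wheel at 100 rev/s under a 1000 Hz strobe: the 1000 Hz spoke
# rate aliases to 0 Hz, so the wheel looks frozen even though nobody can
# "see" 1000 Hz flicker directly.
print(apparent_spoke_rate(100, 10, 1000))   # -> 0.0
print(apparent_spoke_rate(101, 10, 1000))   # -> 10.0 (slow apparent drift)
```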

Now, back to displays... Some people get issues from flicker (visible up to ~100 Hz or so; thresholds vary), other people get issues from the stroboscopic effect (detectable far beyond 100 Hz), and other people get issues from motion blur (meaning strobing is the lesser evil). On Blur Busters, there are people who actually get headaches from motion blur (i.e. motion blur is the worse evil than flicker). It depends on the person. Hundreds of forum postings exist, from places like Overclock.net to the Blur Busters Forums, where proper motion-blur-reducing strobing reduced eyestrain (while it increased for others) -- so there are plenty of people in this niche market who are bothered far more by motion blur than by flicker.

They actually do one strobe per pixel per unique refresh. Mathematically, for proper motion blur reduction, you need exactly one strobe per frame. However, several displays 'scan' (e.g. scanning backlights), so multiple flashes may occur on different parts of the screen (a segmented scanning backlight). Regardless of how the screen is strobed, each pixel needs to be strobed once per unique frame, or per unique interpolated frame, to get effective motion blur reduction.

For example, a 240 Hz HDTV with four scanning backlight segments may be marketed as "Clear Motion Ratio 960", even though each segment strobes only 240 times a second (during 240 fps interpolation). That said, it tries to achieve 1/960 sec persistence even with 240 fps material. Different displays do strobing differently, e.g. segmented backlight scanning versus all-at-once global strobing (easier with edgelights).
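As a quick back-of-the-envelope check of that arithmetic (assuming each segment is lit for an equal share of the refresh period; real sets vary, so the numbers are illustrative):

```python
# Persistence arithmetic for a segmented scanning backlight, assuming each
# of the N segments is lit for 1/N of the refresh period (an idealization).

def persistence_seconds(refresh_hz, segments=1, duty_cycle=None):
    """Time each pixel stays illuminated per refresh.

    If duty_cycle is given, it overrides the equal-segment assumption.
    """
    frame_time = 1.0 / refresh_hz
    if duty_cycle is None:
        duty_cycle = 1.0 / segments
    return frame_time * duty_cycle

# 240 Hz interpolation with a 4-segment scanning backlight:
# (1/240) / 4 = 1/960 s, which is where a "960" marketing number can come
# from, even though each segment still only strobes 240 times per second.
print(1.0 / persistence_seconds(240, segments=4))   # -> 960.0
```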

Regardless, what this means is that for 120 fps the ideal strobe rate is 120 times per second, and for 240 fps the ideal strobe rate is 240 times per second, since you want one strobe per frame; otherwise you get double-image effects (like 30fps@60Hz CRT or 60fps@120Hz LightBoost), because during eye-tracking your eyes have already tracked onward in the time between the flashes, so the flashes land in two different places on your retina. Strobing 3 or 4 times per frame can lead to triple-edge or quadruple-edge effects. However, it doesn't matter whether the whole screen is flashed all at once or scanned sequentially top-to-bottom (like a CRT); display-based motion blur (persistence) is proportional to how long a pixel is illuminated to the human eye. High persistence, more motion blur. Low persistence (shorter illumination), less motion blur.
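A minimal sketch of that relationship, with illustrative speeds (eye-tracked blur is roughly tracking speed times persistence, and extra strobes per frame split it into separate edges):

```python
# Sketch of the persistence -> motion blur relationship for eye-tracked
# motion: blur width (px) ~= tracking speed (px/s) * time the pixel is lit (s).
# If the display strobes more than once per unique frame, the eye keeps
# moving between strobes, so the copies land on different retinal positions
# and you see double/triple edges separated by speed * time-between-strobes.

def blur_width_px(speed_px_per_s, persistence_s):
    return speed_px_per_s * persistence_s

def ghost_separation_px(speed_px_per_s, frame_rate, strobes_per_frame):
    if strobes_per_frame <= 1:
        return 0.0                      # one strobe per frame: no repeated edges
    time_between_strobes = 1.0 / (frame_rate * strobes_per_frame)
    return speed_px_per_s * time_between_strobes

speed = 960.0                            # px/s, a fast pan (illustrative)
print(blur_width_px(speed, 1 / 120))     # sample-and-hold 120 Hz: ~8 px of blur
print(blur_width_px(speed, 1 / 960))     # ~1 ms strobe: ~1 px of blur
print(ghost_separation_px(speed, 60, 2)) # 60 fps double-strobed: ~8 px double image
```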

Sometimes reducing static detail actually increases motion detail. Otherwise you have ultra-detailed graphics while standing still, but blurry VHS-quality imagery during fast motion. So you're essentially lowering static detail to gain extra motion detail, and backing off slightly from Ultra settings can be advantageous when you want to maximize motion detail. Though some of us run multiple Titans to have our cake and eat it too (for the most part) during strobed 120 Hz operation.

Disclaimer: I am the creator of Blur Busters, of TestUFO.com motion tests, and of the Strobe Utility mentioned on Page 9 of the TomsHardware review.
 

The "wagonwheel effect" only applies to a strobe lighting an object in motion where the timing correlation between the movement and strobe can create the illusion of slow/stop/reverse motion or discontinuous motion in the case of your lighting research paper. The image on an LCD on the other hand is fundamentally static between screen refreshes so there is nothing to "wagonwheel" with in-between.

Your paper says everyone was very satisfied with anything over a 2 kHz strobe rate at 100% modulation, so the 20+ kHz PWM rate on LED-lit LCDs is an order of magnitude beyond what the test group effectively deemed indistinguishable from continuous light.

The reason why electronic FL/HID/sodium/etc. ballasts operate at over 20kHz has nothing to do with "outliers;" it is mainly to avoid producing human-audible whine in HF transformers, discharge arc and filaments. Were it not for that, most ballasts would likely operate at less than 10kHz since that is high enough to prevent arc extinction in discharge lamps, thereby significantly reducing EMI, increasing efficiency and improving tube lifespan by reducing electrode/filament sputtering. They cannot crank frequency arbitrarily high due to parasitic inductance in lighting wiring and lamps - particularly the electronic ballast conversion kits which have to cope with a much broader range of tube types than CFL where the ballast is tuned specifically for the tube type it is bonded to.
 
Monster Cookie:
When will monitor manufacturers understand that 1080p resolution is a JOKE, especially on a large 27" screen? In the early 2000s it might have been OK to have such a resolution, but nowadays it is no longer usable. Even for a 24" screen the minimal resolution is 1920x1200.


In the early 2000s CRTs were still the standard, and 4:3 was the standard aspect ratio. There were no 1080p LCD monitors, let alone large 1080p LCD monitors, and I paid ~$1200 (NZD) for a 17" 1280x1024@60Hz (16 ms) LCD display in 2003. That's how bad it was back then.

I paid $300 (a bargain, as a favor) for an $800 Gateway LCD monitor in 2001, with the same rez and timing. It was expensive compared to a CRT, but my eyes were safer :)
 
John Carmack and Michael Abrash might have a word about that. Wagonwheel effects and other stroboscopic effects (relatives of the wagonwheel effect) also affect displays. I see it in operation too when I play games. Some people are sensitive to it, while others don't notice it or pay attention to it. Like how some people see stutters and others see tearing.

See Michael Abrash's paper that talks of the effect even of sample-and-hold displays:
Michael Abrash: Down The VR Rabbit Hole

Also a very good discussion thread:
So what refresh rate do I need?
(Especially see the 60Hz vs 120Hz photo halfway down the thread, and then do the same test with your human eyes)

It is correct that the image on an LCD is static and strobe-free between screen refreshes. However, the stroboscopic effect still exists on such displays when the eyes are not tracking the on-screen motion, because a finite number of images (positions on screen) is visually mimicking continuous motion (infinite positions of an object in real life). As an image moves on a display, such as the frames of a spinning wagon wheel, the wheel can look stationary on a display too unless you add artificial motion blur to it, and adding GPU-based motion blur effects is not always desirable (unnatural extra motion blur forced on your eyes above and beyond natural human limitations), especially in fast-action FPS games.

Also, recognize that stroboscopic effects are another artifact of finite-refresh-rate displays. For example, when waving a mouse arrow around on a black background, there's a multiple-mouse-arrow effect if you hold your gaze stationary while moving the mouse. The only ways to make the mouse arrow a continuous motion blur are to add a GPU motion blur effect, or to make sure there are enough refreshes for each pixel of movement (e.g. 1000 pixel/second mouse movement at 1000 Hz would put a unique mouse cursor position in each refresh, only one pixel away from the previous refresh). Stroboscopic-like effects of this type also occur on sample-and-hold displays (regardless of strobing or impulse driving), as the static refreshes show objects in different parts of the retina, without a continuous motion path between the frames. Scientific 500 Hz and 1000 Hz displays show far less of this problem in the laboratory (e.g. ViewPixx sells a true-500Hz vision research projector display), exhibiting much less of this stroboscopic/stepping issue. If you stare at a fixed point while motion scrolls past (e.g. the Eiffel Tower in the TestUFO Moving Photo Test), you will see a stepping effect (stroboscopic effect) on any display (even a flicker-free LCD), as the static frames land on different parts of your retina with no continuous motion path between the individual refreshes. Even 120 Hz wouldn't be the final frontier in eliminating all human-detectable effects; researchers can tell apart 250 Hz vs 500 Hz via the reduced stroboscopic effect on a ViewPixx true-500Hz scientific projector (or one of the other laboratory vision research projectors), and the stroboscopic effect is still visible even at 500 Hz (you can move the mouse arrow faster than 500 pixels per second, and lots of computer motion goes faster than 500 pixels per second).
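A tiny sketch of that mouse-arrow arithmetic, with illustrative numbers (the gap between the discrete cursor images on a stationary retina is just speed divided by refresh rate):

```python
# The phantom-array / "multiple mouse cursor" arithmetic: with the eye held
# stationary, each refresh paints the object at a discrete position, and the
# gap between those positions is speed / refresh_rate. Only when that gap
# shrinks toward ~1 pixel does the motion start to look continuous.

def step_gap_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

for hz in (60, 120, 240, 500, 1000):
    print(f"{hz:5d} Hz -> {step_gap_px(1000, hz):6.1f} px between cursor images")
# 1000 px/s motion: ~16.7 px gaps at 60 Hz, and ~1 px gaps only at 1000 Hz.
```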

Granted, the stroboscopic effect / phantom array effect is not exactly the same as the wagonwheel effect, but they are very similar visual phenomena caused by static frames with no continuous motion path between the object positions.

For displays, the stroboscopic effect is caused by the lack of intermediate motion between adjacent frames, so you don't need any flashing between the frames to see a stroboscopic effect in the stationary-eye, moving-object scenario. This effect is beautifully illustrated in Michael Abrash's diagrams, and well demonstrated in certain TestUFO motion tests.

The discrete stepping-forward of frames provides its own built-in stroboscopic effect when the eye is not in sync with the motion. The artificial invention of "frame rate" (going back to the zoetropes and kinetoscopes of the 19th century), attempting to represent continuous motion using a series of static frames, can create a stroboscopic effect by virtue of that series of static frames containing objects in different positions relative to the eye's stare position. People like John Carmack, Michael Abrash, good VR display developers such as Oculus, various vision researchers, and other "Blur Busters"-minded people (like me) understand the vision phenomena behind this.

The effect is not a major hindrance, especially if you use low persistence via strobe techniques, but the stroboscopic effect exists (in stationary-eye-while-objects-move-past situations) regardless of 60 Hz or 120 Hz, and regardless of whether the display is flicker-free / sample-and-hold or uses scanning/impulse-driving/strobe-backlight techniques, because a finite, discrete frame rate is being used to represent continuous motion. It is true it is not a major problem once we hit 120Hz+ refresh rates, and points of diminishing returns exist, but that by no means implies the stroboscopic effect isn't detectable on displays, even at current common refresh rates (and far higher).
 
Most people are still crying about the resolution. Until recently I had a 32" Sony HDTV. That thing was a gem to watch TV on. Crisp, clear, etc., even up quite close. I think 27" HD is just on the edge of being perfectly decent in terms of resolution. I agree with the gamers: speed is king when choosing a monitor, because it DOES cost you kills. Someone asked why not use a TV as a monitor... well... because of response time, too. That's one of the factors that makes a TV inappropriate for serious gaming on a powerful PC.
 
I don't know why people demand QHD monitors at 120-140 Hz as if it were easy to achieve 120-140 fps. Even now, the most demanding games at ultra settings at 1080p barely reach 120-140 fps. You can look at any review, say of the Titan Black, and see for yourself what I'm talking about -- and it's a $1k GPU! Here is proof:

http://www.bit-tech.net/hardware/graphics/2014/02/26/nvidia-geforce-gtx-titan-black-review/3

So I think that's the reason they release monitors like this one: so you can really get those frames.
 


I absolutely agree with you! That's my point as well. Pixel density on a 27" FHD is still decent enough, and in the most demanding games you finally get absolutely fluid motion on high-spec cards at maxed-out settings. Even then, though, with some games, as soon as you switch on some of the more intensive settings, you'd still fall short of 144 fps. I say well done with this monitor -- it's super-fast, and I'm certain that the resolution would be perfect for a 27" display.
 
For a 27" monitor, the clarity of a 1080p monitor becomes as good as it can for the human eye at about 42 inches away. I sit about 35" away from my VG278H 120hz 1080p monitor and can barely see any pixels, and the 120hz, lightboost mod, near non-existant input lag and 1ms refresh time makes gaming a very smooth and blur free experience.


2560x1440 monitors only look a tiny bit better at the same distance, but there haven't been any good gaming monitors that can match the smoothness of gaming TN panels yet. The ROG Swift will be the first one that can do this as far as I know, and that's not even due out until the end of July or something like that.

In any case, the point is that 1080p / 1440p are perfectly good pixel densities for computer-desk viewing distances, and 120 Hz / low response times are what's important when gaming.

You'd have to stick your face 15 inches away from a 4K 27" monitor to see any real difference between a 1440p monitor and 4K, and to top it all off, you increase your GPU costs and murder your frame rates using 4K at this stage of the game.
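As a rough sanity check of those distance claims, here's a quick pixels-per-degree sketch using the common ~60 px/degree rule of thumb for 20/20 acuity (an approximation, and the sizes/distances are just the ones mentioned above):

```python
import math

# Rough angular-resolution check for the viewing-distance claims above,
# using the common ~60 pixels/degree rule of thumb for 20/20 acuity
# (an approximation, not a hard perceptual limit).

def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_in):
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)
    px_per_inch = horiz_px / width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(name, round(pixels_per_degree(27, w, h, 35), 1), "px/deg at 35 in")
# A 27" 1080p panel lands near ~50 px/deg at ~35 in, 1440p near ~66, 4K near
# ~100, which is roughly why the extra resolution is hard to see unless you
# sit much closer.
```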

Unless you can afford 2x Titan Z's and plan on playing at 4K resolution on a 50"+ screen on an HTPC or something, 1080p / 1440p is more than good enough for your needs.
 
Serious question. Why not just buy a quality HDTV with 120 (or greater) Hz for your gaming monitor? Especially if you'll be gaming at 1920x1080. A neighbor has his PC hooked up to a quality HDTV and it looks great to me. I've played Battlefield on it with no issues at all. It's pretty awesome!


It depends on your GPU... some don't have the proper connections available for a 120 Hz television, or don't produce more than 1080i @ 30 Hz through the HDMI connection. Most gaming monitors with low response times and refresh rates above 100 Hz use DVI or DisplayPort, which televisions do not always have. So again, it would probably need to be a side-by-side comparison to see the actual difference for most people, but the monitor will have lower response times (1 to 5 ms) and less input lag than the TV in most cases, plus the capability of running 2 or 3 screens for gaming with PhysX.
 
Also remember that if your PC doesn't run games at 100 fps or above, this doesn't really matter anyhow... you won't notice the difference playing an MMO at 38 fps or less; it will look the same as on a 60 Hz refresh rate. So this is mainly for FPS games and well-built gaming rigs.
 
This monitor will face a hard market. Asus released a 27" 4K 60 Hz display for $600. Personally, I have 2 cards and 60 FPS is fine for me, so 4K would be fantastic!

That monitor won't work at 60 Hz without DisplayPort 1.2... make sure your card has it, as an HDMI cable will only run it at 30 Hz.
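Roughly, as a ballpark sanity check (approximate link limits and a flat blanking overhead, so treat the numbers as illustrative):

```python
# Rough pixel-clock check for why that 4K panel needs DisplayPort 1.2 for
# 60 Hz. Link limits below are approximate published figures (HDMI 1.4:
# ~340 MHz TMDS clock; DP 1.2 HBR2: ~17.28 Gbit/s payload, roughly a
# 720 MHz pixel clock at 24-bit color). Blanking overhead is approximated
# as a flat 5% (reduced-blanking-style), so the output is ballpark only.

def required_pixel_clock_mhz(width, height, refresh_hz, blanking=1.05):
    return width * height * refresh_hz * blanking / 1e6

LINK_LIMITS_MHZ = {"HDMI 1.4": 340.0, "DP 1.2": 720.0}   # approximate

for hz in (30, 60):
    need = required_pixel_clock_mhz(3840, 2160, hz)
    fits = [name for name, cap in LINK_LIMITS_MHZ.items() if need <= cap]
    print(f"3840x2160 @ {hz} Hz needs ~{need:.0f} MHz -> fits: {fits or 'none'}")
# ~261 MHz for 30 Hz fits HDMI 1.4; ~523 MHz for 60 Hz needs DP 1.2.
```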
 