InvalidError :
This seems a little weird to me: the panel uses constant-current LED driving because some people claim to see flicker at ~20kHz PWM frequency, yet the very same display uses backlight strobing to reduce blur, and that strobing occurs at 144-288Hz, roughly 100X lower.
There are two modes:
Blur Reduction OFF -- It's completely PWM-free, but you have to live with motion blur
Blur Reduction ON -- It's one strobe per refresh, reducing persistence for less motion blur
Also, detecting flicker at high frequencies (>100Hz, up into the 10kHz range) isn't done directly but via the stroboscopic side effects (e.g. wagonwheel effect, phantom array effect). Humans can't directly see flicker at ultra-high frequencies, but they can be affected by the stroboscopic effects. There are people who actually detect kilohertz flicker via wagonwheel-type side effects (e.g. spinning wheels look stationary), and sometimes it's the wagonwheel effect that induces the headaches rather than the flicker itself. Even indoor scenery panning sideways can produce stepping effects (phantom array effect) for some people, so something as simple as turning your head while walking under a 500Hz square-wave flickering LED light source can be instantly headache-inducing. See the scientific lighting study paper at
http://www.lrc.rpi.edu/programs/solidstate/assist/pdf/AR-Flicker.pdf -- Page 6 of this paper has the graph, including people who detected something was flickering at 10kHz by observing (or being bothered by) the stroboscopic/wagonwheel side effects caused by the light source. Most old fluorescent light ballasts (120Hz flicker) have now been replaced by 20,000Hz ballasts to cover the outliers, which reduces headaches by a huge amount.
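For intuition, here's a rough back-of-envelope sketch (not from the paper; the sweep speed and flicker rates are just assumed example numbers) of how the phantom array scales: the faster your eyes sweep and the lower the flicker rate, the wider apart the repeated images of the light land on your retina.

```python
def phantom_array_spacing_deg(sweep_speed_deg_per_s: float, flicker_hz: float) -> float:
    """Angular gap between successive images of a flickering light during an eye/head sweep."""
    return sweep_speed_deg_per_s / flicker_hz

# Example: a quick ~300 deg/s sweep past a 500Hz square-wave LED leaves images
# roughly 0.6 degrees apart -- wide enough for many people to notice the stepping.
for hz in (120, 500, 10_000):
    print(f"{hz:>6} Hz flicker -> {phantom_array_spacing_deg(300, hz):.3f} deg between images")
```

Even at 10kHz the spacing is nonzero, which is why a few outliers in the study could still tell something was flickering.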
Now, back to displays... Some people get issues from flicker (directly visible up to ~100Hz-ish; thresholds vary), other people get issues from the stroboscopic effect (perceivable far beyond 100Hz), and other people get issues from motion blur (meaning strobing is the lesser evil). On Blur Busters, there are people who actually get headaches from motion blur (i.e. motion blur is the worse evil compared to flicker). It depends on the person. Hundreds of forum postings exist, from places like Overclock.net to the Blur Busters Forums, where proper motion-blur-reducing strobing reduced eyestrain for some people (while increasing it for others) -- so there are plenty of people in this niche market who are bothered far more by motion blur than by flicker.
InvalidError :
DookieDraws :
Serious question. Why not just buy a quality HDTV with 120 (or greater) Hz for your gaming monitor?
Most 120+Hz TVs take 60Hz input and pulse their backlight 2-5X per frame to reduce blur during display refreshes and perceivable flicker.
They actually do one strobe per pixel per unique refresh. Mathematically, for proper motion blur reduction, you need exactly one strobe per unique frame. However, several of them 'scan' (e.g. scanning backlights), so multiple flashes may occur on different parts of the screen (segmented scanning backlight). Regardless of how the screen is strobed, each pixel needs to be strobed once per unique frame (or per unique interpolated frame) to get effective motion blur reduction.
For example, a 240Hz HDTV with four scanning backlight segments may be marketed as "Clear Motion Ratio 960", even though each segment strobes only 240 times a second (during 240fps interpolation). That said, it does attempt to achieve 1/960sec persistence even with 240fps material. Different displays implement strobing differently, e.g. segmented backlight scanning versus all-at-once global strobing (easier with edgelights).
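To show where a "960" number can come from, here's a minimal sketch, assuming (purely for illustration) a 240fps interpolated signal with each backlight segment lit for 25% of its frame period:

```python
def per_pixel_persistence_sec(frame_rate_hz: float, strobe_duty: float) -> float:
    """Time each pixel stays lit per frame: frame period x strobe duty cycle."""
    return (1.0 / frame_rate_hz) * strobe_duty

# 240fps interpolation, each segment lit for 25% of its 1/240 sec frame period:
persistence = per_pixel_persistence_sec(240, 0.25)
print(f"persistence = 1/{1 / persistence:.0f} sec")  # -> persistence = 1/960 sec
```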
Regardless, what this means is that for 120fps the ideal strobe rate is 120 strobes per second, and for 240fps the ideal strobe rate is 240 strobes per second, since you want one strobe per frame. Otherwise you get double-image effects (like 30fps@60Hz CRT or 60fps@120Hz LightBoost): during eye-tracking, your eyes have already moved onwards in the time between the flashes, so the two flashes of the same frame land in two different places on your retinas. Strobing 3 or 4 times per frame can likewise lead to triple-edge or quadruple-edge effects. However, it doesn't matter whether the whole screen is flashed all at once or scanned sequentially top-to-bottom (like a CRT): display-based motion blur (persistence) is proportional to how long a pixel stays illuminated to the human eye. High persistence, more motion blur. Low persistence (shorter illumination), less motion blur.
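Here's a small sketch of that persistence math, with assumed example numbers (960 pixels/second of on-screen motion; the function names are just for illustration):

```python
def motion_blur_px(speed_px_per_s: float, persistence_s: float) -> float:
    """Perceived blur trail width during eye tracking ~= motion speed x illumination time."""
    return speed_px_per_s * persistence_s

def image_separation_px(speed_px_per_s: float, strobes_per_frame: int, frame_rate_hz: float) -> float:
    """Gap between repeated copies of a frame when it is flashed more than once."""
    if strobes_per_frame <= 1:
        return 0.0
    return speed_px_per_s / (strobes_per_frame * frame_rate_hz)

speed = 960  # pixels per second of on-screen motion (assumed example value)
print(motion_blur_px(speed, 1 / 120))     # sample-and-hold 120Hz: ~8 px of blur
print(motion_blur_px(speed, 1 / 960))     # 1/960 sec strobe: ~1 px of blur
print(image_separation_px(speed, 2, 60))  # 60fps flashed twice (60fps@120Hz): ~8 px double image
```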
BlueCyberPhantomX :
so what beast of a pc will run that? i reckon you'd need at least a 780 ti or r9289x to play games at ultra above 55 fps...
Sometimes reducing detail actually increases motion detail. Otherwise you get a situation of ultra-detailed graphics while standing still, but blurry VHS-quality imagery during fast motion. Since blur-reduction strobing works best when the frame rate consistently matches the strobe/refresh rate, backing off slightly from Ultra settings to keep frame rates up can be advantageous when you want to increase motion detail; essentially, you trade a little static detail for extra motion detail. Though some of us run multiple Titans to allow us to have our cake and eat it too (for the most part) during strobed 120Hz operation.
Disclaimer: I am the creator of Blur Busters, of TestUFO.com motion tests, and of the Strobe Utility mentioned on Page 9 of the TomsHardware review.