Getting 144hz to display properly

Rick84

Hi all,

I recently bought a BenQ XL2411T 144Hz monitor, but I don't seem to be able to get it to display at 144Hz without the screen becoming distorted. I can run it at 120Hz no worries. Does anyone know if my system is the limiting factor here? I'm using a Sapphire 6950 2GB Dirt 3 edition card with the monitor's supplied DVI cable.

Here is a photo of the result when going up to 144Hz (I had to photograph the screen, as a print screen pasted into Paint shows the image as if it were normal). If it is my GFX card that is the limiting factor, then this is fine until I next upgrade it later this year.

I know that my current graphics card rarely maintains close to 120fps, let alone 144fps, but the difference is still very noticeable. I just need to make sure this monitor is not faulty, as I'll eventually have a 9000 series card to get full use of it once they come out :)

Thanks

[Attached image: o0lu.jpg]
 
Well, I'm confused. First off, why do you need 144Hz again? You know your eyes cannot perceive a refresh rate that high, right? We see about 24 frames per second; anything above 60Hz is unquestionably solid and flicker-free.

Do not mix up your VGA's FPS with your monitor's refresh rate. A VGA pushing frames faster than 60FPS may be noticeable because the monitor has to sample that and present it, so a high frame rate ensures a ready buffer at all times.

A high refresh rate, on the other hand, just means the monitor displays the same frame many times before the buffer is refreshed. I don't see a point here.

In an ideal world, your VGA should produce frames infinitely faster than your monitor displays them, so it would never have to wait for buffer synchronization. I believe 60Hz might actually produce smoother gameplay, especially with Vsync on.

The only point I see in going above 60Hz would be 3D. Then again, 120Hz sounds like the reasonable limit.
 
You are sadly misinformed. While refresh rates beyond 60Hz become less and less noticeable as they go higher, it simply isn't true that we cannot see a difference. You will also find that the more you get used to higher and higher FPS, the more pronounced and choppy 60Hz will look.

That said, the best results are at 120Hz with LightBoost turned on. Read through this article for workarounds to get LightBoost to work on an AMD card: http://www.blurbusters.com/zero-motion-blur/lightboost/
ToastyX's hack is probably the best option for you: http://www.blurbusters.com/easy-lightboost-toastyx-strobelight/

As for the poor results at 144Hz: either the cable isn't good enough to support 144Hz (or is defective), or the monitor is defective at that refresh rate. There is also a chance that the video card is the limiting factor.
 

If you were to actually use a 120/144Hz monitor you'd notice the difference immediately.
 


It seems I was. I wrote that based on personal experience (120Hz) and my understanding of digital signal sampling (Nyquist's sampling theorem). I've since learned it does not apply, and things are actually a lot more complicated than that:

http://skeptics.stackexchange.com/questions/3348/can-the-human-eye-distinguish-frame-rates-above-60-hz

It does apply, though, to the buffer-monitor interaction. That means unless your VGA can provide a high enough frame rate, higher refresh rates will produce worse results because of aliasing (not the jagged edges, the frequency-domain phenomenon). Also, Vsync gets increasingly taxing at higher frequencies.

I do apologise for indulging my own curiosity rather than answering the question. You may find answers here:

http://www.overclock.net/t/1358464/does-amd-support-144hz/10



(TL;DR)

Your card probably outputs 144Hz only through DisplayPort or dual-link DVI:


http://www.overclock.net/t/1358464/does-amd-support-144hz/10

http://www.techpowerup.com/reviews/MSI/HD_7970_Lightning/3.html

It seems your card comes with one DL-DVI-I and one SL-DVI-D port, so make sure you use the dual-link one. Other than that, an active DP to DL-DVI converter would do the trick.
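As a rough sanity check (the blanking totals below are assumed reduced-blanking figures for illustration, not the monitor's exact timings), single-link DVI tops out around a 165 MHz pixel clock, which is nowhere near enough for 1080p at 144Hz:

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a given total raster size and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# Assumed reduced-blanking totals for 1920x1080 (illustrative values)
clock_144 = pixel_clock_mhz(2000, 1111, 144)   # ~320 MHz
clock_120 = pixel_clock_mhz(2000, 1111, 120)   # ~267 MHz

SINGLE_LINK_MAX = 165.0   # MHz, single TMDS link
DUAL_LINK_MAX = 330.0     # MHz, roughly double with two TMDS links

print(f"144Hz needs ~{clock_144:.0f} MHz -> dual-link DVI only")
print(f"120Hz needs ~{clock_120:.0f} MHz -> also beyond single-link")
```

So even if a single-link cable or port negotiates 144Hz, the signal is far out of spec, which fits the distorted-picture symptom.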
 


Thanks for this, I've just tried LightBoost and the difference is unreal. It did add a blueish hue to my screen, but I've adjusted the colours and so forth in the Catalyst drivers and got that pretty much sorted.

@Murissokah, thanks for your answer; I believe your second response answers my question. I wanted to reply to you last night but didn't have time. I don't understand the science behind this technology or the biology of our eyes, but I can tell a huge difference between 60Hz and 120Hz. Not only that, I don't get any screen tearing now either. That may change when I get a new GFX card, but from what I've read, the tearing will be much less noticeable as the monitor refreshes twice as fast.

 
You actually get the same number of tears with a 120Hz monitor as with a 60Hz monitor, but the tears get removed from the screen twice as fast. A tear happens every time the video card writes to the front buffer mid-scanout, which doesn't change; but because the 120Hz monitor refreshes twice as fast, each tear gets removed much sooner, making it less obvious.
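A quick way to see this (a sketch of the worst case, assuming a tear appears just after the scanout passes it and persists until the next refresh sweeps it away):

```python
def max_tear_visibility_ms(refresh_hz):
    """Worst-case time a tear line stays on screen: one full refresh period,
    i.e. until the next scanout redraws that part of the screen."""
    return 1000.0 / refresh_hz

print(max_tear_visibility_ms(60))    # 60Hz: tear can linger ~16.7 ms
print(max_tear_visibility_ms(120))   # 120Hz: cleared in at most ~8.3 ms
```

Halving the time each tear is visible is why tearing looks much milder at 120Hz even though it happens just as often.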
 

There are quite a variety of complex interactions, including whether the display is sample-and-hold or strobed, how accurate the motion is relative to the framerate (framerate = Hz is ideal), etc.

An excellent demonstration of motion blur is at: www.testufo.com/eyetracking

You will notice that the animation above shows motion blur that isn't caused by the speed of pixel transitions. Eyes aren't based on frames: as your eyes move, they are in a different position at the end of a refresh than at the beginning of one. That smears a static frame of a flicker-free display across your retina. That's the sample-and-hold effect, as explained in scientific papers.

Sample-and-hold motion blur becomes a limiting factor as we get to higher display resolutions, sharper motion, higher framerates (framerate matching refresh rate, zero judder), 4K instead of 1080p, and eventually virtual reality. It has come to the point where vision researchers are able to notice motion blur on an experimental 1000Hz display.

Mathematically, this makes sense. Assuming we could successfully run 1000fps at 1000Hz on a 4K display, we would have 1000 different static image positions per second. One refresh at 1000Hz is 1 millisecond long, and on a sample-and-hold (flicker-free) display, 1 millisecond of hold time generates 1 pixel of eye-tracking-based motion blur for every 1000 pixels/second of motion. That works out to about 4 pixels of motion blur at 4000 pixels/second on a 4K display (one screen width per second). So even if GPUs got powerful enough to allow full framerate=Hz at 1000fps on a 4K display, we would still be bottlenecked by motion blur. We are very, very far away from that technologically right now. If we were wearing 4K displays as Holodeck virtual-reality goggles, a slow head-turning speed of a mere 30 degrees per second would move the scenery at about one screen width per second on the 4K panel -- which can still create forced/unwanted motion blur even on a 1000fps @ 1000Hz display, since it is above and beyond your eye-tracking accuracy, making the motion blur detectable even at very high refresh rates.

So, no, 60Hz isn't the final frontier. 120Hz does have a point of diminishing returns, but the diminishing returns do not stop for a very, very long time beyond that.

Another way to reduce motion blur is strobing (e.g. CRT flicker, plasma flicker, LightBoost strobe backlight, pulse-driven OLED, etc.), which eliminates motion blur without needing to raise the Hz (whether by interpolation or true frames). This is a FAR easier way to eliminate motion blur, but not everyone likes flicker. So we can't have our cake (flicker-free) and eat it too (blur-free) at retina-quality resolutions, unless we go to silly framerates (e.g. 1000fps @ 1000Hz). It's impossible to go Holodeck-quality at only 60Hz or 120Hz.
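To make the arithmetic above concrete, here is a small sketch (the LightBoost pulse width is an approximate, assumed figure): eye-tracking blur scales with how long each frame is visible, which is the full refresh period on a sample-and-hold display but only the backlight pulse width on a strobed one:

```python
def motion_blur_px(visible_time_ms, speed_px_per_s):
    """Eye-tracking motion blur: the eye keeps moving while the frame
    stays static, smearing it across the retina."""
    return speed_px_per_s * visible_time_ms / 1000.0

# Sample-and-hold: the frame is visible for the whole refresh period.
print(motion_blur_px(1000 / 60, 1000))   # 60Hz, 1000 px/s -> ~16.7 px of blur
print(motion_blur_px(1.0, 4000))         # 1000Hz, one 4K width/s -> 4.0 px

# Strobed (LightBoost-style): only the pulse width counts, not the Hz.
print(motion_blur_px(1.4, 1000))         # ~1.4 ms pulse at 120Hz -> ~1.4 px
```

This is why a strobed 120Hz display can look sharper in motion than a flicker-free display at a much higher refresh rate.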

Some good references:

"Why Do Some OLEDs Have Motion Blur"
http://www.blurbusters.com/faq/oled-motion-blur/

List of links to science papers:
http://www.blurbusters.com/references/ (includes Nokia, Toshiba, Panasonic, etc)

TestUFO animations, which are very educational demonstration animations:
www.testufo.com/eyetracking -- motion blur caused by sample-and-hold
www.testufo.com/blackframes -- motion blur reduction by strobing
www.testufo.com/photo -- motion blur from panning (blurry on LCD, sharp on CRT and LightBoost)
www.testufo.com/framerates -- framerate comparison, 30fps versus 60fps (versus 120fps if using 120Hz)

id Software’s John Carmack discussed persistence and strobe backlights at QuakeCon:
http://www.youtube.com/watch?v=93GwwNLEBFg&t=5m35s
(And at 12min05sec, he describes how even a retina iPad goes blurry during fast scrolling, and that's not because of LCD pixel speed)

Forum Post: "Why We Need 1000fps @ 1000Hz This Century"
http://www.avsforum.com/t/1484182/why-we-need-1000fps-1000hz-this-century-valve-software-michael-abrash-comments

Valve Software's Michael Abrash mentions the problems of motion blur in virtual reality head-turning
http://blogs.valvesoftware.com/abrash/down-the-vr-rabbit-hole-fixing-judder
(He also mentions the benefits of a theoretical 1000Hz display)
 
