I've seen your site before; it's why I specified double-blind studies. Advocacy for a particular position doesn't make one wrong, but it does make one highly susceptible to confirmation bias. Your papers are written under the assumption the human eye is a perfect camera: if it were, things like optical illusions wouldn't exist, and "motion" pictures wouldn't move. Long before television existed, the term "motion blur" was coined because in the analog world of reality, the human eye sees blurring where none exists. When you capture a still image from a monitor and display it, it proves nothing about the perceived quality of a fast-moving series of those images.
These effects are definitely under study now -- and many double-blind studies by others already exist at lower refresh rates, but they don't account for certain variables, like how increased resolutions amplify refresh rate limitations.
For example, not many people cover the stutter-to-blur continuum; see for yourself at:
www.testufo.com/eyetracking#speed=-1 -- stare at the 2nd UFO for 30 seconds. A low-frequency sample-and-hold effect appears as stutter, while a high-frequency sample-and-hold effect appears as blur that (when tested on OLED) exactly matches the blur of a specific camera shutter speed, in eye-tracking situations.
Now, the motion blur is a function of the step per frame, when everything else is zeroed out (GtG = 0, no source-based blur, and the eye is pursuiting accurately). Next, consider that more pixels means more time for the eyes to track before the motion disappears off the edge of the screen == more time for humans to tell that the motion is less clear than a static image (a problem for VR). So: 8000 pixels/sec on an 8K display = 8 pixels of motion blur at 1000fps on 1000Hz. (A great example of how higher resolutions amplify Hz limitations -- an error margin that previous researchers neglected, because we didn't have simultaneous high resolution and high refresh rates in the past.)
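A minimal sketch of that arithmetic in Python (the function name is mine, not an established term; it assumes GtG = 0, full-persistence sample-and-hold, and accurate eye pursuit):

```python
def sample_and_hold_blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Eye-tracking motion blur on a flickerless (sample-and-hold) display:
    each frame persists for 1/Hz seconds, so the pursuing eye smears the
    static pixels across the retina by (speed x frame time) pixels."""
    return speed_px_per_s / refresh_hz

# The 8K example above: 8000 px/sec tracking at 1000 fps on a 1000 Hz display
print(sample_and_hold_blur_px(8000, 1000))  # -> 8.0 pixels of motion blur
```

Same formula, same answer regardless of resolution -- the resolution only changes how visible those 8 pixels of blur are, and for how long.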
The explanation is very simple for that screen: slow sample-and-hold vibrates like a slowly vibrating music string (a shaky string), while fast sample-and-hold vibrates like a fast-vibrating music string (a blurry string) -- assuming you actually clicked the animation link above and stared at the 2nd UFO for at least 30 seconds.
Note that the customized TestUFO animation link above (www.testufo.com/eyetracking#speed=-1) is mainly designed to be viewed on a flickerless display (aka a sample-and-hold display) for education's sake. If you use a CRT or another flicker display, the motion-blur component generally disappears, but you still see the stutter mechanics (in a different sense). Just adding this caveat, though most people will be viewing on an LCD or OLED.
Once you do many see-for-yourself tests, some of them are so self-educational that it becomes obvious a double-blind test isn't always necessary to prove 2+2=4; sometimes correctly combining multiple papers to apply existing knowledge is enough. The lighting industry did studies that resulted in standardizing the 20KHz electronic ballast (instead of 120Hz AC flicker) because of real-world stroboscopic effects.
(lighting industry paper)
Now, as an equivalent, the same problem also happens to VR flicker displays, because whole-VR flicker is the equivalent of a whole room lit by said flickering fluorescent light, creating an eye-searing or distracting stroboscopic effect (e.g. rolling your eyes during VR creates instant headaches for some people, due to the stroboscopic effect). Even people who aren't bothered or distracted by it may still notice it (a visible/noticeable VR aberration away from real life).
But yes, papers are in the works by multiple authors -- it's just that there's a several-year lag behind emergent technology; some are out now (e.g. VR papers). There are thousands of papers that need to be written and too few people available to write them, a rising problem when a lot of focus is on other things. However, with the boom of new use cases (VR), that is starting to change.
This motion blur happens because your analog-moving eyes are interacting with static, stationary frames. As your analog-moving eyes track past a frame that's statically displayed (continuously, for 1/60sec), your eye pursuit smears the static pixels across your retinas. This creates the blurring effect from the persistence of the pixels (stationary pixels that are continuously lit).
The finite refresh rate of a display, forcing display motion blur during eye tracking situations, is an aberration that is a problem for successfully passing the Holodeck Turing test (not being able to tell apart VR from real life through transparent ski goggles).
Another reason is that some of this is quite obvious in many see-for-yourself tests;
There are four different kinds of display artifacts that occur in different situations:
(A) Stationary eye, stationary image;
(B) Stationary eye, moving image (can cause stroboscopic effects);
(C) Moving eye, stationary image (can cause motion blurring or stroboscopic effects, depending on whether the display is impulsed);
(D) Moving eye, moving image (can cause motion blur).
Different blind tests and games will force you to eye-track differently. There's a gigantic number of variables -- you may not notice an artifact unless you design the test to force a specific scenario such as (D), for example by forcing you to read moving text.
A great example is to test for yourself on one of the new 240Hz OLEDs (where GtG is effectively zeroed out), for item (D), via the panning-map street-label readability test at
www.testufo.com/map
At 60Hz, you have 16 pixels of motion blur at 960 pixels/sec (1/60th of 960)
At 120Hz, you have 8 pixels of motion blur at 960 pixels/sec (1/120th of 960)
At 240Hz, you have 4 pixels of motion blur at 960 pixels/sec (1/240th of 960)
Try it --
www.testufo.com/map will not match a hand-waved paper map until the refresh rate and frame rate are very high -- high enough to retina-out ALL of items (A)(B)(C)(D) concurrently.
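The 60/120/240 figures above are the same frame-time arithmetic; a quick sketch (assuming GtG = 0 and accurate eye pursuit on a sample-and-hold display):

```python
# Sample-and-hold motion blur at a fixed eye-tracking speed: each doubling
# of refresh rate (with matching frame rate) halves the blur.
speed_px_per_s = 960
for refresh_hz in (60, 120, 240):
    blur_px = speed_px_per_s / refresh_hz
    print(f"{refresh_hz:3d} Hz -> {blur_px:4.1f} px of motion blur")
# -> 16.0 px at 60 Hz, 8.0 px at 120 Hz, 4.0 px at 240 Hz
```

Note the geometric nature: going from 240Hz to 480Hz only removes 2 more pixels of blur, which is why the curve of diminishing returns needs big jumps.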
That would require approaching infinite frame rate at infinite Hz (or at least the human noise floor). We can't do that, so a sufficiently high frame rate at a sufficiently high refresh rate has to suffice. A 1080p display will have a lower retina refresh rate (e.g. 1000Hz, or pick a number) than a 4K display will, because of the higher static resolution versus unchanged temporal resolution (at a fixed physical motion speed, e.g. inches/sec, millimeters/sec, or milliradians/sec -- whatever physical unit you choose, as long as the pixels are still visible).
Some people don't find a 2x improvement all that stunning, but 4x and 8x improvements are far more dramatic (the VHS-vs-4K wow effect, rather than the 720p-vs-1080p meh effect). That's why you need to jump 60Hz -> 120Hz -> 1000Hz, for the dramatics up the geometric curve of diminishing returns.
Today, with the newly released 240Hz OLEDs, you can compare 60Hz vs 120Hz vs 240Hz at
www.testufo.com -- the differences are much clearer on a 240Hz OLED than on a 240Hz LCD, if you try it yourself: double the frame rate is exactly half the motion blur, if GtG is pushed below the human visibility noise floor. And faster motion (especially at higher PPI / higher angular pixel densities) makes it easier to see. For example, 8K at 8000 pixels/sec still has 8 pixels of motion blur at 1ms pixel visibility time (either via 1ms impulsing, or via 1000fps at 1000Hz).
Retina refresh rate in the most extreme situation (roughly 16K 180-degree VR) is now projected by some current researchers to be in the quintuple digits if you completely avoid impulsing -- leaving no weak links, i.e. ANY weak link (blurs, stroboscopics) that makes it impossible for VR to perfectly match real life as a practical application. A short write-up about this is written at these:
It requires extremely high refresh rates to read 6-point text in panning maps, or to read text mid-scroll. This is unimportant for small screen sizes, but gigantically important for wide-FOV VR screen sizes, since things scroll over longer distances (or a head turn = screen panning), giving you more time to notice the difference between static resolution and motion resolution.
At 1024x768 VGA, 2000 pixels/sec of motion is so fast that you can't notice the difference between a stationary image and a moving image. But on an 8K VR headset, it takes about 4 seconds for 2000 pixels/sec motion to scroll from one edge to the other -- a super slow head turn.
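The crossing-time claim is just screen width divided by motion speed; a tiny sketch (the 7680 px width for 8K is my assumption of a nominal horizontal resolution):

```python
# More horizontal pixels = more seconds for the eye to track the motion
# before it leaves the screen, so blur has more time to become noticeable.
speed_px_per_s = 2000
for name, width_px in (("1024x768 VGA", 1024), ("8K (7680 px wide)", 7680)):
    seconds = width_px / speed_px_per_s
    print(f"{name}: {seconds:.2f} s edge-to-edge")
# -> about 0.51 s for VGA, 3.84 s (~4 s) for 8K
```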
However, the motion blur differences are pretty stark in virtual reality. With wider displays and more pixels per angular unit, you have more time to analyze a moving image to tell whether it's blurrier than a static image (like
www.testufo.com/map, except it takes 8 seconds to scroll the screen width of an 8K display at 100% DPI zoom). At some point the angular resolving resolution becomes retina, and further resolution improvements have no further benefit. However, 4K makes Hz-vs-Hz differences dramatically easier to see, as does going LCD->OLED, so a 4K OLED has an extremely high retina refresh rate -- even more so if it's wide-FOV (such as VR).
Motion blur creates headaches in VR, because it is extra motion blur above and beyond natural human vision. That's why current VR headsets flicker briefly: the Valve Index and Oculus Quest 2 have to strobe at 0.3ms MPRT (3 pixels of motion blur per 10,000 pixels/sec of motion). But real life does not strobe, and some people get eyestrain from PWM (PWM-dimming headaches, or nausea from stroboscopic effects like a dance strobe light). VR has to flicker because it's the lesser poison versus motion blur on a giant-FOV immersion display such as VR. To achieve 0.3ms of motion blur without flashing the display requires 3333fps at 3333Hz, to match the motion blur of 0.3ms flashing.
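That strobe-vs-frame-rate equivalence is simple arithmetic; a hedged sketch (function names are mine; MPRT here means pixel visibility time per frame):

```python
def blur_px_from_mprt(mprt_ms: float, speed_px_per_s: float) -> float:
    """Motion blur in pixels = pixel visibility time (MPRT) x tracking speed."""
    return (mprt_ms / 1000.0) * speed_px_per_s

def strobeless_hz_for_mprt(mprt_ms: float) -> float:
    """Sample-and-hold refresh rate (with matching frame rate) whose
    persistence equals the motion blur of a given strobe pulse width."""
    return 1000.0 / mprt_ms

# 0.3 ms strobing at 10,000 px/sec -> ~3 px of blur
print(round(blur_px_from_mprt(0.3, 10_000), 3))   # -> 3.0
# Matching that blur without any flicker needs ~3333 fps at ~3333 Hz
print(round(strobeless_hz_for_mprt(0.3)))         # -> 3333
```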
Oh yes, academic papers usually lag a few years behind technological innovations. Give new papers time to come out. Simultaneous high resolution AND high refresh rate was unobtainium in the past, so researchers didn't quite realize that higher resolutions (at GtG=0) amplify refresh-rate visibility.
Viewpixx does sell a 1440Hz vision-research projector. And the motion blur mathematics is already stupidly simple when GtG is zeroed out.
I do suggest borrowing time on somebody's 240Hz monitor and configuring some of the thousands of TestUFO tests (custom arguments, etc.) for a bunch of see-for-yourself comparison combinations. The magic of web-based see-for-yourself tests is pretty handy for demonstrating resolution differences as they pertain to Hz-identifiability, as well as LCD-vs-OLED differences and the geometric curves.
I already do classrooms -- teaching display companies, game vendors, and university researchers, to help them figure out new topics for later double-blind tests. They're coming, but it will take a few more years, because the tech only just arrived -- it was hard to simultaneously do high Hz and high resolution, and not everyone realized the role of VR and field of view.
Now consider the multiple different outcomes;
(1) Can't see it;
(2) Seeing it but not caring about it;
(3) Seeing it but bothered by it;
This varies with many variables (eye pursuit or not, display motion or not, specific content that encourages a specific gaze behavior, resolution, field of view, the specific human's maximum eye-tracking speed, full immersion like VR or not, etc.)
Many games force you to stare at a crosshair (e.g. CS:GO), so you don't see item (D) (moving eye, moving image). But wearing a VR headset, you will stare (gaze) at something while you head-turn, and head turns = screen motion blur from the finite frame rate. That's why headsets are forced to impulse: to prevent the sudden motion sickness/headaches caused by the blatant amount of motion blur forced on the eyes above and beyond natural human vision.
So some applications require it far more than others; it's very hard to five-sigma a VR headset into being indistinguishable from real life. Many people are completely unable to use VR because of the flicker, even for non-rollercoaster applications (e.g. standing on a virtual beach), even though the new VR headsets now deliver vastly more comfortable 3D than Real3D cinema glasses, thanks to the ergonomic innovations done thus far (as long as you avoid the rollercoaster apps and stick to 1:1 real:VR movement-sync apps, where even head tilts are synced -- even naturally looking under a virtual computer desk -- completely eliminating the vertigo element from the VR-discomfort line items when sticking to comfortable-rated apps).
Still, the industry's decision to make VR headsets flicker (to reduce motion blur) expanded the VR market far beyond what it would otherwise have been, because motion blur during VR was the worse of the two poisons. Solving stroboscopics, flicker, and blur all at once, in lab testing, requires extremely high refresh rates (well into the quadruple digits at least) to reduce the number of people nauseated by VR. There are a lot of papers now by others about VR nausea and its non-vertigo-related causes, such as display-tech limitations.
Much respect to the ongoing work by current researchers and the ongoing double-blind tests; but I wanted to make sure you're aware that you can see for yourself too, with the thousands of customizable TestUFO tests, such as this parameterized
TestUFO Variable Pulse Width BFI Motion Blur Test designed for 240Hz displays (it flickers too much on 60Hz displays). This is exactly why I created TestUFO as a teaching tool; even Samsung uses TestUFO (
https://doi.org/10.1002/sdtp.14002 -- Samsung cites me).
Obviously, thousands more research papers need to be written on all the various nitty line items (like, say, a specific function of a specific display, or how a specific human's maximum accurate eye-tracking speed changes the threshold of a retina refresh rate beyond which there is no further return for humankind). Not enough lifetime, ha. I have to cherry-pick research priorities, so I encourage other researchers to take up these subtopics and research the [bleep] out of them in double-blind tests.
Motion blur is good when it's comfortable (adding source-based motion blur can prevent motion sickness at low frame rates; that's why a GPU blur effect is available in some games for those who get motion-sick without it).
But motion blur is bad when it's a genuine problem (i.e. VR trying to emulate real life), forcing VR to look different from real life and creating a motion-sickness effect instead of reducing it.
Either way, tons of use cases for my work; some unimportant (enjoying 24fps Hollywood Filmmaker Mode) and some important (e.g. future Holodecks).