News Most-Anticipated Gaming Monitors of 2023: 500 Hz, OLED, Wide Screen


PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
When manufacturers talk about the millions and billions of colors a TV can show, remember this: while studies vary, the estimates are that humans can see around 1 to 2 million individual colors. One study suggested humans could see 10 million colors, but it was an outlier among studies.

Let's play it safe and say humans can see anywhere from 1 to 10 million colors.

The sRGB gamut stored in a 3-channel, 8-bit container, a format we have been using since the 90s, can represent 16.7 million individual colors. That is more than enough to cover such a wide range of colors that any wider color gamut has little to no impact on image quality.
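Where that 16.7 million figure comes from is just container arithmetic; a back-of-the-envelope sketch (nothing vendor-specific is assumed here):

```python
# Distinct code values in a 3-channel, 8-bit (24-bit) container,
# compared against the rough 1-10 million colors humans can distinguish.
bits_per_channel = 8
channels = 3
code_values = (2 ** bits_per_channel) ** channels  # 256 * 256 * 256
print(f"{channels} x {bits_per_channel}-bit container: {code_values:,} code values")  # 16,777,216
print(f"Estimated human range: {1_000_000:,} to {10_000_000:,} colors")
```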

When tech companies claim they can show X percentage of the sRGB gamut, that's valuable information. But when Samsung/Apple/LG/Asus tell you that they can show millions more colors in exotic gamuts, it's marketing baloney.

I love OLED for the deep blacks, the great viewing angles, and the local accuracy without light bleed. I couldn't care less about how many colors OLED can show or the stupid HDR marketing; displays have been able to show more than enough colors for decades.
 

thisisaname

Distinguished
Feb 6, 2009
I wonder if any of today's graphics cards can even generate game data fast enough to create 500 frames a second in 4K. Does HDMI even have the bandwidth to do 4K at 500Hz, or do you have to drop down to 1080p to get 500Hz?
 
I wonder if any of today's graphics cards can even generate game data fast enough to create 500 frames a second in 4K. Does HDMI even have the bandwidth to do 4K at 500Hz, or do you have to drop down to 1080p to get 500Hz?
Sure they can, if all you care about is the number. First they generate the fake frames using DLSS, then they subsample back to 4:2:0 rather than 4:4:4, and then you just use DSC data compression. Magic high frame numbers... nobody cares if they are blurry, do they :)
 
It's all good for lit-up, bright scenes that look even shinier and showcase all the brightness. But I really, really dislike HDR when it comes to the mixture of dark scenes with moving and appearing/disappearing lights. It becomes almost unwatchable.

I posted a while ago here to check if there was a solution, but no luck, and I ended up disabling HDR on my monitor. Here's a little video I recorded with my phone showing how bad it looks on my monitor.
https://streamable.com/nxda0d
Yup. That's a backlit LCD display.
 
You're confusing color depth with what is called "color gamut".

Color depth is simply a container. You could use it to store individual colors, greyscale values, or just one color. You cannot determine the color gamut a screen can display by looking at the color depth it supports.



SDR is a "sunray blur" and HDR a "realistic shade" is it? The ambiguous nature of what HDR is supposed to be is just one of its many problems.

99% of the content you watch on a monitor is in the sRGB color gamut, which is easily understood by any monitor. It has enough colors to depict pretty much anything. We've used it for years, it works incredibly well, and it is a well-understood gamut for anyone in the imaging business.

I can pick two random sRGB monitors, calibrate them, and they will look pretty damn similar.

HDR is not like this, because it needs to rely on so-called wide color gamuts like DCI-P3 and plenty of others that all want to become the "HDR standard". No one agrees on what HDR should look like. The so-called HDR standards are not standards at all; they are ambiguous. How you translate color seen by a spectrometer into data for an HDR gamut is a free-for-all among companies. And since printed images have limited gamuts (that's why we have blackpoint compensation), they cannot be used as a reference for HDR.

I can pick two random HDR screens, try to calibrate them, and the colors will look nothing alike.

Then there's the question of how you create this "HDR" image. If you want a higher dynamic range, the only way to do it is to make monitors incredibly bright, and to get there you need to use blue light, because only blue light carries enough energy to do this. But then you run into color inaccuracies, color temperatures that are off, Samsung HDR that suffers from color fringing, and so on. You trade color accuracy for HDR.

Then there's the question of whether all that blue light is even healthy. We know blue LEDs are unhealthy for the retina; we know they damage it. And now companies are making very high-nit HDR screens where the bright light comes from blue light. Regardless of whether you run that blue light through a filter or "quantum dots" to change its hue, it still carries the energy of blue light and is potentially damaging people's retinas.

I personally go out of my way to avoid HDR. Both HDR content and HDR screens can go take a hike. Give me a good OLED and I'm happy. I don't need a 2,000-nit screen; you don't go stare into the bright sky in real life either. High-energy light using the blue wavelength as a carrier damages the retina; that's why fishermen have so many eye problems.
Just so much wrong.

Give these a read. https://www.unravel.com.au/understanding-bit-color-depth


 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022

Did you read your own link? Literally in the first sentence:

"If a color space specifies a range of available color intensities (gamut), then bit depth defines how accurate we can be with those specifications "

It is the color gamut that determines which colors we are working with. The color depth (or bit depth, if you wish) is nothing more than a container.

Color depth and color gamut are completely different things: one is a data container, the other is the range of hues that defines the pool of colors, or gamut, we are working with.

And it's important to realise these are not the same, because color depth says nothing about the color gamut we are using. To give an extreme example, e-ink monitors and readers accept 8-bit JPEG files. But e-ink displays do not work in the sRGB color gamut JPEG uses when they display content; they work in either a purely greyscale gamut or a very limited color gamut that looks more like a fresco on plaster than a magazine print. I love e-ink, but the data format and bit depth these e-readers use say absolutely nothing about the color gamut they use.

"A screen is 10-bit and therefore displays X amount of colors" is wrong. I can assure you that monitors supporting 10-bit color depth do not display anywhere close to the 1.07 billion colors suggested in the thread. I would be surprised if any monitor can even display a tenth of that amount.

A Blu-ray disc can hold an HD 1080p image, but it doesn't have to. There are plenty of Blu-ray discs with content that is not in HD. The Blu-ray format is nothing more than a data container and says nothing about the content, resolution or color gamut on the disc. I could make a black-and-white screen and use binary16 with 40 different channels as my color depth. I could also build a giant bookstore with space for 10,000 books yet have only 1 book on the shelves.
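To put the container-versus-content point in code, here is a toy sketch (hypothetical values only, not a claim about any particular panel):

```python
# Bit depth only sets the capacity of the container; it says nothing about
# which colors (the gamut) the stored code values are mapped to.
def container_capacity(bits_per_channel: int, channels: int = 3) -> int:
    """Number of distinct code values the container can represent."""
    return (2 ** bits_per_channel) ** channels

print(f"8-bit RGB container:  {container_capacity(8):,} code values")   # 16,777,216
print(f"10-bit RGB container: {container_capacity(10):,} code values")  # 1,073,741,824

# A greyscale image stored in that 8-bit RGB container uses at most 256
# distinct colors, no matter how large the container is.
greyscale_pixels = [(v, v, v) for v in range(256)]
print(f"Greyscale content in an RGB container: {len(set(greyscale_pixels)):,} distinct colors")
```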
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022

Pantone is a system used by designers. I use a Pantone guide to make sure the people I work with are in agreement that the color we are talking about is the same across the board. And you can do this with Pantone swatches since they are accurately printed swatches which you can check with a spectrophotometer.

If we tried that with screens we would all be looking at a potentially different color since no two screens are calibrated the same.

But Pantone is completely unscientific and useless when talking about displays. Pantone swatches are subtractive color where organic and inorganic inks are used to create a certain color. Monitors use an RGB system where additive color produces light, e-ink being the only exception. Pantone is completely unrelated to how monitors work.
 
Jan 14, 2023
I see there's a lot of scepticism towards high refresh rate displays. Check out Blur Busters to better understand why we need high refresh rate monitors.
We don't need them. Most people buy them because some YouTuber told them it's better or because pro gamers do. You aren't going to notice the lack of blur on a 500Hz monitor when you're actually playing games. This is just a way to get someone who bought a 240Hz monitor last year to "upgrade".
 

Endymio

Reputable
BANNED
Aug 3, 2020
Capitalism doesn’t give you what you need. They advertise giving you what you need but they’re just trying to sell you something. You don’t have to buy it and you don’t need it. Embrace, [sic] minimalism, and be happier.
Capitalism is provably the best system yet tried for allocating resources among a population and maximizing human happiness. Your attempt to divide the players in a capitalist society into an artificial, antagonistic "us" versus "them" is off target. If you truly believe in your minimalism, why not emigrate to a non-capitalist country like Cuba or North Korea for a few years, then report back to us?
 

Endymio

Reputable
BANNED
Aug 3, 2020
Human vision sensitivity has been verified scientifically out to about 800hz. ... Human vision may have its limits, but it operates relatively continuously rather than discretely, which means you can potentially take in information as soon as it is present on the screen.
Neurons take time to fire, just as transistors take time to switch. Nothing in the human body works instantaneously. I'd be willing to bet money that, in any properly constructed double-blind study, no one can tell the difference between 240 and 360Hz, much less 500.
 
D

Deleted member 14196

Guest
Capitalism is provably the best system yet tried for allocating resources among a population and maximizing human happiness. Your attempt to divide the players in a capitalist society into an artificial, antagonistic "us" versus "them" is off target. If you truly believe in your minimalism, why not emigrate to a non-capitalist country like Cuba or North Korea for a few years, then report back to us?
Yeah, yeah. I just meant that advertising always advertises a need, but they never deliver on the need.
They're just trying to sell you some product. I shouldn't have put the word capitalism in there, but under capitalism there is a lot of advertising.

These Tom's Hardware articles are nothing but advertisements.
 

husker

Distinguished
Oct 2, 2009
When manufacturers talk about the millions and billions of colors a TV can show, remember this: while studies vary, the estimates are that humans can see around 1 to 2 million individual colors. One study suggested humans could see 10 million colors, but it was an outlier among studies.

Let's play it safe and say humans can see anywhere from 1 to 10 million colors.

The sRGB gamut stored in a 3-channel, 8-bit container, a format we have been using since the 90s, can represent 16.7 million individual colors. That is more than enough to cover such a wide range of colors that any wider color gamut has little to no impact on image quality.

When tech companies claim they can show X percentage of the sRGB gamut, that's valuable information. But when Samsung/Apple/LG/Asus tell you that they can show millions more colors in exotic gamuts, it's marketing baloney.

I love OLED for the deep blacks, the great viewing angles, and the local accuracy without light bleed. I couldn't care less about how many colors OLED can show or the stupid HDR marketing; displays have been able to show more than enough colors for decades.
Yes, as long as the millions of colors a monitor can produce fully overlap with the millions of colors the human eye can detect. There is no guarantee that they do. In that case, you may need a monitor that can produce all <insert insanely large number here> colors to make sure the monitor fully covers human vision.
 

Chung Leong

Reputable
Dec 6, 2019
Neurons take time to fire, just as transistors take time to switch. Nothing in the human body works instantaneously. I'd be willing to bet money that, in any properly constructed double-blind study, no one can tell the difference between 240 and 360Hz, much less 500.

People don't even see the guy in a gorilla suit in the middle of the screen, beating on his chest.

 

Madkog

Distinguished
Nov 27, 2016
Looks like some pretty fun monitors coming out.

Ultrawide is wide enough for me. Not sure about the 800R curve on the one presented; I'm used to a 1000R curve. I like a subtle, non-aggressive curve in my monitors.

Future-proof, I guess, if the OLED does not burn in. Maybe two to three gens' worth of GPU upgrades if you go top tier.
 

mdrejhon

Distinguished
Feb 12, 2008
Neurons take time to fire, just as transistors take time to switch. Nothing in the human body works instantaneously. I'd be willing to bet money that, in any properly constructed double-blind study, no one can tell the difference between 240 and 360Hz, much less 500.
There are blind studies -- but you need GtG=0ms as well as massive geometric differences. With some blind variables, like a panning-map street-label readability test (e.g. map scrolling at 1000+ pixels/sec), 240Hz-vs-1000Hz is now shown to be more visible than 144Hz-vs-240Hz, but it required a very dramatic jump up the diminishing curve of returns.

I am now in over 25 peer-reviewed research papers (purple "Research" tab at the top of Blur Busters).

240Hz-vs-360Hz is a 1.5x difference throttled to only a 1.1x difference due to LCD GtG problems (amongst other issues). But with OLED, the ballgame changes, and 2x-to-4x Hz differences are very visible.

There are multiple indirect visibility effects of refresh rates.

The easiest scientific way to explain human visibility is via these two images:

1. Stroboscopic Effect
[Image: project480 mouse-arrow stroboscopic-effect demo]

2. Display Motion Blur
At GtG=0ms, the blur difference between 240fps at 240Hz and 1000fps at 1000Hz is the same as a 1/240sec photograph versus a 1/1000sec photograph.
[Image: display persistence blur equivalence to camera shutter]
[Image: motion blur from persistence on sample-and-hold displays]


If you're interested in reading more, check the textbook reading at Blur Busters Area 51: Display Science, Research & Engineering -- it also has the link to my citations in papers on Google Scholar.

This is also very important if you want low-persistence sample-and-hold. Motion blur is pixel visibility time. The Oculus Quest 2 and the Valve Index strobe at 0.3ms flashes per frame. You'd need 3333fps at 3333Hz to get the same motion clarity as 0.3ms strobing.

There is a long diminishing curve of returns, and at some point you need 4x refresh rate differences (AND a pixel response speed of nearly 0ms) to really tell the difference. Keep in mind that 60Hz-vs-120Hz is an 8.3ms blur difference, and 120Hz-vs-1000Hz is a 7.3ms blur difference (assuming sample-and-hold, no strobing, instant pixel response, and framerate=Hz).
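A rough sketch of the persistence arithmetic behind those numbers (assuming sample-and-hold, framerate=Hz, and instant pixel response, as stated above):

```python
# Sample-and-hold persistence: each frame stays on screen for 1/Hz seconds,
# which acts like a camera "shutter time" while the eye tracks motion.
def persistence_ms(hz: float) -> float:
    return 1000.0 / hz

for low, high in [(60, 120), (120, 1000), (240, 1000)]:
    diff = persistence_ms(low) - persistence_ms(high)
    print(f"{low}Hz vs {high}Hz: {diff:.1f} ms blur difference")  # 8.3, 7.3, 3.2

# Strobing equivalence: a 0.3 ms flash per frame has the same pixel visibility
# time as roughly 1000 / 0.3 = 3333 fps at 3333 Hz sample-and-hold.
print(f"0.3 ms strobe ~ {1000 / 0.3:.0f} fps/Hz sample-and-hold")
```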
 

mdrejhon

Distinguished
Feb 12, 2008
Pushing to the Extreme for the 0.1% of people who'll make use of "500Hz" monitors.
On the other hand, completely useless for regular consumers and organizations.
See above diagrams.

Just like 4K was $10,000 in 2001 and is now a $299 Walmart sale -- as long as it's cheap, high Hz has lots of mainstream benefits. Even Apple is prototyping 240Hz OLEDs for later this decade.
 

zecoeco

Prominent
BANNED
Sep 24, 2022
See above diagrams.

Just like 4K was $10,000 in 2001 and is now a $299 Walmart sale -- as long as it's cheap, high Hz has lots of mainstream benefits. Even Apple is prototyping 240Hz OLEDs for later this decade.
It's true. But for now, I can't see any real-world application for a 500Hz monitor in the consumer field. Not even the fastest GPU on the planet can give you a stable 500fps in AAA games at 1080p with high settings.

Within the next 10 years, however, it is highly possible.
 

Endymio

Reputable
BANNED
Aug 3, 2020
If you're interested in reading more, check the textbook reading at Blur Busters...
I've seen your site before; it's why I specified double-blind studies. Advocacy for a particular position doesn't make one wrong, but it does make one highly susceptible to confirmation bias. Your papers are written under the assumption the human eye is a perfect camera: if it were, things like optical illusions wouldn't exist, and "motion" pictures wouldn't move. Long before television existed, the term "motion blur" was coined because in the analog world of reality, the human eye sees blurring where none exists. When you capture a still image from a monitor and display it, it proves nothing about the perceived quality of a fast-moving series of those images.
 
Pantone is a system used by designers. I use a Pantone guide to make sure the people I work with are in agreement that the color we are talking about is the same across the board. And you can do this with Pantone swatches since they are accurately printed swatches which you can check with a spectrophotometer.

If we tried that with screens we would all be looking at a potentially different color since no two screens are calibrated the same.

What standard does one use to calibrate a monitor?
 

mdrejhon

Distinguished
Feb 12, 2008
I've seen your site before; it's why I specified double-blind studies. Advocacy for a particular position doesn't make one wrong, but it does make one highly susceptible to confirmation bias. Your papers are written under the assumption the human eye is a perfect camera: if it were, things like optical illusions wouldn't exist, and "motion" pictures wouldn't move. Long before television existed, the term "motion blur" was coined because in the analog world of reality, the human eye sees blurring where none exists. When you capture a still image from a monitor and display it, it proves nothing about the perceived quality of a fast-moving series of those images.
They're definitely under study now -- and many double-blind studies by others already exist at lower refresh rates, but they don't account for certain variables, like how increased resolutions amplify refresh rate limitations.

For example, not many people cover the stutter-to-blur continuum; see for yourself at www.testufo.com/eyetracking#speed=-1 -- stare at the 2nd UFO for 30 seconds. A low-frequency sample-and-hold effect appears as stutter, while a high-frequency sample-and-hold effect appears as blur that (when tested on OLED) exactly matches the blur of a specific camera shutter speed in eye-tracking situations.

The motion blur is a function of the step per frame, once everything else is zeroed out (no GtG, no source-based blur, and the human is eye-tracking accurately). Consider that more pixels means more time for the eyes to track before the motion disappears off the edge of the screen, which means more time for humans to tell that the motion is less clear than a static image (a problem for VR). So 8000 pixels/sec on an 8K display = 8 pixels of motion blur at 1000fps 1000Hz -- a great example of how higher resolutions amplify Hz limitations, an error margin previous researchers neglected because we didn't have simultaneous high resolution and high refresh rates in the past.

The explanation for that test is very simple: slow sample-and-hold vibrates like a slowly vibrating music string (a shaky string), while fast sample-and-hold vibrates like a quickly vibrating music string (a blurry string) -- if you actually click the animation link above and stare at the 2nd UFO for at least 30 seconds to notice it.

Note that the customized TestUFO animation link above (www.testufo.com/eyetracking#speed=-1) is mainly designed to be viewed on a flickerless display (i.e. a sample-and-hold display) for education's sake. If you use a CRT or another flickering display, the motion blur component generally disappears, but you still see the stutter mechanics (in a different sense). Just adding this caveat, though most people will be viewing on an LCD / OLED.

Once you do many see-for-yourself tests, some of them are so self-educational that it becomes obvious it is not always necessary to write a double-blind test about why 2+2=4, but rather to correctly combine multiple papers and apply existing knowledge; the lighting industry did studies that resulted in standardizing the 20kHz electronic ballast (instead of 120Hz AC flicker) because of real-world stroboscopic effects.

[Image: stroboscopic detection chart (lighting industry paper)]

As an equivalent, the same problem also happens to VR flicker displays, because whole-VR flicker is the equivalent of a whole room being lit by that flickering fluorescent light, creating an eye-searing or distracting stroboscopic effect (e.g. rolling your eyes during VR creates instant headaches for some people due to the stroboscopic effect). Even if people don't get bothered or distracted by it, there are also those who notice (and it's still a visible VR aberration away from real life).

But yes, papers are in the works by multiple authors -- it's just that there's a several-year lag behind emergent technology; some are out now (e.g. VR papers). There are thousands of papers that need to be written and too few people available to write them, and that's a rising problem when a lot of focus is on other things. However, the boom of new use cases such as VR is starting to change that.

This motion blur happens because your analog-moving eyes are interacting with static, stationary frames. As your eyes track past a frame that's statically displayed (continuously, for 1/60sec), the action of your eye pursuit smears the static pixels across your retinas. This creates the blurring effect from the persistence of the pixels (stationary pixels that are continuously lit).

The finite refresh rate of a display, forcing display motion blur during eye tracking situations, is an aberration that is a problem for successfully passing the Holodeck Turing test (not being able to tell apart VR from real life through transparent ski goggles).

Another reason is that some of this is quite obvious in many see-for-yourself tests.

There are four different kinds of display artifacts that occur in different situations:

(A) Stationary eye, stationary images;
(B) Stationary eye, moving images (can cause stroboscopics);
(C) Moving eye, stationary images (can cause motion blurring or stroboscopics depending on if display is impulsed);
(D) Moving eye, moving images (can cause motion blur);

Different blind tests and games will force you to eye-track differently. There are a gigantic number of variables -- you may not notice unless you design the test to force a specific scenario such as (D), like forcing you to read moving text.

A great example is to test for yourself on one of the new 240Hz OLEDs (where GtG is effectively zeroed out), for item (D), with the panning-map street-label readability test at www.testufo.com/map:
At 60Hz, you have 16 pixels of motion blur at 960 pixels/sec (1/60th of 960)
At 120Hz, you have 8 pixels of motion blur at 960 pixels/sec (1/120th of 960)
At 240Hz, you have 4 pixels of motion blur at 960 pixels/sec (1/240th of 960)
Try it -- www.testufo.com/map will not match a hand-waved paper map until the refresh rate and frame rate are very high -- enough to retina-out ALL of items (A)(B)(C)(D) concurrently.

Unless it approaches infinite frame rate at infinite Hz (or at least the human noise floor). But we can't do that, so we need a sufficiently high frame rate at a sufficiently high refresh rate. A 1080p display will have a lower retina refresh rate (e.g. 1000Hz, or pick a number) than a 4K display will, because of the higher static resolution versus unchanged temporal resolution (at the same physical motion blur for the same physical motion speed, e.g. inches/sec or millimeters/sec or milliradians/sec or whatever physical unit you choose, as long as the pixels are still visible). A quick sketch of the arithmetic follows below.
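Under those same assumptions (sample-and-hold, framerate=Hz, GtG=0, accurate eye tracking), the pixels-of-blur numbers above reduce to a single division:

```python
# Tracked motion smears across the retina by (motion speed) x (pixel visibility time).
def blur_pixels(speed_px_per_sec: float, hz: float) -> float:
    return speed_px_per_sec / hz

for hz in (60, 120, 240, 1000):
    print(f"{hz:>4}Hz @  960 px/s: {blur_pixels(960, hz):5.1f} px of motion blur")  # 16, 8, 4, ~1

# Higher resolutions amplify the limitation: an 8K-wide pan at 8000 px/s
# still leaves 8 px of blur even at 1000 fps on a 1000 Hz display.
print(f"1000Hz @ 8000 px/s: {blur_pixels(8000, 1000):5.1f} px of motion blur")
```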

Some people don't find 2x all that stunning, but 4x and 8x improvements are far more dramatic (the VHS-vs-4K wow effect, rather than the 720p-vs-1080p meh effect). That's why you need to jump 60Hz-120Hz-1000Hz for the dramatic steps up the geometric curve of diminishing returns.

Today, with the newly released 240Hz OLEDs, you can compare 60Hz vs 120Hz vs 240Hz at www.testufo.com -- the differences are much clearer on a 240Hz OLED than on a 240Hz LCD if you try it yourself; double the frame rate is exactly half the motion blur, if GtG is pushed below the human visibility noise floor. And faster motion (especially at higher PPI / higher angular pixel densities) makes it easier. For example, 8K at 8000 pixels/sec still has 8 pixels of motion blur at 1ms pixel visibility time (either via 1ms impulsing, or via 1000fps 1000Hz).

The retina refresh rate in the most extreme situation (roughly 16K 180-degree VR) is now projected by some current researchers to be in the quintuple digits if you completely avoid impulsing and no weak links remain -- i.e. any weak link (blurs, stroboscopics) makes it impossible for VR to perfectly match real life as a practical application. A short write-up about this is written at these links:

It requires extremely high refresh rates to read 6-point text in panning maps or to read text mid-scroll. This is unimportant for small screen sizes, but gigantically important for wide-FOV VR screen sizes, since you have more time for things to scroll over longer distances (or a head turn = screen panning) and to notice the difference between static resolution and motion resolution.

At 1024x768 VGA, 2000 pixels/sec of motion would be so fast that you can't notice the difference between a stationary image and a moving image. But for an 8K VR headset, it takes 4 seconds for 2000 pixels/sec to scroll from one edge to the other -- that's a super slow head turn.

However, the motion blur differences are pretty stark in virtual reality. With wider displays and more pixels per angular degree, you have more time to analyze a moving image to tell whether it's blurrier than a static image (like www.testufo.com/map, except it takes 8 seconds to scroll the screen width of an 8K display at 100% DPI zoom). At some point, the angular resolving resolution becomes retina, and further resolution improvements have no further benefit. However, 4K makes Hz-vs-Hz differences dramatically easier to see, as does going LCD->OLED, so a 4K OLED would have an extremely high retina refresh rate -- even more so if it's wide-FOV (such as VR).

Motion blur creates headaches in VR because it is extra motion blur above and beyond natural human vision. That's why current VR headsets flicker briefly. This is why the Valve Index and the Oculus Quest 2 have to strobe at 0.3ms MPRT (3 pixels of motion blur per 10,000 pixels/sec of motion). But real life does not strobe, and some people get eyestrain from PWM (PWM dimming headaches, or nausea from the stroboscopic effects, like a dance strobe light). VR has to flicker anyway, because it is the lesser poison compared to motion blur on a giant-FOV immersive display. To get 0.3ms of motion blur in VR without flashing the display requires 3333fps at 3333Hz, which has the same motion blur as 0.3ms flashing.

Oh yes, academic papers usually lag a few years behind technological innovations. Give new papers time to come out. Simultaneous high resolution AND high refresh rate was unobtainium in the past, so researchers didn't quite realize that higher resolutions (at GtG=0) amplify refresh rate visibility.

Viewpixx does sell a 1440Hz vision-research projector. The motion blur mathematics is already stupidly simple once GtG is zeroed out.

I do suggest finding somebody with a 240Hz monitor and configuring one of the thousands of TestUFO tests (custom arguments, etc.) for a bunch of see-for-yourself comparisons. The magic of web-based see-for-yourself tests is pretty handy for demonstrating resolution differences as they pertain to Hz-identifiability, as well as LCD-vs-OLED differences and the geometric curves.



I already do classrooms -- teaching display companies, game vendors, and university researchers, helping them figure out new topics for later double-blind tests. They're coming, but it will take a few more years, because the tech only just arrived -- it was hard to simultaneously do high Hz and high resolution, and not everyone realized the role of VR and field of vision.

Now consider the multiple different outcomes:
(1) Can't see it;
(2) Seeing it but not caring about it;
(3) Seeing it and being bothered by it.

This can vary with many variables (eye pursuit or not, display motion or not, specific content that encourages a specific eye-gaze behavior, resolution, field of vision, the specific human's maximum eye-tracking speed, full immersion like VR or not, etc.).

Many games force you to stare at the crosshairs (e.g. CS:GO), so you don't see item (D) (moving eye, moving image). But wearing a VR headset, you will stare (gaze) at something while you turn your head, and head turns = screen motion blur from the finite frame rate. That's why headsets are forced to impulse: to prevent the sudden motion sickness/headaches caused by the blatant amount of motion blur forced on the eyes above and beyond natural human vision.

So some applications require it far more than others; it's very hard to five-sigma a VR headset to be indistinguishable from real life. Even so, many people are completely unable to use VR because of the flicker, even for non-rollercoaster applications (e.g. standing on a virtual beach), and even though the new VR headsets are now vastly more comfortable for 3D than Real3D cinema glasses, thanks to the ergonomic innovations done thus far (as long as you avoid the rollercoaster apps and stick to 1:1 real-to-VR movement-sync apps, where even head tilts are synced, even naturally looking under a virtual computer desk, and all that -- completely eliminating the vertigo element from the VR discomfort line items when sticking to the comfort-rated apps).

But the fact that the industry made VR headsets flicker (to reduce motion blur) still expanded the VR market far beyond what it would otherwise be, because motion blur during VR was the worse of the two poisons. Solving stroboscopics, flicker, and blur in lab testing requires extremely high refresh rates (well into the quadruple digits at least) to reduce the number of people who are nauseated by VR. There are a lot of papers now by others about VR nausea and the (non-vertigo-related) problems such as display tech limitations.

Much respect to the ongoing work by current researchers and the ongoing double-blind tests; but I wanted to make sure you are aware that you can see for yourself too, with the thousands of customizable TestUFO tests, such as this parameterized TestUFO Variable Pulse Width BFI Motion Blur Test designed for 240Hz displays (it flickers too much on 60Hz displays). This is exactly why I created TestUFO, as a teaching tool; even Samsung uses TestUFO (https://doi.org/10.1002/sdtp.14002 -- Samsung cites me).

Obviously, thousands more research papers need to be written on all the various nitty-gritty line items (like, say, a specific function of a specific display, or how a specific human's maximum accurate eye-tracking speed changes the threshold of a retina refresh rate beyond which there is no further return for humankind). Not enough lifetime, ha. I have to cherry-pick research priorities, so I encourage other researchers to take up these subtopics and research the [bleep] out of them in double-blind tests.

Motion blur is good when it's comfortable (adding source-based motion blur can prevent the motion sickness of low frame rates; that's why a GPU blur effect is available in some games, for those who get motion-sick without it).

But motion blur is bad when it's a genuine problem (e.g. VR trying to emulate real life), forcing VR to look different from real life and creating a motion-sickness effect instead of reducing it.

Either way, there are tons of use cases for my work, some unimportant (enjoying 24fps Hollywood Filmmaker Mode) and some important (e.g. future Holodecks).
 

jessejames840

Honorable
Feb 5, 2018
And that 0.1% of people will only be convincing themselves that they can actually tell a difference. Maybe 1% of that 0.1% can really see it.

500Hz may be a somewhat impressive technological accomplishment, but the only real purpose it has is to give marketing people even more power over those gullible individuals who have more money than sense.
What a bunch of bollocks. Everyone can see the difference; the higher the fps/Hz, the "more" you see. We need 1000Hz monitors.