News: Samsung's 500Hz Odyssey G6 OLED joins stacked 2025 gaming monitor lineup

The rate of change in monitors these days really is unbelievable.

Refresh rates are great up to a point, and I am pretty certain that past 300 Hz we are not just well into diminishing returns, but past the point of any returns at all.

I am much more interested in varying aspect ratios and curves, which will meet far more people's needs than the eventual 1,200 Hz monitors that have to fake 90% of the frames they produce.

I already burned $3,000 on monitors recently, and about the only thing that would get me to replace my setup anytime soon would be a 128:27, 75-inch, 800R monitor.
 
Refresh rates are great up to a point, and I am pretty certain that past 300 Hz we are not just well into diminishing returns, but past the point of any returns at all.
It's less about perception and more about input lag once you get to a certain point. The returns are beyond minimal, but running a mouse at 8,000 Hz polling is only really beneficial alongside high refresh rates.
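To put rough numbers on that (my own back-of-the-envelope sketch, not from any source in this thread): frame time shrinks hyperbolically with refresh rate, and the mouse polling interval only starts to matter once it is no longer dwarfed by the frame interval.

```python
# Back-of-the-envelope frame-time and polling-interval numbers (my own
# illustration, not from the thread): gains shrink fast past a few hundred Hz.
refresh_rates_hz = [60, 144, 240, 300, 500, 1000]
polling_rates_hz = [1000, 8000]

for hz in refresh_rates_hz:
    print(f"{hz:>4} Hz refresh -> {1000 / hz:6.2f} ms per frame")
# 60 -> 144 Hz saves ~9.7 ms per frame; 300 -> 500 Hz saves only ~1.3 ms.

for hz in polling_rates_hz:
    print(f"{hz:>4} Hz polling -> {1000 / hz:6.3f} ms between mouse samples")
# 8,000 Hz polling shaves at most ~0.875 ms versus 1,000 Hz, which only
# matters once frame times are already down in the low single-digit
# milliseconds.
```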

Actually, my concern is whether these 500 Hz panels will use DP 2.0 with UHBR20 certification. DSC sucks if you tab out of your games a lot, and it's basically required at these speeds (and with anything Nvidia at the moment).
 
Refresh rates are great up to a point, and I am pretty certain that past 300 Hz we are not just well into diminishing returns, but past the point of any returns at all.
There are pretty credible sources suggesting there is value in responses up to about 2 kHz. My guess is that monitor refresh rates will top out at 2,400 Hz, because that will sync with all the interesting lower multiples.
 
High refresh rates are a pretty big deal, and they should keep getting higher for a while. This is where frame generation technology should really shine.
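Just to illustrate the scale of what frame generation would have to do at these refresh rates (my own rough numbers, not from the article):

```python
# Fraction of displayed frames that would be generated rather than rendered,
# for a few hypothetical native render rates and target refresh rates.
# Illustrative numbers only, not from the article.
def generated_fraction(native_fps: float, output_hz: float) -> float:
    """Share of output frames that frame generation has to synthesize."""
    return max(0.0, 1.0 - native_fps / output_hz)

for native in (120, 240):
    for target in (500, 1200):
        frac = generated_fraction(native, target)
        print(f"{native} fps native -> {target} Hz output: "
              f"{frac:.0%} of frames generated")
# 120 fps native at 1,200 Hz output means 90% generated frames, which is the
# "fake 90% of the frames" point made earlier in the thread.
```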

The real issue is the connectivity hoops required to drive those refresh rates. UHBR20 is mandatory for native 1440p/500Hz (same as 4K/240), with or without HDR.
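For anyone curious, here is the rough uncompressed-bandwidth math behind that (my own approximation, assuming ~20% blanking overhead and DP 2.x 128b/132b coding; real CVT-R2 timings and link overheads differ slightly):

```python
# Rough uncompressed DisplayPort bandwidth check (my own approximation:
# ~20% blanking overhead, 128b/132b coding efficiency; real timings differ).
BLANKING = 1.20          # assumed timing overhead
CODING   = 128 / 132     # DP 2.x 128b/132b coding efficiency

def needed_gbps(w, h, hz, bpc):
    # 3 colour channels, bpc bits each, plus blanking overhead
    return w * h * hz * bpc * 3 * BLANKING / 1e9

def link_gbps(lane_rate_gbps, lanes=4):
    return lane_rate_gbps * lanes * CODING

modes = [("1440p/500Hz", 2560, 1440, 500), ("4K/240Hz", 3840, 2160, 240)]
links = [("UHBR13.5", 13.5), ("UHBR20", 20.0)]

for name, w, h, hz in modes:
    for bpc in (8, 10):
        print(f"{name} {bpc}-bit: ~{needed_gbps(w, h, hz, bpc):.1f} Gbps needed")
for name, rate in links:
    print(f"{name}: ~{link_gbps(rate):.1f} Gbps usable")
# Both modes land above UHBR13.5 (~52 Gbps) without DSC, but fit within
# UHBR20 (~78 Gbps), even at 8-bit.
```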

I'm curious about the smart monitor mentioned here, because that honestly sounds like the same scaler technology used in their smart TVs. I would love for a reviewer to test a monitor/TV with a good scaler to see whether native 1440p upscaled to 4K by the scaler looks as good as, or better than, FSR/DLSS/XeSS.
 
There are pretty credible sources suggesting there is value in responses up to about 2 kHz. My guess is that monitor refresh rates will top out at 2,400 Hz, because that will sync with all the interesting lower multiples.
I'd be interested in study data on this. I am not even slightly aware of any method of testing it; I would assume it would require direct eye-motion sensing or some kind of direct brain scanning. No human moves fast enough or consistently enough that you could measure hand-eye response down to 0.0005 seconds.

I have my doubts that any human could really pick up the difference between getting one frame every 0.002 seconds versus every 0.001 seconds. I do understand that the delay between receiving information and acting on it is real, but I just do not think human reactions are consistent enough to measure response time accurately enough to even tell the difference. I can be persuaded otherwise.
 
I'd be interested in study data on this. I am not even slightly aware of any method of testing it; I would assume it would require direct eye-motion sensing or some kind of direct brain scanning. No human moves fast enough or consistently enough that you could measure hand-eye response down to 0.0005 seconds.

I have my doubts that any human could really pick up the difference between getting one frame every 0.002 seconds versus every 0.001 seconds. I do understand that the delay between receiving information and acting on it is real, but I just do not think human reactions are consistent enough to measure response time accurately enough to even tell the difference. I can be persuaded otherwise.
I don't believe it's about the human side of the equation, but rather about overcoming technological issues. This is an old article, but it's the first one I remember talking about very high refresh rates. It might be a bit out of date by now, but it was easy for me to find: https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/
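As I understand the Blur Busters argument, it boils down to sample-and-hold motion blur: when your eye tracks a moving object, each held frame smears across roughly (speed in pixels per second) / (refresh rate) pixels. A quick sketch of that arithmetic with my own illustrative numbers:

```python
# Sample-and-hold motion blur estimate: a tracked object moving at a given
# speed smears across roughly speed / refresh_rate pixels per frame.
# Illustrative numbers only (my own sketch of the Blur Busters-style argument).
speed_px_per_s = 960           # a moderately fast pan, in pixels per second
for hz in (60, 120, 240, 500, 1000):
    blur_px = speed_px_per_s / hz
    print(f"{hz:>4} Hz: ~{blur_px:5.1f} px of motion blur")
# Only around 1000 Hz does the smear drop to about one pixel without
# resorting to backlight strobing / black frame insertion.
```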
 
I think that 60 Hz is already beyond human perception, but there are secondary effects of pixel transitions that can affect image quality, and these may get worse, not better, with speed: you end up in transition mode more than viewing mode.

And if you're playing real-time games and are obsessed with millisecond latency, well, there you go: being maybe a couple of milliseconds faster might let you beat the competition, human or bot. It's highly unlikely, but it is possible.

So a screen capable of 240 Hz may be a better screen even if you run it at 60 Hz, and so on.
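One way to put numbers on the "transition mode vs viewing mode" point (my own rough sketch, assuming a fixed grey-to-grey response time; actual panel behaviour varies):

```python
# Fraction of each frame spent in pixel transition, assuming a fixed
# grey-to-grey response time (illustrative values, not measured).
def transition_share(response_ms: float, refresh_hz: float) -> float:
    frame_ms = 1000 / refresh_hz
    return min(1.0, response_ms / frame_ms)

for panel, response_ms in (("slow LCD", 5.0), ("fast LCD", 2.0), ("OLED", 0.1)):
    for hz in (60, 240, 500):
        share = transition_share(response_ms, hz)
        print(f"{panel:>8} at {hz:>3} Hz: {share:.0%} of the frame is transition")
# A 5 ms panel at 500 Hz would spend the entire frame transitioning, while an
# OLED's ~0.1 ms response barely registers even at 500 Hz; that is also why a
# panel built for 240 Hz can look cleaner than a 60 Hz-only one when both run
# at 60 Hz.
```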
 
I'd be interested in study data on this. I am not even slightly aware of any method of testing it; I would assume it would require direct eye-motion sensing or some kind of direct brain scanning. No human moves fast enough or consistently enough that you could measure hand-eye response down to 0.0005 seconds.

I have my doubts that any human could really pick up the difference between getting one frame every 0.002 seconds versus every 0.001 seconds. I do understand that the delay between receiving information and acting on it is real, but I just do not think human reactions are consistent enough to measure response time accurately enough to even tell the difference. I can be persuaded otherwise.
https://www.nature.com/articles/srep07861
 
I'm not sure your links are super relevant. They are more related to display brightness modulation, and how fast it has to be before you no longer see flicker and instead perceive a fixed brightness/shade of gray. And display technologies like stereoscopic 3D, where the display is rapidly switching between two different images and your brain has to composite them into one. Neither of these are really applicable to normal LCD/OLED displays (although they may be relevant if you're using backlight strobing/BFI).

But for typical media, as per your first source: "Traditional TVs show a sequence of images, each of which looks almost like the one just before it and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays."

The "relatively low critical flicker fusion rate" being referred to is described in the article as 50-90 Hz.
 
I'm not sure your links are super relevant. They are more related to display brightness modulation, and how fast it has to be before you no longer see flicker and instead perceive a fixed brightness/shade of gray. And display technologies like stereoscopic 3D, where the display is rapidly switching between two different images and your brain has to composite them into one. Neither of these are really applicable to normal LCD/OLED displays (although they may be relevant if you're using backlight strobing/BFI).

But for typical media, as per your first source: "Traditional TVs show a sequence of images, each of which looks almost like the one just before it and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays."

The "relatively low critical flicker fusion rate" being referred to is described in the article as 50-90 Hz.
The question that has to be answered, though, is what is the fastest change to which the eye can possibly respond. I would agree that the average person won't notice flicker at 90 Hz, that 95% are fine at 120 Hz, and that 99.9% are fine at 240 Hz. But that isn't the limit of when your eyes can detect changes, just when flicker stops being a noticeable problem. For an e-sports player, what matters is the absolute limit of the eye's ability to detect changes; only once a display exceeds that limit does going faster stop conferring any advantage. This research suggests that the eye's chemistry tops out around 2,000 Hz, so I expect displays to settle on 2,400 Hz because it aligns with many popular lower multiples (e.g. cinema and TV recorded at 24, 30, 50, 60, and 120 fps).
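For what it's worth, the "interesting lower multiples" part is easy to check with a quick script of my own:

```python
# Quick check that 2400 Hz divides evenly by the common content frame rates
# mentioned above (24, 30, 50, 60, 120 fps), unlike 1000 or 2000 Hz.
content_fps = [24, 30, 50, 60, 120]
for candidate_hz in (1000, 2000, 2400):
    misses = [fps for fps in content_fps if candidate_hz % fps != 0]
    status = "divides all of them evenly" if not misses else f"misses {misses}"
    print(f"{candidate_hz} Hz: {status}")
# Only 2400 Hz divides every listed rate, so each source frame would map to a
# whole number of refresh cycles with no judder.
```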