News Koorui to demo the 'world's first' 750 Hz gaming monitor at CES — TN panel device gets welcome color boost with the latest QD film

Aren't TN panels usually 6-bit (native)? While FRC would go a long way at 750 Hz, I do wonder about its native precision. Gamut is only one piece of the puzzle.

Contrast ratio would be another key aspect to look at.
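For context, FRC approximates a higher bit depth by temporally dithering between adjacent native levels, which is why refresh rate helps. A minimal sketch of the idea — the function and the 4-frame cycle are illustrative assumptions, not how any particular panel controller actually works:

```python
# Sketch of temporal dithering (FRC): approximate an 8-bit level on a
# 6-bit-native panel by alternating between adjacent 6-bit levels so
# that the average over a few frames lands on the 8-bit target.
def frc_frames(value_8bit, n_frames=4):
    """Return the 6-bit level shown on each frame; the frames' average
    approximates value_8bit / 4 via error accumulation."""
    target = value_8bit / 4        # ideal 6-bit level (may be fractional)
    low = int(target)
    frames, err = [], 0.0
    for _ in range(n_frames):
        err += target - low
        if err >= 0.5:             # owe enough brightness: show the higher level
            frames.append(low + 1)
            err -= 1.0
        else:
            frames.append(low)
    return frames

print(frc_frames(129))  # target 32.25 -> [32, 33, 32, 32], averaging 32.25
```

At 750 Hz a 4-frame dither cycle completes in ~5.3 ms, so the flicker that gives cheap FRC a bad name should be far less visible than on a 60 Hz panel.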
 
I have 144 Hz and 75 Hz monitors on my gaming machine. I've tested and tried multiple times, and I can't tell the difference. 30→60, I totally see it. 60→75, yes, a noticeable difference. 75+? When I'm actually playing and not studying the screen for a difference, nah. I'd much rather have all the eye candy turned on myself; I do not see the point of these screens. I do not believe that 2 ms vs. 1.33 ms for a 500-750 Hz screen gives a competitive advantage within the realm of human response time.
 
a noticeable difference. 75+? When I'm actually playing and not studying the screen for a difference, nah.
I think it depends on a few things, like your monitor's response rate and what kind of content you're looking at. When your eyes are tracking fast moving objects on screen, the framerate makes a bigger difference than for slower movement/changes frame-to-frame.

I do not believe that 2 ms vs. 1.33 ms for a 500-750 Hz screen gives a competitive advantage within the realm of human response time.
The only argument for the latency savings of high framerate displays is for players at the most elite level. If you take two pro gamers of equal skill and one has a setup with a couple milliseconds less latency, then it could conceivably give that player a slight edge. However, if there's a significant difference in abilities, a couple ms of difference in latency is unlikely to change outcomes.
 
I can notice a difference between 75 Hz and 144 Hz.
I have tried up to 180 Hz, but I have a hard time seeing a difference above 120 fps.
Of course, I haven't seen a 750 Hz monitor, so maybe it would be noticeable if it pumps out 6x the frames.

Aren't TN panels usually 6-bit (native)? While FRC would go a long way at 750 Hz, I do wonder about its native precision. Gamut is only one piece of the puzzle.

Contrast ratio would be another key aspect to look at.
Yeah, I'm a little curious about this too.
At 1080p, 750 Hz, 8-bit, it'll push 59.36 Gbps of bandwidth, and only DP 2.1 can carry that without DSC.
Even at 6-bit, it's still pushing 44.52 Gbps.

For reference:
DP 1.4 = 25.92 Gbps
HDMI 2.1 = 42.67 Gbps

It might look fine with an HDMI 2.1 connection, but that high compression ratio on DP 1.4 is a little worrying for image quality.
And then there is the cable length/data integrity issue...
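For anyone who wants to check those numbers: they can be reproduced by assuming CVT-RBv2 reduced-blanking timing (an 80-pixel horizontal blank and a 460 µs minimum vertical blank). Here's a rough Python sketch — the function name is mine and real timing generators round a bit differently, but it lands on the same figures:

```python
import math

def cvt_rb2_bandwidth_gbps(h_active, v_active, refresh_hz, bits_per_pixel):
    """Approximate uncompressed video bandwidth assuming CVT-RBv2 timing:
    an 80-pixel horizontal blank and a 460 us minimum vertical blank."""
    h_total = h_active + 80
    # v_total = v_active + v_blank, where the blank lines must span >= 460 us
    v_total = math.ceil(v_active / (1 - 460e-6 * refresh_hz))
    pixel_clock = h_total * v_total * refresh_hz      # pixels per second
    return pixel_clock * bits_per_pixel / 1e9         # Gbit/s

print(cvt_rb2_bandwidth_gbps(1920, 1080, 750, 24))   # 8-bit RGB -> ~59.36
print(cvt_rb2_bandwidth_gbps(1920, 1080, 750, 18))   # 6-bit RGB -> ~44.52
```

Note how much of that is blanking overhead at 750 Hz: the active pixels alone (1920×1080×750×24) are only ~37.3 Gbps.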
 
Yeah, I'm a little curious about this too.
At 1080p, 750 Hz, 8-bit, it'll push 59.36 Gbps of bandwidth, and only DP 2.1 can carry that without DSC.
Even at 6-bit, it's still pushing 44.52 Gbps.
I'm not aware of DP or HDMI supporting only 6 bits per sample. I was talking about the underlying panel technology.

FWIW, displayspecifications.com has an extensive database of monitor specs. They separately list the interface bit depth from the panel bit depth.

For reference:
DP 1.4 = 25.92 Gbps
HDMI 2.1 = 42.67 Gbps

It might look fine with an HDMI 2.1 connection, but that high compression ratio on DP 1.4 is a little worrying for image quality.
And then there is the cable length/data integrity issue...
Using HBR3, DisplayPort 1.3 and 1.4 can hit 32.4 Gbps raw (25.92 Gbps effective after 8b/10b encoding). It's annoyingly hard to find out which monitors support HBR3, unless they specifically advertise it. By now, I think most high-end DP 1.4 monitors do.
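To reconcile the two figures in this thread: 32.4 Gbps is the raw HBR3 line rate across four lanes, while 25.92 Gbps is what remains for video after 8b/10b encoding. A quick sketch of the per-mode math (the per-lane rates are the published VESA values; the helper function is just illustrative):

```python
# Effective (post-encoding) DisplayPort bandwidth over a 4-lane link.
# HBR2/HBR3 use 8b/10b encoding (80% efficient); DP 2.1's UHBR modes use 128b/132b.
def dp_effective_gbps(gbps_per_lane, encoding_efficiency, lanes=4):
    """Raw per-lane line rate x lane count x encoding efficiency."""
    return gbps_per_lane * lanes * encoding_efficiency

modes = {
    "HBR2 (DP 1.2)":     (5.4,  8 / 10),
    "HBR3 (DP 1.3/1.4)": (8.1,  8 / 10),
    "UHBR10 (DP 2.1)":   (10.0, 128 / 132),
    "UHBR20 (DP 2.1)":   (20.0, 128 / 132),
}
for name, (rate, eff) in modes.items():
    print(f"{name}: {rate * 4:.1f} Gbps raw, "
          f"{dp_effective_gbps(rate, eff):.2f} Gbps effective")
# HBR3 works out to 32.4 Gbps raw / 25.92 Gbps effective -- both figures above.
```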
 