News Most-Anticipated Gaming Monitors of 2023: 500 Hz, OLED, Wide Screen

Neilbob

Distinguished
Mar 31, 2014
Pushing to the extreme for the 0.1% of people who'll make use of "500 Hz" monitors.
On the other hand, completely useless for regular consumers and organizations.

And that 0.1% of people will only be convincing themselves that they can actually tell a difference. Maybe 1% of that 0.1% can really see it.

500Hz may be a somewhat impressive technological accomplishment, but the only real purpose it has is to give marketing people even more power over those gullible individuals who have more money than sense.
 

blacknemesist

Distinguished
Oct 18, 2012
Well, at least my G8, bought for about 25% off, is still unchallenged: great in every respect, except for the space it takes up on your desk, and there are no 32'' or similar offers for 4K. I'm really surprised that we have either 27'' or 49'' coming for 4K but nothing in between (either that or I missed something). It's not even about whether 49'' is better or not; it's just that not everyone has the desk space for that beast.
 

oofdragon

Honorable
Oct 14, 2017
So... we are near the ultimate gaming monitor... dual 4K, OLED, 480 Hz... but, I mean, would someone notice the difference side by side with this 240 Hz Mini LED, even playing esports or watching movies? Maybe this is it already... the endgame monitor.
 

Deleted member 14196

Guest
Capitalism doesn't give you what you need. They advertise giving you what you need, but they're just trying to sell you something. You don't have to buy it and you don't need it. Embrace minimalism and be happier.
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
would someone notice the difference side by side with this 240 Hz Mini LED

OLED
-best image quality
-best black levels
-best contrast
-best viewing angles
-best response time

The only thing that beats OLED's image quality is a high-end cinema projector from Barco. A projected image will always be nicer to look at than staring directly into a light source like a screen. But since even the cheapest cinema projectors will cost you at least $140,000, OLED is the next best thing.

Mini LED
-highest brightness

Mini LED is for people who want very high brightness for HDR and are willing to trade away a lot of image quality for it. They're probably the same people who temporarily blind everyone on the road with their super-bright LED headlights.

I think HDR looks disgusting and is just a stupid fad. There are lots of people who actually complain that they can't turn the brightness down enough on their Mini LED HDR TV.
 

CraigN

Distinguished
I'm pretty excited for these OLED monitors, but I want to know what the hell happened to all those 1440p Mini-LED models @ 300 Hz with a G-Sync module that Nvidia and partners announced at CES 2022. They all seem to have just vanished into thin air and were never released.
 

DavidLejdar

Respectable
Sep 11, 2022
OLEDs come with HDR too.

HDR has a clear advantage over SDR. Its 10-bit depth (or more) means (at least) 1.07 billion colors, compared to the 16.7 million colors of the usual 8-bit depth of SDR. And the dynamic range is about 6 stops for SDR versus at least 13 stops for HDR.
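For reference, those headline numbers fall straight out of the arithmetic. Here is a minimal Python sketch (the stop counts are simply the figures quoted above, treated as given rather than measured):

```python
# Colors per bit depth: each of the three RGB channels gets 2**bits levels.
def colors_for_bit_depth(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"{colors_for_bit_depth(8):,}")   # 16,777,216    -> the "16.7 million" of 8-bit SDR
print(f"{colors_for_bit_depth(10):,}")  # 1,073,741,824 -> the "1.07 billion" of 10-bit HDR

# Each stop doubles the luminance range, so the contrast span grows as 2**stops.
print(2 ** 6)    # 64:1 for ~6 stops (the SDR figure above)
print(2 ** 13)   # 8,192:1 for ~13 stops (the HDR figure above)
```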

This means, simplified, that e.g. during a scene with the sun rising, HDR can provide a way more realistic depiction of all the various shades coming with it, while SDR may provide only a "sunray blur", which upon closer inspection, especially in 4K, may have visibly rough edges. And when this gets pushed up to 8K and ultra-wide or large screens, it will be even more visible where SDR falls short.

There are several factors in making HDR work properly, though. On the one hand, there is the actual content, such as a video game. If the content creator just moved some slider to be able to claim it supports HDR, that isn't necessarily a great implementation when the final image ends up pretty much just overexposed. And, e.g., HDR10 isn't as good as HDR10+.

On the other hand, it is about the panel itself. E.g. OLEDs come in at below 1,000 nits, which is not ideal for making full use of HDR. Then again, as said, OLEDs can support HDR and actually do it quite well, but they can't sustain that for more than short bursts, at least as far as TVs are concerned, with OLED screens for gaming being quite new. And then there is also the question of how long until burn-in sets in. Mini LEDs, for their part, sure can be too bright, especially when used in a dark room and perhaps not calibrated at that.

But aside from the topic of which type of panel may be the best for what, and at what cost, I don't think everyone will drop HDR to go back to "SDR only".

EDIT: Oh, and watching a video labeled "HDR" doesn't necessarily mean one can see it all that well on a non-HDR screen, in particular because of how many colors are in the video versus how many colors the viewer's screen can actually display.

And the recording of HDR gameplay in particular can apparently have some caveats of its own. E.g., while checking how well Metro Exodus: PC Enhanced Edition runs with an RX 6700 XT OC, the recording software said it supports HDR recording, and I set the parameters for it, but the recording didn't look good (perhaps because the software tried to convert it to SDR for general use, as e.g. Twitch doesn't support HDR, or perhaps because it works suboptimally below 4K, or something). I suppose I could look into how to make it work better (with different software), but that would take me back to the previous point, that not everyone may have a screen with as many colors anyhow. So I'm recording without HDR for now. Long story short, I just meant to say that the gameplay itself doesn't look worse with HDR on, and I would likely see more of a boost from it with an HDR10+ screen.
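(For anyone curious what that HDR-to-SDR conversion step involves: below is a minimal, hypothetical Reinhard-style tone-map sketch in Python. It is not the actual pipeline of any particular recording software, just an illustration of why a naive conversion can leave highlights clipped and the image looking flat.)

```python
import numpy as np

def reinhard_extended(luminance: np.ndarray, white: float = 4.0) -> np.ndarray:
    """Map normalized HDR luminance into the 0..1 SDR range.

    `white` is the normalized luminance that maps to pure white; anything
    brighter clips, anything dimmer gets compressed toward the midtones.
    (Hypothetical parameters, for illustration only.)
    """
    l = np.asarray(luminance, dtype=np.float64)
    mapped = l * (1.0 + l / (white * white)) / (1.0 + l)
    return np.clip(mapped, 0.0, 1.0)

# A highlight at 10x reference white gets squeezed into the very top of the
# SDR range, while midtones barely move -- one reason converted HDR footage
# can look washed out.
print(reinhard_extended(np.array([0.1, 1.0, 10.0])))  # ~[0.09, 0.53, 1.0]
```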
 
  • Like
Reactions: drivinfast247

Sleepy_Hollowed

Distinguished
Jan 1, 2017
The highest refresh rate monitors do have their use cases, but that's quite the niche.

I'd love the bigger, better-performing ones for work, though, especially the ones with a better color range.

Let's hope the prices do come down too.
 

jasonelmore

Distinguished
Aug 10, 2008
Is nobody trying to catch Alienware and Samsung with their QD-OLED tech? I was hoping we'd see a 32" 4K 120-240 Hz QD-OLED from Asus this year. The Alienware 1440p model is the only one on the market, and it got tons of great reviews. It's a shame nobody else is adopting this groundbreaking panel technology.
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
Is nobody trying to catch Alienware and Samsung with their QD-OLED tech?

Because Samsung's QD-OLED suffers from artefacts and is inferior to regular OLED. I would argue it's inferior to a regular IPS panel.

The only thing QD-OLED adds is higher brightness for HDR.

That extra brightness for HDR comes at the cost of image quality. People don't seem to realise that increasing the brightness of a screen will cost you in other areas.

Samsung had to arrange their RGB pixels in an overlapping triangle structure to increase brightness.

(They're actually converting blue light through "quantum dots" to create the brighter red and green, since blue light is typically the strongest anyway.)

The result of this odd pattern, where red and green overlap with blue, is that QD-OLED shows green and red fringing artefacts; you can see them especially around text and icons on monitors.

[Attached images illustrating the red/green fringing around text and icons described above]
 
  • Like
Reactions: oofdragon

Friesiansam

Distinguished
Feb 9, 2015
And that 0.1% of people will only be convincing themselves that they can actually tell a difference. Maybe 1% of that 0.1% can really see it.
None can see it; the human visual system is just not fast enough. These extremely high refresh rates are just a marketing tool to sell very high-margin monitors to well-off, gullible gamers.
 

DougMcC

Reputable
Sep 16, 2021
None can see it; the human visual system is just not fast enough. These extremely high refresh rates are just a marketing tool to sell very high-margin monitors to well-off, gullible gamers.

Human vision sensitivity has been verified scientifically out to about 800 Hz. Beyond that, I'll definitely agree that 'none can see it', at least directly. The indirect effects do matter, though. Human vision may have its limits, but it operates relatively continuously rather than discretely, which means you can potentially take in information as soon as it is present on the screen. The faster the refresh, the sooner it is present on the screen.
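A minimal sketch of that arithmetic (a simplification that only looks at the display's refresh interval and ignores the rest of the input and render pipeline):

```python
# A new frame can only appear on a refresh boundary, so on average it waits
# about half a refresh interval before it becomes visible.
for hz in (60, 144, 240, 500):
    frame_ms = 1000.0 / hz
    print(f"{hz:>3} Hz: {frame_ms:5.2f} ms per refresh, "
          f"~{frame_ms / 2:.2f} ms average wait before a new frame is shown")
```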
 

COLGeek

Cybernaut
Moderator
Human vision sensitivity has been verified scientifically out to about 800 Hz. Beyond that, I'll definitely agree that 'none can see it', at least directly. The indirect effects do matter, though. Human vision may have its limits, but it operates relatively continuously rather than discretely, which means you can potentially take in information as soon as it is present on the screen. The faster the refresh, the sooner it is present on the screen.
Do you have a specific source for the "Human vision sensitivity has been verified scientifically out to about 800 Hz" comment? If so, please share.
 

TheOtherOne

Distinguished
Oct 19, 2013
OLEDs come with HDR too.

HDR has a clear advantage over SDR. Its 10-bit depth (or more) means (at least) 1.07 billion colors, compared to the 16.7 million colors of the usual 8-bit depth of SDR. And the dynamic range is about 6 stops for SDR versus at least 13 stops for HDR.

This means, simplified, that e.g. during a scene with the sun rising, HDR can provide a way more realistic depiction of all the various shades coming with it, while SDR may provide only a "sunray blur", which upon closer inspection, especially in 4K, may have visibly rough edges. And when this gets pushed up to 8K and ultra-wide or large screens, it will be even more visible where SDR falls short.

There are several factors in making HDR work properly, though. On the one hand, there is the actual content, such as a video game. If the content creator just moved some slider to be able to claim it supports HDR, that isn't necessarily a great implementation when the final image ends up pretty much just overexposed. And, e.g., HDR10 isn't as good as HDR10+.

On the other hand, it is about the panel itself. E.g. OLEDs come in at below 1,000 nits, which is not ideal for making full use of HDR. Then again, as said, OLEDs can support HDR and actually do it quite well, but they can't sustain that for more than short bursts, at least as far as TVs are concerned, with OLED screens for gaming being quite new. And then there is also the question of how long until burn-in sets in. Mini LEDs, for their part, sure can be too bright, especially when used in a dark room and perhaps not calibrated at that.

But aside from the topic of which type of panel may be the best for what, and at what cost, I don't think everyone will drop HDR to go back to "SDR only".

EDIT: Oh, and watching a video labeled "HDR" doesn't necessarily mean one can see it all that well on a non-HDR screen, in particular because of how many colors are in the video versus how many colors the viewer's screen can actually display.

And the recording of HDR gameplay in particular can apparently have some caveats of its own. E.g., while checking how well Metro Exodus: PC Enhanced Edition runs with an RX 6700 XT OC, the recording software said it supports HDR recording, and I set the parameters for it, but the recording didn't look good (perhaps because the software tried to convert it to SDR for general use, as e.g. Twitch doesn't support HDR, or perhaps because it works suboptimally below 4K, or something). I suppose I could look into how to make it work better (with different software), but that would take me back to the previous point, that not everyone may have a screen with as many colors anyhow. So I'm recording without HDR for now. Long story short, I just meant to say that the gameplay itself doesn't look worse with HDR on, and I would likely see more of a boost from it with an HDR10+ screen.
It's all good for lit-up, bright scenes that look even shinier and showcase all the brightness. But I really, really dislike HDR when it comes to a mixture of dark scenes with moving and appearing/disappearing lights. It becomes almost unwatchable.

I posted here a while ago to check if there was a solution, but no luck, and I ended up disabling HDR on my monitor. Here's a little video I recorded with my phone showing how bad it looks on my monitor.
https://streamable.com/nxda0d
 
Jan 14, 2023
None can see it; the human visual system is just not fast enough. These extremely high refresh rates are just a marketing tool to sell very high-margin monitors to well-off, gullible gamers.
Perhaps with an impulse-driven CRT display. Unfortunately, with sample-and-hold displays we need much higher refresh rates to get the same motion clarity as a CRT.
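A rough sketch of the usual persistence-blur estimate behind that point: on a sample-and-hold panel each frame stays lit for the whole refresh interval, so eye-tracked motion smears across the retina. (The CRT persistence figure and tracking speed below are illustrative assumptions, not measurements.)

```python
def motion_blur_px(tracking_speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate perceived blur width for eye-tracked motion."""
    return tracking_speed_px_per_s * persistence_ms / 1000.0

speed = 960.0  # px/s, a commonly used pursuit-camera test speed (assumed here)

for label, persistence_ms in [("240 Hz sample-and-hold", 1000 / 240),
                              ("500 Hz sample-and-hold", 1000 / 500),
                              ("CRT-like impulse (~1 ms)", 1.0)]:
    print(f"{label:25s} -> ~{motion_blur_px(speed, persistence_ms):.1f} px of blur")
```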
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
Its 10-bit depth (or more) means (at least) 1.07 billion colors,

You're confusing color depth with what is called "color gamut".

Color depth is simply a container. You could use it to store individual colors, or greyscale values, or even just one color. You cannot determine the color gamut a screen can display by looking at the color depth it supports.
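A minimal sketch of that distinction: bit depth only sets how finely each channel is quantized, while the gamut (which real-world colors the endpoints correspond to) is defined separately by the display's primaries.

```python
# Bit depth = number of quantization steps per channel. The same sRGB gamut
# can be encoded at 8 or 10 bits; the 10-bit version just has finer steps.
for bits in (8, 10, 12):
    steps = 2 ** bits
    print(f"{bits:>2}-bit channel: {steps:5d} levels, "
          f"smallest step = {1 / (steps - 1):.6f} of the channel range")
```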

This means, simplified, that e.g. during a scene with the sun rising, HDR can provide a way more realistic depiction of all the various shades coming with it, while SDR may provide only a "sunray blur"

So SDR is a "sunray blur" and HDR a "realistic shade", is it? The ambiguous nature of what HDR is supposed to be is just one of its many problems.

99% of the content you watch on a monitor is in the sRGB color gamut, easily understood by any monitor. It has enough colors to depict pretty much anything. We've used it for years, it works incredibly well, and it is a well-understood gamut for anyone in the imaging business.

I can pick two random sRGB monitors, calibrate them, and they will look pretty damn similar.

HDR is not like this, because it relies on so-called wide color gamuts like DCI-P3 and plenty of others that all want to become the "HDR standard". No one agrees on what HDR should look like. The so-called HDR standards are not standards at all; they are ambiguous. How you translate color seen by a spectrometer into data for an HDR gamut is a free-for-all among companies. And since printed images have limited gamuts (that's why we have blackpoint compensation), they cannot be used as a reference for HDR.

I can pick two random HDR screens, try to calibrate them, and the colors will look nothing alike.
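(How alike two calibrated screens look is usually quantified with a color-difference metric. Below is a minimal CIE76 delta-E sketch in Python, assuming measurements are already in CIELAB coordinates; the sample values are made up for illustration.)

```python
import math

def delta_e76(lab1, lab2) -> float:
    """CIE76 color difference between two (L*, a*, b*) measurements."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Around 1 is barely perceptible; above roughly 3-5 the mismatch is obvious.
print(delta_e76((53.2, 80.1, 67.2), (53.9, 78.5, 66.0)))  # ~2.1, a close match
print(delta_e76((53.2, 80.1, 67.2), (60.0, 60.0, 50.0)))  # ~27, clearly off
```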

There's also the question of how you create this "HDR" image in the first place. If you want a higher dynamic range, the only way to do it is to make monitors incredibly bright, and to get there you need to use blue light, because only blue light carries enough energy. But then you run into color inaccuracies, color temperatures that are off, Samsung HDR that suffers from color fringing, etc. You trade color accuracy for HDR.

Then there's the question of whether all that blue light is even healthy. We know blue LEDs are unhealthy for the retina; we know they damage it. And now companies are making very-high-nit HDR screens where the bright light comes from blue light. Regardless of whether you pass that blue light through a filter or "quantum dots" to change its hue, that light still carries the energy of blue light and is potentially damaging people's retinas.

I personally go out of my way to avoid HDR. Both HDR content and HDR screens can go take a hike. Give me a good OLED and I'm happy. I don't need a 2,000-nit screen; you don't go stare into the bright sky in real life either. High-energy light using the blue wavelengths as a carrier damages the retina; that's why fishermen have so many eye problems.
 