News Samsung's new monitor sets OLED refresh rate record of 360 Hz — thanks to AI-driven algorithm

Status
Not open for further replies.

atomicWAR

Glorious
Ambassador
Kill burn-in completely, or at least close to it, and then I'll be interested. LED for me till then... though I have heard some strides are being made in pixel burn-in reversal, which is nice to hear. Pair that with AI and we may get some OLED panels I'd feel comfortable with, as I expect ten years out of a TV set or monitor (one and the same to me at this point, as a 43" minimum gamer... 75" in the living room).
 
  • Like
Reactions: Order 66

SCP2000

Great
BANNED
Dec 8, 2023
31
25
60
Even in 2023, burn-in gets blown way out of proportion. If it were problematic, it wouldn't have become mainstream for PC use, and there are always ways to mitigate the risk. Only have HDR enabled for HDR content; don't have your brightness settings cranked to max 24/7. Hide your taskbar and desktop icons if you're not putting your monitor to sleep when you're stepping away from your PC for longer than 30 minutes, etc. The average person replaces their monitors every 5–6 years and their televisions every 10, so provided you're not being careless, by the time burn-in poses any significant risk, you're probably going to be looking at upgrading to something more "current".

Depending on how the reviews go, the PG32UCDM could be my next big purchase, but I typically like waiting for Dell/AW to release their version due to better warranty coverage. QC appears better too.
 
Nov 24, 2023
13
7
15
Kill burn-in completely, or at least close to it, and then I'll be interested. LED for me till then... though I have heard some strides are being made in pixel burn-in reversal, which is nice to hear. Pair that with AI and we may get some OLED panels I'd feel comfortable with, as I expect ten years out of a TV set or monitor (one and the same to me at this point, as a 43" minimum gamer... 75" in the living room).
I have used my LG CX for 3 years, no burn-in. The screen is on at least 12 hours a day, 5 days a week. I was worried when I bought it. Haven't noticed any burn-in, and I check regularly.
 
AI seems to be the latest buzzword. There's no way to incorporate actual AI into a monitor because of the processing power required for AI to function.
If anything, it's just buzzwords for existing technology they relabel as "AI".

Can't wait for AI mice and AI keyboards, maybe some AI RGB...
My toaster has a top-of-the-line analog AI system; it can very accurately count the seconds until it is done :)
 
  • Like
Reactions: Order 66

atomicWAR

Glorious
Ambassador
I have used my LG CX for 3 years, no burn-in. The screen is on at least 12 hours a day, 5 days a week. I was worried when I bought it. Haven't noticed any burn-in, and I check regularly.
Thanks for taking the time to post your personal results. Those mean a lot to me as I know you don't have a horse in the race.

When I bought my current monitor in mid-2021, OLED panel testing had come in from the previous gen(s), and at that time things didn't look all that good. Testing at 16 hours a day for one to two years (not far off my usage levels) frequently still showed strong burn-in issues on many brands with static images (taskbar, news or sports tickers).

I had panels in the past that were susceptible to burn-in, so I know all the tricks: running an icon-free desktop, a slideshow for the background, auto-hiding the taskbar, etc. It just gets a little tiring, is all, especially when I let someone else use my system who doesn't know the ins and outs of how I have it set up.

Blue OLED failures and burn-in were the common, though not universal, issues when I was looking at panels (on 2020 panels or older; 2021+ models obviously hadn't been tested fully yet due to time constraints). Like I said, I do know they have since made huge strides, and I have heard AI may actually fully solve the burn-in issues that haven't been addressed by the current tech designed to reverse the pixel color retention at the heart of burn-in (also impressive these days), which is really nice to hear. I read something the other day about varying the color intensity of stagnant pixels, or briefly shutting them off/changing their tones slightly. But I haven't heard of much progress in extending the life of the blue OLEDs, which has been a long-time issue. Granted, that is more of a 'long term' thing, but with everything else, it is hard to ignore.
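
To make that concrete, here's a minimal sketch of the kind of static-pixel mitigation being described: track how long each pixel has gone unchanged, then slightly vary the drive of long-static pixels. The frame format, threshold, and jitter amount are all invented for illustration; this is not how any vendor's compensation firmware is actually documented to work.

```python
import numpy as np

# Toy sketch of the static-pixel mitigation idea: count how long each pixel
# has stayed unchanged, and slightly vary the tone of long-static pixels so
# the same subpixels aren't driven identically for hours on end.
# Threshold and jitter values are arbitrary, for illustration only.

STATIC_FRAMES_THRESHOLD = 3600   # ~1 minute at 60 Hz (made-up value)
MAX_JITTER = 2                   # +/- code values applied to static pixels (made-up)

def mitigate_static_pixels(frame, prev_frame, static_count, rng=np.random.default_rng()):
    """frame, prev_frame: HxWx3 uint8 arrays; static_count: HxW int array."""
    unchanged = np.all(frame == prev_frame, axis=-1)
    static_count = np.where(unchanged, static_count + 1, 0)

    out = frame.astype(np.int16)
    mask = static_count > STATIC_FRAMES_THRESHOLD
    if mask.any():
        jitter = rng.integers(-MAX_JITTER, MAX_JITTER + 1,
                              size=(int(mask.sum()), 3), dtype=np.int16)
        out[mask] += jitter      # nudge the tone of long-static pixels slightly
    return np.clip(out, 0, 255).astype(np.uint8), static_count
```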

I'll be sure to keep an eye on the tech and on reviews by regular users on Tom's. Even on Tom's I have heard horror stories about panels going bad in 12-24 months (even recent models), but I am happy to see more users like yourself having an optimal experience, showing great progress in OLED tech.

Question: do you do anything to prevent burn-in (like the techniques I mentioned, or running anti-burn-in programs or algorithms), or do you just cross your fingers and hope for the best? If it is the latter, that is really great to hear. Users like us on Tom's know to take certain preventive measures; everyday users, not so much. Thanks for your time and response!!
 

SCP2000

Great
BANNED
Dec 8, 2023
31
25
60
I have used my LG CX for 3 years, no burn-in. The screen is on at least 12 hours a day, 5 days a week. I was worried when I bought it. Haven't noticed any burn-in, and I check regularly.
Been using a Sony A8H for three years with zero issues. People simply parrot nonsense they've read online and accept it as gospel without conducting their own due diligence. It's comical.
 

SCP2000

Great
BANNED
Dec 8, 2023
31
25
60
OLEDs comprise less than 1% of monitor sales, not sure that qualifies as mainstream
Yikes. By your logic, only ultra-high-refresh-rate TN panels qualify as mainstream, because they are affordable to the masses. By this same logic, 4090s aren't mainstream; only the 4060s and 4070s are, because they represent the majority. It's mainstream if it's sold at a big-box retailer and the product/tech isn't in its infancy, so you can stop grasping at those straws.

Even ultrawide monitors are considered mainstream.
 
  • Like
Reactions: Order 66

bit_user

Titan
Ambassador
AI seems to be the latest buzzword. There's no way to incorporate actual AI into a monitor because of the processing power required for AI to function.
They mean that it uses neural networks, as opposed to some conventional image processing algorithms. Neural networks are superior at heuristic-based processing (i.e. problems without a closed-form solution), which tends to come up a lot in image processing and computer vision.

As for the feasibility of their claim, I think it's certainly plausible. We'd need more details on exactly where/how they're using it, but it might not actually be inferencing each pixel, for instance. The AI could simply be used to compute per-pixel gain values that are updated at a lower frequency.

Even if you were inferencing all the pixel values, 4K @ 360 Hz is just 8.96 billion RGB channel values per second. In an era of phone SoCs packing tens of TOPS, such processing power would equate to roughly 1,000 computations per R, G, and B channel value.
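
As a quick back-of-the-envelope check of those numbers (the 10 TOPS budget and the 1 Hz gain-map refresh below are my own illustrative assumptions, not anything Samsung has disclosed):

```python
# Back-of-the-envelope check of the throughput numbers above.
# Assumed: 3840x2160, 360 Hz, RGB, and a nominal 10 TOPS of NPU throughput.

WIDTH, HEIGHT, REFRESH_HZ, CHANNELS = 3840, 2160, 360, 3
NPU_OPS_PER_SEC = 10e12                     # 10 TOPS (hypothetical budget)

channel_values_per_sec = WIDTH * HEIGHT * CHANNELS * REFRESH_HZ
print(f"channel values/s: {channel_values_per_sec / 1e9:.2f} billion")          # ~8.96
print(f"ops per channel value: {NPU_OPS_PER_SEC / channel_values_per_sec:.0f}") # ~1100

# If the network only refreshes a per-pixel gain map at a much lower rate
# (say once per second, a made-up figure), the per-pixel budget grows hugely:
GAIN_UPDATE_HZ = 1
pixels = WIDTH * HEIGHT
print(f"ops per pixel per gain update: {NPU_OPS_PER_SEC / (pixels * GAIN_UPDATE_HZ):,.0f}")
```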
 
  • Like
Reactions: helper800

bit_user

Titan
Ambassador
OLEDs comprise less than 1% of monitor sales, not sure that qualifies as mainstream
I think a more salient point is that most display manufacturers only started shipping OLED-based models in 2023, unless I'm mistaken. That suggests a lack of long-term usage data that I find troubling.

I have been "in the market" for a good monitor for about 15 years. For a long time, I waited for OLED to happen. It kept not happening, and about 5 years ago I decided to get a HDR + VRR LCD monitor. It took until this year to find one that finally seemed to tick all the boxes.

There are three reasons I didn't go with OLED:
  1. I felt it was too new to really know how much the burn-in issue had been solved. I'm not interested in bending over backwards to avoid burn-in - I just want a monitor that I can use as a normal desktop display, without undue concern. Withstanding 5 years' use @ 12 hours/day is my minimum requirement.
  2. The subpixel pattern in most OLED monitors currently on the market is reportedly bad for text, which is most of what I use my monitors for.
  3. OLED prices are still high enough that I'm not going to pay that much unless it's truly the perfect monitor and I know I can use it for at least 5 years.

So, after waiting all these years, I decided to wait probably 5 more.
 
  • Like
Reactions: atomicWAR

plusev

Distinguished
BANNED
Sep 14, 2014
81
31
18,560
I think a more salient point is that most display manufacturers only started shipping OLED-based models in 2023, unless I'm mistaken. That suggests a lack of long-term usage data that I find troubling.

I have been "in the market" for a good monitor for about 15 years. For a long time, I waited for OLED to happen. It kept not happening, and about 5 years ago I decided to get a HDR + VRR LCD monitor. It took until this year to find one that finally seemed to tick all the boxes.

There are three reasons I didn't go with OLED:
  1. I felt it was too new to really know how much the burn-in issue had been solved. I'm not interested in bending over backwards to avoid burn-in - I just want a monitor that I can use as a normal desktop display, without undue concern. Withstanding 5 years' use @ 12 hours/day is my minimum requirement.
  2. The subpixel pattern in most OLED monitors currently on the market is reportedly bad for text, which is most of what I use my monitors for.
  3. OLED prices are still high enough that I'm not going to pay that much unless it's truly the perfect monitor and I know I can use it for at least 5 years.

So, after waiting all these years, I decided to wait probably 5 more.
You have been in the market for 15 years and never once considered EIZO?

Burn-in is no longer an issue for televisions in 2023. If individuals want additional insurance, Best Buy and Costco both offer burn-in protection, if they're worried about a low-probability incident. For desktop use, obviously you should take additional precautions; however, this is generally not an issue. Still, it won't prevent folks from complaining merely for the sake of complaining. These are the same people who, by the way, have never owned an OLED screen in their lives.

Hope that helps :)
 

Findecanor

Distinguished
Apr 7, 2015
330
229
19,060
Ai seems to be the latest buzz word. There's no way to incorporate actual AI into a monitor because of the processing power required for AI to function.
If anything it's just buzz words for existing technology they relabel as "AI"
For me, if a product description mentions any "intelligence" or "smart", then that's a big red flag steering me away from that product.

And I don't want the image to be processed. I want it to be as accurately represented as possible.
 

Peksha

Prominent
Sep 2, 2023
46
35
560

Samsung QD-OLED in premium Dell Alienware AW3423DW

(c) original post
 
  • Like
Reactions: bit_user

bit_user

Titan
Ambassador
You have been in the market for 15 years and never once considered EIZO?
I wanted GSync-compatible + FreeSync Premium Pro. HDR600-level, but without array backlighting (sorry, I care a little about HDR, but not enough to deal with array backlighting artifacts). DisplayPort needs to be HBR3, so that you can get at least 144 Hz with 10-bit enabled. 10-bit is nice for combating banding, plus I dabble in some graphics programming and the idea of playing with 10-bit and HDR appeals to me.
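
As a rough sanity check on the HBR3 requirement, assuming a 2560x1440 panel at 144 Hz with 10 bits per channel and roughly 20% blanking overhead (all assumptions of mine, and exact timings vary), the uncompressed stream already exceeds 4-lane HBR2's effective payload rate, while HBR3 has headroom:

```python
# Rough DisplayPort bandwidth check for 144 Hz with 10-bit color.
# Assumptions: 2560x1440, ~20% blanking overhead, no DSC or chroma subsampling.

width, height, refresh_hz = 2560, 1440, 144
bits_per_pixel = 3 * 10                  # RGB, 10 bits per channel
blanking_overhead = 1.20                 # approximate timing overhead (assumed)

stream_gbps = width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

# Effective payload for 4 lanes after 8b/10b line coding:
hbr2_gbps = 4 * 5.4 * 0.8                # 17.28 Gb/s
hbr3_gbps = 4 * 8.1 * 0.8                # 25.92 Gb/s

print(f"needed: {stream_gbps:.1f} Gb/s | HBR2: {hbr2_gbps:.2f} | HBR3: {hbr3_gbps:.2f}")
# -> roughly 19 Gb/s needed, so HBR2 falls short while HBR3 has room to spare
```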

I almost bought the Dell S2721DGF, but people said its colors looked off, due to not having an sRGB mode.

Burn-in is no longer an issue for televisions in 2023.
Yeah, I would already consider an OLED television. However, I only use my TV about 1/10th as much as my computer monitor.

Hope that helps :)
Thanks!
 

SCP2000

Great
BANNED
Dec 8, 2023
31
25
60

Samsung QD-OLED in premium Dell Alienware AW3423DW

(c) original post
Interesting.

Still, how often will someone start a discussion or write a review when there are no issues, versus someone who has problems? People have a tendency to focus on the negatives; it's human nature, and it only ends up distorting public perception in the long term. The reality is, there are thousands of people who are satisfied for every individual who has an issue, whether it's down to negligence or a manufacturing defect, and quite often it's the former. People think they can treat an OLED display like their old NEC CRTs.
 
  • Like
Reactions: helper800
Thanks for taking the time to post your personal results. Those mean a lot to me as I know you don't have a horse in the race.

When I bought my current monitor in mid-2021, OLED panel testing had come in from the previous gen(s), and at that time things didn't look all that good. Testing at 16 hours a day for one to two years (not far off my usage levels) frequently still showed strong burn-in issues on many brands with static images (taskbar, news or sports tickers).

I had panels in the past that were susceptible to burn-in, so I know all the tricks: running an icon-free desktop, a slideshow for the background, auto-hiding the taskbar, etc. It just gets a little tiring, is all, especially when I let someone else use my system who doesn't know the ins and outs of how I have it set up.

Blue OLED failures and burn-in were the common, though not universal, issues when I was looking at panels (on 2020 panels or older; 2021+ models obviously hadn't been tested fully yet due to time constraints). Like I said, I do know they have since made huge strides, and I have heard AI may actually fully solve the burn-in issues that haven't been addressed by the current tech designed to reverse the pixel color retention at the heart of burn-in (also impressive these days), which is really nice to hear. I read something the other day about varying the color intensity of stagnant pixels, or briefly shutting them off/changing their tones slightly. But I haven't heard of much progress in extending the life of the blue OLEDs, which has been a long-time issue. Granted, that is more of a 'long term' thing, but with everything else, it is hard to ignore.

I'll be sure to keep an eye on the tech and on reviews by regular users on Tom's. Even on Tom's I have heard horror stories about panels going bad in 12-24 months (even recent models), but I am happy to see more users like yourself having an optimal experience, showing great progress in OLED tech.

Question: do you do anything to prevent burn-in (like the techniques I mentioned, or running anti-burn-in programs or algorithms), or do you just cross your fingers and hope for the best? If it is the latter, that is really great to hear. Users like us on Tom's know to take certain preventive measures; everyday users, not so much. Thanks for your time and response!!
I have had my 55-inch LG CX OLED for about 1.5 years now with no burn-in to report. It is mounted above my monitors and is used maybe 10-20 hours a week to play Nioh 2, Monster Hunter World, and other pretty games in HDR. I also use it for another 20 or so hours a week for YouTube content. Both games have no option to hide their HUD, and I have experienced zero burn-in with HDR always on, playing with the brightness at 75 and the OLED brightness at 80.

The only blemish my LG CX has is a dead pixel exactly on the edge of the panel on the right side and 2 dead pixels not directly on the edge in the very bottom right. I am pretty sure none of the dead pixels were there when I initially set up and looked at the panel, but either way, even from 5 feet away it's nearly impossible to see them. I got mine at Costco, which came with a 5-year extended warranty that covers both dead pixels and burn-in, so if either happens to a meaningful degree, I'll get a new panel through RMA.
There are three reasons I didn't go with OLED:
  1. I felt it was too new to really know how much the burn-in issue had been solved. I'm not interested in bending over backwards to avoid burn-in - I just want a monitor that I can use as a normal desktop display, without undue concern. Withstanding 5 years' use @ 12 hours/day is my minimum requirement.
  2. The subpixel pattern in most OLED monitors currently on the market is reportedly bad for text, which is most of what I use my monitors for.
  3. OLED prices are still high enough that I'm not going to pay that much unless it's truly the perfect monitor and I know I can use it for at least 5 years.

So, after waiting all these years, I decided to wait probably 5 more.
I just want to start this off by saying all your points are completely valid and reasonable. These are just my thoughts as someone who has been using an OLED daily for a while now.

1. I thought the same, but I decided to see how it would be, and it's much less of an issue IF you have this as a secondary screen, not as the primary display. Only put stuff on it that you want to see in all its OLED glory, don't leave static elements on for hours at a time, and you will be fine.

2. I felt this would be an issue as well, and it is to some extent. Since my OLED TV is 4K at 55 inches, the text is so big, compared to the number of pixels it spans, that it's hard to see the issues posed by the W-OLED subpixel layout. This is also an operating system failing: with proper support for the W-OLED subpixel layout and the triangular QD-OLED layout, text would look as clear as on traditional RGB or BGR layouts.

3. They are trending downwards, at least. Maybe in another 5-10 years we will see sub-$500 OLED panels.
 

TJ Hooker

Titan
Ambassador

bit_user

Titan
Ambassador
For me, if a product description mentions any "intelligence" or "smart", then that's a big red flag steering me away from that product.
I agree that it's a potential warning sign. I would want to know more about what they think makes it smart.

And I don't want the image to be processed. I want it to be as accurately represented as possible.
I'm sure getting the most performance out of a display panel involves certain multi-parameter optimization problems. Neural networks are very good at solving those. To the extent that it's just trying to do that, being "smart" doesn't necessarily mean it's not also accurate.

Testing is the way to know how accurate it is. I suggest not worrying about how the thing is marketed, and just focusing on the testbench results plus the impressions and experience of a professional reviewer you trust.
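
As a deliberately crude illustration of that kind of trade-off, here is a one-parameter toy: pick a drive level that approaches a target luminance while penalizing wear. The luminance and wear curves are invented; a real controller, neural or otherwise, would juggle far more parameters per pixel against measured panel behavior.

```python
import numpy as np

# Toy single-parameter version of a panel trade-off: pick a drive level that
# approaches a target luminance while penalizing wear. The response and wear
# curves below are invented for illustration, not measured panel behavior.

def luminance(drive):            # normalized drive in [0, 1]
    return drive ** 2.2          # crude gamma-like response (assumed)

def wear_rate(drive):
    return drive ** 3            # wear assumed to grow faster than brightness

def best_drive(target_lum, wear_penalty=0.3):
    candidates = np.linspace(0.0, 1.0, 1001)
    cost = (luminance(candidates) - target_lum) ** 2 + wear_penalty * wear_rate(candidates)
    return candidates[np.argmin(cost)]

for target in (0.2, 0.5, 0.9):
    d = best_drive(target)
    print(f"target luminance {target:.1f} -> drive {d:.3f}, achieved {luminance(d):.3f}")
```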
 

gg83

Distinguished
Jul 10, 2015
759
355
19,260
AI seems to be the latest buzzword. There's no way to incorporate actual AI into a monitor because of the processing power required for AI to function.
If anything, it's just buzzwords for existing technology they relabel as "AI".

Can't wait for AI mice and AI keyboards, maybe some AI RGB...
AI 5G
 
  • Like
Reactions: Order 66

bit_user

Titan
Ambassador
What exactly is smart about this monitor, and how does it enable 360 Hz where it otherwise wouldn't be able to hit it?
They don't really have an incentive to spell it out in detail. I did a quick patent search, but didn't find anything obvious.

Here's a QD-OLED explainer I found.


It goes into lots of details that aren't necessarily relevant to QD-OLED, but there's a part near the end which could reveal some of the challenges:

Having your (OLED) cake and eating it, too

Blue OLED material — the light source of QD-OLED displays — is a notoriously tricky substance to work with.

Much like other OLED materials, there’s a three-way trade-off between lifespan, brightness, and efficiency. Generally speaking, any time you prioritize one of these attributes, the other two suffer. Drive an OLED pixel hard enough to produce the brightness you want and you not only diminish its life expectancy but also its efficiency.

But QD-OLED displays may prove to be the exception to this rule. By using three layers of blue OLED material per pixel, each layer can share the brightness burden.

My guess is that it might have something to do with how to balance/distribute the current between the layers, and/or things like balancing overdrive vs. panel life, etc.
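
To illustrate that guess, here is a toy allocator that splits a target drive level across three stacked blue layers, biased toward the least-worn layer. The wear model and weighting rule are invented purely for illustration; this is not Samsung's algorithm.

```python
import numpy as np

# Toy illustration of the guess above: split a target drive current across the
# three stacked blue OLED layers, biasing load toward the least-worn layer so
# no single layer ages much faster than the others. The wear tracking and
# weighting rule are invented for illustration only.

def split_current(target_drive, layer_wear):
    """target_drive: total normalized drive; layer_wear: accumulated wear per layer."""
    headroom = 1.0 / (1.0 + np.asarray(layer_wear, dtype=float))  # fresher layers get more
    weights = headroom / headroom.sum()
    return target_drive * weights

layer_wear = np.array([0.10, 0.25, 0.05])        # hypothetical accumulated wear
shares = split_current(0.9, layer_wear)
print("per-layer drive:", np.round(shares, 3))   # least-worn layer carries the most
layer_wear += shares ** 2 * 1e-4                 # assumed superlinear wear accumulation
```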
 
  • Like
Reactions: Order 66