News VESA reveals new performance tiers for motion clarity and HDR quality — DisplayHDR True Black 1000- and ClearMR 21000-certified devices to start ar...

I'm guessing you haven't talked to a pretty important set of high-end monitor users to see if they want even brighter monitors: photographers, video editors, and graphic designers. Most of them (myself included) are still using monitors sold over a decade ago, and even on those jurassic monitors, brightness and contrast are clamped down below 20% of the panel's capability.

As an example, I use HP LP2475w monitors from 2007 on my editing rig. They have a 1000:1 contrast ratio and 400 nits of brightness, and the brightness on my "print proof" monitor is set to 17/100. These are *still* very popular, true 8-bit, H-IPS editing monitors within the communities who care about well-engineered displays; it's only been a few years since anything eclipsed these 15-year-old HP, Eizo, and NEC panels. Many of the actual 10-bit monitors HP and Dell released in the mid-2010s were so ungodly bright, and so annoyingly prone to letting brightness wander as they warmed up and cooled down, that pros went back to the older, better-engineered 8-bit monitors.

Basically, these super-bright, super-fancy monitors you're talking about will all require a calibration file that lightens the entire image during export or in the printer software; otherwise the too-bright monitor produces too-dark prints (or video files for videographers, etc.). And I wonder how much variance VESA allows from a static setting. The only real use case for even brighter monitors than currently offered would be laptops, outdoor/weatherproof screens, and vehicles, where direct sunlight may affect them. Basically nobody who needs incredibly bright, contrasty monitors wants them in a desktop form.

And all the people who need their desktop monitors to meet stringent certifications have been begging for things like brightness equalized from edge to edge and across the temperature range, consistent color distribution/balance as brightness is adjusted, and for VESA, for the love of God, to stop finding new ways to classify 6- and 8-bit monitors (looking at you, Apple) as full-gamut screens, with software upscaling and manipulation of the color output, just so they can claim their generally below-average 8-bit screens are "HDR10".

I can't remember the brand offhand (it was probably 7 or 8 years ago now) that had upsampled 6-bit monitors claiming 99% sRGB coverage. Maybe they did, maybe they didn't, but what was certain was that the colors shown on the monitor were not the colors in the image file, and they were virtually impossible to calibrate, because the vendor's software was obstinate about keeping that full gamut. Since it was (obviously) a cheap monitor with a fancy VESA sticker on it, it didn't have manual color adjustments capable of correcting color to a professional level. So when you used a hardware calibration tool, the monitor would see that you'd clipped some outer boundary of Adobe RGB or sRGB, and its software would fight you as you made corrections.
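For anyone curious why the OSD number ends up so low, here's a rough sketch of the math in Python. The linear-scaling assumption is mine (real backlights aren't perfectly linear, and proofing targets vary), so treat it as a ballpark, not a calibration procedure:

def osd_setting(panel_max_nits, target_nits, osd_max=100):
    # Crude linear map from a target white luminance to an OSD value.
    return round(target_nits / panel_max_nits * osd_max)

# ~80-120 cd/m² is a commonly cited soft-proofing white point.
print(osd_setting(400, 80))    # -> 20, in the neighborhood of my 17/100
print(osd_setting(1000, 80))   # -> 8 on a 1,000-nit panel

The brighter the panel, the smaller the usable slice of its range for print work.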

VESA should have stuck to measuring the distance between mounting-bracket screws. That was the last time they did something consumer-focused; every other thing they touch ends up as worthless marketing slop.
 
 
I decided to waste 5 minutes of my life I'll never get back actually reading VESA's press release. They actually said these new monitors' ridiculous brightness (higher than 1,000 nits) will bring in people working in creative endeavors. These idiots are willfully ignorant. And the testing they're proposing does nothing at all to address the flaws that make every photographer who buys a Mac Studio monitor throw it away after a year or two of losing fights over color accuracy...

They don't even attempt to correct the primary flaw of OLED monitors, the one that every color-accurate monitor manufacturer can explain in 10 seconds: OLED monitors burn themselves out after several thousand hours. Specifically, the blues fade and the image shifts yellow. With cheaper monitors it'll happen after 2,000-4,000 hours of use, but even the most expensive will suffer after 5,000 hours. I'll attach a picture of my IPS proofing monitor's current backlight hours to put it into perspective: a monitor I last had to adjust after a calibration when I upgraded my GPU in 2021. Good IPS monitors don't drift from their settings, while OLED monitors become impossible to calibrate and won't produce accurate colors, becoming e-waste to a photographer after 2-4 years of 8-hours-a-day usage.
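If it helps to see that failure mode as numbers, here's a toy decay model in Python. The aging constants are invented purely for illustration (real OLED wear depends on the panel, drive level, and content); the point is only that blue aging faster than red/green drags the white point yellow:

import math

# Hypothetical hours for each channel to fall to 90% output.
HOURS_TO_90PCT = {"red": 20000, "green": 15000, "blue": 4000}

def relative_output(channel, hours):
    # Simple exponential decay pinned to the 90% figures above.
    k = math.log(0.9) / HOURS_TO_90PCT[channel]
    return math.exp(k * hours)

for h in (2000, 4000, 8000):
    r, g, b = (relative_output(c, h) for c in ("red", "green", "blue"))
    print(f"{h:>5} h: R={r:.3f} G={g:.3f} B={b:.3f}")  # B sinks fastest

Once the channels diverge that far, that's the drift I'm describing above: no single brightness adjustment can pull the white point back.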

Yep. 22,400 hours of use on this IPS monitor. I wonder why there's no endurance test for these new certifications...
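For scale, the arithmetic on those counters (the 5,000-hour OLED figure is the rough ceiling from my post above, not a measured spec):

HOURS_PER_YEAR = 8 * 365               # 8 hours a day, every day of the year

print(22400 / HOURS_PER_YEAR)          # ~7.7 years on this IPS, still accurate
print(5000 / HOURS_PER_YEAR)           # ~1.7 years before an OLED's blues go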
VESA does a lot of things, including the DisplayPort standard. Unlike HDMI, DisplayPort is royalty-free.
Wait, lmao... you just went through that whole ethering of VESA and earnestly thought, "well, they obviously don't know about DisplayPort, lemme fix that." I don't even particularly mind DisplayPort, the Betamax of A/V cables. I use it on the (better for color accuracy than VESA's new highest standard ever) 2007 HP monitors. And yes, I do sleep better knowing they're not charging the exorbitant, idk, I think 10 cents a device that HDMI charges.

Anyways, VESA is a bit like FIFA, or the NFL, or whichever industry-funded umbrella org you prefer. You may love watching the World Cup or the Super Bowl and cheer on a new rule you think is great, but they don't give two tosses about the customer's happiness. They have one job: to make the members who fund them more money. Any good idea is thoroughly coincidental. In this particular case, it's by saying there's a whole new, never-before-seen standard of excellence in OLED monitors, one you'll NEED to buy for several hundred to a thousand bucks, while knowingly burying the fact that the ability of those monitors to keep meeting the standards that earned them that fancy sticker will be measured in months. Very likely under 5,000 hours, and completely non-color-accurate, worthless e-waste by 8,000 hours max.

Even the standards they released only test that each shade the monitor produces meets a luminance mark, not the correct frequency/wavelength. That tells you everything you need to know about how unserious they are. Or how all testing is done at one static temperature, after a full warmup is completed, while the biggest problem with monitors not designed for professional work is how as little as a few degrees C can throw the color/brightness calibration wildly off.

So yeah, an extremely useless certification, with testing that only stresses panel refresh rate and whether they meet the absurd brightness levels nobody has ever asked for from desktop monitors. Every single brand-new Eizo pro monitor, with the exception of their pretty idiotic $28k idiot tax (the CG Prominence has a brightness of 1,000 nits, but I have never, ever seen one used by a working photographer; it's the monitor version of a track-day-only hypercar), every monitor they produce at the moment for creative/color-accurate production has a brightness of 350-450 nits. And those are still several thousand dollars (and all IPS).
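To make the luminance-versus-color point concrete: two patches can have identical lightness and still be visibly wrong. A minimal CIE76 delta-E check in Python (the Lab values are invented for the example, and CIE76 is the simplest delta-E formula, not the CIEDE2000 modern calibrators use):

import math

def delta_e_76(lab1, lab2):
    # CIE76: plain Euclidean distance in L*a*b* space.
    return math.dist(lab1, lab2)

target   = (50.0, 10.0, 10.0)   # hypothetical reference patch
measured = (50.0, 18.0,  2.0)   # identical L*, so a luminance test passes

print(delta_e_76(target, measured))  # ~11.3; around 2-3 is already visible

A certification that only checks the luminance component will happily pass that second patch.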
 
Thanks for affirming my decision to stick with IPS when I bought a new monitor last year. No, I don't need color accuracy, but I do fear monitor burn-in and discoloration, since I tend to keep monitors a long time and put a lot of hours on them. At least I run my monitors at lower brightness/contrast settings, which also seems more comfortable for my eyes.

I do sleep better knowing they're not charging the exorbitant, idk, I think 10 cents a device that HDMI charges.
Not only that, but the HDMI Forum has decided that features from v2.1 onward cannot be implemented in an open source driver.

That seems to have a lot to do with wanting to prevent unlicensed implementations from ripping off those features by studying the source code of open source drivers for other vendors' hardware.

Also, I wouldn't assume the license fee is flat. It could vary by device type and parameters, and especially might increase with newer HDMI versions. That could go some way toward explaining why the industry seemed stalled at HDMI 2.0 for so long.
 
10k-nit peak brightness TVs and monitors will become mainstream in a year or two; they're already sold in 4K. That is what it takes to look almost life-like, realistic, and fantastic. Next: 20k nits for highlights, 130-160" sizes, and 8K. That will be better than life-like.

Though for my typical everyday (or better to say every-night) usage when programming, my old 1,600-nit 8K QLED monitors are set to 2/50 backlight brightness :)

Somebody mentioned OLED here? I don't see them as an option, especially after RTINGS' "Longevity and Burn-In Updates and Results from 100 TVs". Just remember that one year contains about 8,800 hours, and more than half of that time your monitor is on.
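A quick sanity check on the hours (the more-than-half on-time is this post's assumption, not measured data):

hours_per_year = 24 * 365        # 8,760 hours in a calendar year
on_hours = 0.5 * hours_per_year  # assumed duty cycle from above
print(hours_per_year, on_hours)  # 8760 4380.0

At ~4,400 on-hours a year, RTINGS-style burn-in timelines arrive fast.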
 
Oh yeah! Now it can show arc welding so life-like that I need my welding goggles to watch it!
 
Whereas right now, when you see welding (or a glowing wire, or direct sun) on your screen, you have to convince yourself: "No, this is not an empty gray OLED screen, this is welding."