What Is HDR, And What Does It Mean For Your Monitor?

Status
Not open for further replies.

JonDol

Reputable
Nov 30, 2015
I'm wondering if Thomas Soderstrom has drunk too much and written his name wrong? If not, then welcome to THW :)
 

InvalidError

Titan
Moderator
"The goal, then, is to represent the desired color and brightness with as few bits as possible while also avoiding banding. For SDR, this was 8 bits per color"

I disagree: banding is already quite visible at 8 bits in the SDR spectrum when you have monochromatic color scales or any other steady gradient across a screen large enough to make the repeated pixel colors readily noticeable. It may have been deemed good enough at the time the standard was defined, back in the days of CRT SD TVs, where nearby pixels blended into each other to blur boundaries and the resolution wasn't high enough to clearly show banding. It isn't good enough on today's FHD LCDs, where pixel boundaries are precisely defined and an 8-bit gradient across 1920 pixels produces clearly distinguishable bands roughly eight pixels wide.
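The band widths quoted in this thread are easy to check. A minimal sketch (my own arithmetic, not from the article) of how wide each quantization band gets on an FHD panel at various bit depths:

```python
# Band width of a smooth full-width gradient quantized to a given bit depth,
# on a 1920-pixel-wide (FHD) display.

WIDTH = 1920  # horizontal resolution in pixels

for bits in (6, 8, 10):
    levels = 2 ** bits
    band_px = WIDTH / levels
    print(f"{bits}-bit: {levels} levels -> bands ~{band_px:.1f} px wide")
```

This reproduces both numbers from the thread: 8 bits gives 256 levels and bands 7.5 pixels wide, while 6 bits gives only 64 levels and bands 30 pixels wide.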
 
Feb 27, 2018
In your monitor reviews that support HDR, are you going to show:

1. Color charts for the results you are able to achieve during calibration.

2. Support for automated calibrations using the CalMAN program (running an automated calibration that stores the results in the monitor)?
 

Bill Milford

Reputable
Aug 17, 2015
This article is not just about HDR (High Dynamic Range); it is about two technologies, HDR and Deep Color (extended gamut, i.e. the bigger triangle), and their interaction. The article starts off talking about Deep Color, and I was wondering why the article's title wasn't about Deep Color.
 

plateLunch

Commendable
Mar 31, 2017
@TJ Hooker
No, the author has it right, though he should be talking about the "longest" wavelength of light, not the lowest.

Light frequency scales typically get drawn as a bar graph with longer wavelengths (lower frequencies) on the left and shorter wavelengths (higher frequencies) on the right. Depending on how the scale is annotated, I can see how one might end up saying "lowest wavelength", but the idea is correct.
 

TJ Hooker

Glorious
Herald

Still a pretty poor way to phrase it, given that the meanings of "longest" and "lowest" could almost be considered opposite. To me saying red has the "lowest" wavelength would imply that red's wavelength would be a lower number, which is obviously wrong. I don't think it makes any sense to refer to it as "lowest" just because it may appear on the left side of a graph.
 

InvalidError

Titan
Moderator

With 6-bit monitors, you'd see 64 bands in a gradient instead of 256, and those would be clearly distinguishable at much lower resolutions.

Banding was already a thing 20+ years ago on VGA CRTs with good enough sharpness to make it stand out.
 

lorfa

Distinguished
Mar 30, 2012
Would this have any effect on the alpha channel bits used by the OS? Or would an 8-bit alpha channel still be sufficient? With an 8-bit alpha channel, that would be 12*3 = 36 + 8 = 44 bits, such an ugly number. (I know this doesn't get sent to the display, but still curious.)
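For what it's worth, real framebuffer formats sidestep odd totals like 44 bits by padding channels to power-of-two pixel sizes. A quick sketch of some common GPU pixel layouts (format names follow Direct3D/OpenGL conventions; the bit totals are just arithmetic):

```python
# Bits per pixel for common framebuffer layouts. OSes and GPUs avoid odd
# totals like 44 bpp by picking channel widths that pack into 32 or 64 bits.

formats = {
    "R8G8B8A8":      (8, 8, 8, 8),     # classic SDR desktop format, 32 bpp
    "R10G10B10A2":   (10, 10, 10, 2),  # 10-bit color, 2-bit alpha, still 32 bpp
    "R16G16B16A16F": (16, 16, 16, 16), # half-float HDR with full alpha, 64 bpp
}

for name, channels in formats.items():
    print(f"{name}: {sum(channels)} bpp")
```

So in the 10-bit case the alpha channel actually shrinks to 2 bits to stay at 32 bpp, and HDR compositing pipelines typically jump straight to 16 bits per channel.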
 

Sal_Monella

Prominent
Jun 30, 2017
@PLATELUNCH No, the author does not have this statement correct. It is objectively wrong to say that a 700 nm wavelength (red light) is lower than a 390 nm wavelength (violet light). Red light has a lower frequency than violet light and frequency is inversely proportional to wavelength. It is common for the visible light spectrum to be graphically represented with wavelength on the x-axis, increasing with distance from the origin.
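The inverse relationship is just f = c / λ, which is easy to sketch numerically (my illustration of the point above, using the two wavelengths mentioned):

```python
# Wavelength-to-frequency conversion, f = c / wavelength. Red (700 nm) has a
# LONGER wavelength but a LOWER frequency than violet (390 nm).

C = 299_792_458  # speed of light in m/s

def freq_thz(wavelength_nm: float) -> float:
    """Frequency in THz for a wavelength given in nanometers."""
    return C / (wavelength_nm * 1e-9) / 1e12

red = freq_thz(700)     # roughly 428 THz
violet = freq_thz(390)  # roughly 769 THz
assert red < violet
print(f"red: {red:.0f} THz, violet: {violet:.0f} THz")
```

Which is exactly why "lowest" is ambiguous: red is lowest in frequency but highest in wavelength.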
 

chyaweth

Distinguished
Feb 12, 2011
Hmmm? Very interesting, but the hardware and software to display HDR etc. are at the end of the line, so to speak. What about the camera, or whatever, to capture the required level of detail?
 

InvalidError

Titan
Moderator

Not really a problem: image sensors with 12+ bits of resolution have been around for several years already. Professional equipment needed the extra bits a long time ago, because a wider dynamic range and resolution mean more chances of salvaging shots/takes with exposure issues during post-processing. On the color side of things, that usually gets reworked in post-production anyway, to compensate for a bunch of stuff from changes in lighting conditions and different cameras to the director's artistic vision.
 

bit_user

Splendid
Herald

I remember when I discovered banding was even discernible @ 8 bits, on a decent, properly-adjusted CRT computer monitor. I was pretty astonished, but I had made the test pattern myself and could clearly see many of the step boundaries where they were supposed to be.

Video commonly uses a smaller range for luma/chroma: 16-235 / 16-240. Conversion between this and 0-255 RGB can exacerbate the effects of banding.

BTW, if we used a linear gamma, banding (and other noise) in dark areas of the image would be much more noticeable than it is.
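Both effects are easy to sketch numerically (my own illustration of the two points above, using the nominal ranges and a simple power-law gamma of 2.2):

```python
# (1) Expanding video's limited luma range (16-235) to full-range 0-255:
# only 220 input codes exist, so 36 of the 256 output codes can never occur,
# which widens the effective quantization steps.
full = {round((y - 16) * 255 / (235 - 16)) for y in range(16, 236)}
print(f"reachable full-range codes: {len(full)} of 256")

# (2) How many 8-bit codes land in the darkest 10% of luminance, comparing a
# linear encoding with a simple 2.2 power-law gamma encoding.
GAMMA = 2.2
linear_codes = round(0.10 * 255)                  # linear: ~1/10 of the codes
gamma_codes = round((0.10 ** (1 / GAMMA)) * 255)  # gamma: far more codes
print(f"codes below 10% luminance: linear={linear_codes}, gamma={gamma_codes}")
```

The second print shows why gamma matters for banding: the gamma curve devotes roughly 90 of the 256 codes to the darkest tenth of the luminance range, versus about 26 for a linear encoding, so dark-area steps are much finer.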
 

bit_user

Splendid
Herald

Although some employ dithering to make it less noticeable.

Plasma TVs are notorious for dithering.
 

bit_user

Splendid
Herald

First, you'd still see the improvements of increased dynamic range.

Second, the difference between two monitors or TVs with significantly different color gamuts, viewed side by side, should still be apparent to you. Whether you have 2-color or 3-color vision, increasing the color gamut should have the effect of making what you see on the monitor that much closer to what you see in real life*.

* There are some people (mostly female, IIRC) with 4-color vision. To them, any 3-color display technology will always seem a bit "flat". At the extreme is the mantis shrimp, which can see in something like 12-color vision.

https://en.wikipedia.org/wiki/Mantis_shrimp#Eyes
 

bit_user

Splendid
Herald

Games are potentially an immediate beneficiary. Major game engines have used HDR rendering + software tone mapping for more than a decade.

High-end professional video production used 10 bits per channel as far back as the mid/late 1990s (though 8 bits was far more common). Even back then, the standard for digital post-production of film was 12-bit log or 16-bit linear, although that was more about avoiding the introduction of artifacts during the process.

Even today, not all 4K content is true HDR. This seems like a pretty good resource if you're curious about specific titles.

http://realorfake4k.com
 

Crashman

Polypheme
Editor
I'm wondering if Thomas Soderstrom has drunk too much and written his name wrong? If not, then welcome to THW :)
It's Swedish, and with that dish, you can figure out how to decipher it.

I've been in the forums since at least 2001 if not earlier and have been writing for Tom's Hardware since 2005, so maybe I should be welcoming you ;)
 
