News HDR10+ PC Gaming Coming in Upcoming Samsung Showcase

Status
Not open for further replies.

Blitz Hacker

Distinguished
Jul 17, 2015
61
22
18,565
They lost me @ F2P. All those games end up pretty much garbage. I think League is the only F2P game that is decent, provided you can deal with the toxic community.
 

txfeinbergs

Prominent
Mar 22, 2023
24
34
540
They lost me @ F2P. All those games end up pretty much garbage. I think League is the only F2P game that is decent, provided you can deal with the toxic community.
They lost me at HDR10+. What a useless standard created just because Samsung was unwilling to play ball with Dolby and had to do their own thing. I have yet to buy a Samsung TV as a result of them not supporting Dolby Vision.
 
They lost me at HDR10+. What a useless standard created just because Samsung was unwilling to play ball with Dolby and had to do their own thing. I have yet to buy a Samsung TV as a result of them not supporting Dolby Vision.
Samsung did it with Amazon, but it's an open, royalty-free format; Dolby Vision has something like a $2,500 annual fee for content creators xD

HDR10+ adoption has gotten quite a bit bigger, so it's not only Samsung TVs that support it: Panasonic, Toshiba, TCL, and a few others do too.
 

bit_user

Polypheme
Ambassador
For such a relatively old standard, I'm surprised it seems to be so uncommon. I've been shopping for an HDR monitor for at least the past 4 years, and I don't remember seeing the term. Even a few of the latest & greatest HDR gaming monitors I checked don't mention it. Is that because it's so common that it's not a differentiator, or because it's really that rare?

It gives me bad memories of HDMI Deep Color support, which nothing really seemed to use. Even though my TV and PS3 both had it, there was never any indication of it being used.

On a related note, I managed to buy one Blu-ray that had xvYCC and used my PS3 to play it on my TV. I could believe the color gamut was better, but I'd probably have needed to see it side-by-side with standard BT.709 to appreciate the difference.
 
Last edited:
For such a relatively old standard, I'm surprised it seems to be so uncommon. I've been shopping for a HDR monitor for at least the past 4 years, and I don't remember seeing the term. Even a few of the latest & greatest HDR gaming monitors I checked don't mention it. Is that because it's so common that it's not a differentiator, or because it's really that rare?

It gives me bad memories of HDMI Deep Color support, which nothing really seemed to use. Even though my TV and PS3 both had it, there was never any indication of it being used.

On a related note, I managed to buy one blu-ray that had xvYCC and used my PS3 to play it on my TV. I could believe the color gamut was better, but I'd have probably needed to see it side-by-side with standard BT.709 to appreciate the difference.
Most HDR content is HDR10 or HLG; even Windows is just plain HDR10. Nvidia added driver support for both HDR10+ and Dolby Vision; AMD is a little behind.

But still, how many Dolby Vision monitors are there? And even if Dolby Vision is on paper better than HDR10+, there's not a single panel that fully supports Dolby Vision in its full range. 10k nits? 68B colors? Hmm, on paper it looks cool.

HDMI Deep Color is just RGB full range (full being the PC-monitor norm, limited being the TV norm). On TV content you shouldn't notice a difference, since that's RGB-limited content, but on PC it should be noticeable.
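As an aside on the full-vs-limited distinction: in 8-bit terms, limited ("TV") range puts black at code 16 and white at 235, while full ("PC") range uses 0–255. A minimal sketch of the expansion a display or GPU driver performs (simplified; real pipelines also treat chroma and out-of-range codes differently):

```python
def limited_to_full(code: int) -> int:
    """Expand an 8-bit limited-range (16-235) value to full range (0-255)."""
    scaled = (code - 16) * 255 / (235 - 16)   # linear rescale of the legal range
    return max(0, min(255, round(scaled)))    # clamp blacker-than-black / whiter-than-white

print(limited_to_full(16))   # limited black  -> 0
print(limited_to_full(235))  # limited white  -> 255
print(limited_to_full(126))  # mid-grey stays near mid-scale
```

If a display interprets full-range input as limited (or vice versa), you get the classic washed-out blacks or crushed shadows people see with mismatched PC/TV range settings.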
 
  • Like
Reactions: bit_user

scottslayer

Prominent
Feb 8, 2023
58
62
610
The currently active Crossplay Beta will have run for about a year when the Open Beta comes out.
All they are doing is taking away the formality of clicking the apply for invite button on Steam.
 

emike09

Distinguished
Jun 8, 2011
165
159
18,760
They lost me at HDR10+. What a useless standard created just because Samsung was unwilling to play ball with Dolby and had to do their own thing. I have yet to buy a Samsung TV as a result of them not supporting Dolby Vision.
HDR10+ and Dolby Vision are pretty much the same thing, with tiny differences that don't matter. For example, while DV supports a 12-bit color depth and HDR10/10+ support 10-bit, it's nearly impossible to tell the difference on moving content. Running 12-bit requires more processing power on both the GPU and the display. Professional colorists grading final releases would want 12-bit during the grading process, but even in the theater, nobody is going to notice whether the final output is 10-bit or 12-bit. Even 8-bit looks good, as long as it's not a dithered 6-bit-to-8-bit conversion; that's sloppy, looks terrible, and new cheap TVs are still shipping with it.
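For scale, the color counts behind those bit depths are simple arithmetic; this is where the "68B colors" figure for 12-bit comes from:

```python
# Representable colors for an RGB display at a given per-channel bit depth.
def color_count(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel   # shades per channel
    return levels ** 3               # R x G x B combinations

for bits in (6, 8, 10, 12):
    print(f"{bits}-bit: {color_count(bits):,} colors")
# 6-bit:  262,144
# 8-bit:  16,777,216  (~16.7M)
# 10-bit: 1,073,741,824  (~1.07B)
# 12-bit: 68,719,476,736 (~68.7B)
```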

The important thing is that HDR10+ is open and requires no royalties to use. I'm all for open standards. While Dolby Vision is better on the spec sheet, you're just paying Dolby extra for something that doesn't matter in the real world. 8K displays are sharper than 4K displays, but virtually every AV professional will agree that you don't need 8K for anything, and when watching content on the two at a normal viewing distance, you cannot tell the difference.

Ultimately, what separates plain "HDR" from HDR10+ and DV is dynamic metadata. It's a feature no game has ever shipped with. Being able to dynamically and automatically adjust HDR content to that very particular moment would be a revolutionary addition to the HDR world of gaming, and something I very much welcome.

An example of standard HDR in PC gaming is RDR2. No dynamic metadata. It's not the greatest implementation of HDR and requires proper tuning to get just a basic decent scene, but it's not the worst either. Some scenes absolutely shine with realism, brilliant bright skies, excellent color and contrast. Other scenes are muddled, grey, too dark, too bright, with terrible contrast. This is where HDR10+ or DV would fix things, but DV requires licensing and a display that supports it, and most devs would say "hard pass".

One size does not fit all when it comes to HDR, and that's where HDR10+ and DV's dynamic metadata step in.
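The effect of dynamic metadata can be sketched with a toy tone mapper. The numbers here are hypothetical and this is nothing like the actual HDR10+/DV transfer curves; it only illustrates why per-scene peak information helps:

```python
def tone_map(nits: float, scene_peak: float, display_peak: float = 1000.0) -> float:
    """Toy tone mapper: scale a pixel's luminance so the scene's peak lands at
    the display's peak, then clip. Real HDR10+/DV curves are far more
    sophisticated; this only shows what per-scene metadata buys you."""
    return min(nits * (display_peak / scene_peak), display_peak)

# A dim scene whose brightest pixel is 100 nits, in a movie mastered at 4000 nits:
static = tone_map(100, scene_peak=4000)   # static metadata uses the whole movie's
                                          # 4000-nit peak -> scene is crushed to 25 nits
dynamic = tone_map(100, scene_peak=100)   # dynamic metadata uses this scene's own
                                          # peak -> full 1000-nit display range used
```

With one static peak for the whole title, dim scenes get crushed and bright scenes clip; per-scene metadata lets each scene use the display's full range, which is exactly the RDR2-style inconsistency described above.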
 
Last edited:

emike09

Distinguished
Jun 8, 2011
165
159
18,760
"...and that starts with The First Descendant - a third-person looter shooter"

Third person?? Yawn.

Carry on.
Many of us prefer 3rd person, especially on big displays or when story and character development are the heart of the content. This being F2P, I don't see the story being big, but there's no need to hate on a good 3rd-person game. Unless the game is just bad.
 
For such a relatively old standard, I'm surprised it seems to be so uncommon. I've been shopping for a HDR monitor for at least the past 4 years, and I don't remember seeing the term. Even a few of the latest & greatest HDR gaming monitors I checked don't mention it. Is that because it's so common that it's not a differentiator, or because it's really that rare?
I'm not aware of any monitors supporting HDR10+ or Dolby Vision that I've found interesting. They all seem to be limited to the VESA DisplayHDR specs, which only mandate HDR10.
 
  • Like
Reactions: bit_user

SyCoREAPER

Honorable
Jan 11, 2018
853
318
13,220
HDR10+ lost. I had to face that realization when I had my Samsung TV.

Dolby Vision won, and it's only a matter of time before it's available on monitors/for gaming now that this has kickstarted things.

I'm not aware of any monitors supporting HDR10+ or Dolby Vision which I've found interesting. They all seem to be limited to the VESA HDR specs which only mandates HDR10.
Even then, a lot of monitors fall short of proper HDR unless you spend big money.
 
Last edited:

bit_user

Polypheme
Ambassador
It is, but I'm concerned about their burn-in. Quite a few news outlets (and individuals) have faced burn-in with early-adopter units
Yes, me too. That's one reason I'm sitting out this round of OLED monitors.

Is what you heard about the latest generation of OLED monitors, or do you mean "early" in the sense of models from prior years?
 

SyCoREAPER

Honorable
Jan 11, 2018
853
318
13,220
Yes, me too. That's one reason I'm sitting out this round of OLED monitors.

Is what you heard in regards to the latest generation of OLED monitors, or do you mean "early" in the sense of models from prior years?
Early specifically to mini-LED.

I'd even be terrified of running a regular OLED for gaming. My C1 is obviously an older generation of OLED panel. It's amazing, but if I leave the home screen with my apps panel open, I get ghosting that goes away. And if I get ghosting just from that short duration, imagine a HUD if you play multiple hours a day, every day.
 
  • Like
Reactions: bit_user

SyCoREAPER

Honorable
Jan 11, 2018
853
318
13,220
There have supposedly been improvements, but I wonder how well they truly solved the problem. There are now plenty of OLED gaming monitors, so I guess we'll know in a couple years' time.
It's definitely improved but it's not burn-proof.

Hopefully it hasn't been implemented as a form of planned obsolescence at this stage, and has truly improved enough to have some longevity.
 
There have supposedly been improvements, but I wonder how well they truly solved the problem. There are now plenty of OLED gaming monitors, so I guess we'll know in a couple years' time.
The only thing they can really do is even out the wear and minimize the impact of usage. No matter what, OLED is going to dim or develop burn-in over time. Micro-LED is the replacement technology, if they can viably make panels at scale.
 

bit_user

Polypheme
Ambassador
Hopefully it's not been implemented as a form planned obsolescence at this stage and truly has improved enough to have some longevity.
If it only withstands something like 2 years of moderate use, I'm sure we'll hear a lot of complaints. On the other hand, if a monitor holds up to 5 years of heavy use, that should be enough.

I usually keep my monitors longer than that, but I think 5 years would be an acceptable lifespan.
 

SyCoREAPER

Honorable
Jan 11, 2018
853
318
13,220
If it withstands like only 2 years of moderate use, I'm sure we'll hear a lot of complaints. On the other hand, if a monitor holds up to 5 years of heavy use, that should be enough.

I usually keep my monitors longer than that, but I think 5 years would be an acceptable lifespan.
Agreed.

Where this will be vital is laptops. If they botch the laptop market with 2-year life spans or less, they will smear the entire landscape.

Price is also going to be a hurdle. I can get a nice big @ss TV, almost 2 TVs, for the price of a smaller but good monitor. 🤷

Hoping for the best.
 