Dell S2718D HDR Monitor Review

Status
Not open for further replies.

LionD

Reputable
Aug 19, 2014
6
0
4,510
0
How could an 8-bit/sRGB display with 1000:1 contrast and no local dimming deliver a true HDR experience? Total nonsense.
 

CarbonBased

Prominent
Apr 20, 2017
5
0
510
0
@GENTLEMANGREEN

Lots of people have plenty of use for 60 Hz screens. Stop poo-pooing products that clearly aren't aimed at you. I have a rig for gaming, and sure, 60 Hz isn't really enough anymore. However, I take and edit photos as a hobby, so IPS, 10-bit, and HDR are all very attractive features. Add the fact that I can mate it to my photo-editing laptop with a single USB-C cable and we're really getting somewhere. I'll be looking for this one come holiday season.
 

cbliss

Distinguished
Nov 26, 2010
5
0
18,510
0
NOT AN HDR MONITOR. FALSE ADVERTISING. BUYERS BEWARE! (HDR requires a 10-bit panel; this is 8-bit. It also lacks any form of local dimming.) A bogus product for HDR, and otherwise simply an overpriced QHD monitor.
 

CarbonBased

Prominent
Apr 20, 2017
5
0
510
0
Fair enough, I didn't realize it was 8-bit instead of 10-bit. But I stand by my point that 60 Hz is fine for many, if not most, computer users, even some gamers. The market for high refresh rates is specifically gamer-centric. Dissing products that aren't built to gamer spec because you are a gamer doesn't make you an unbiased source of opinion.
 

Scott____67

Prominent
Jul 9, 2017
1
0
510
0
I like to wall-mount my monitor anyway, so the stand is a non-issue. In a condo it keeps desk space and other areas clear, and having a little height with a downward pitch is perfect for the lean-back-in-the-chair gamer that I am.
 

alextheblue

Distinguished
Apr 3, 2001
2,883
24
20,795
2

Agreed. A 60 Hz monitor isn't great for gaming anymore, so for my personal needs and budget I'm better off with a halfway decent TN panel with a high refresh rate, a wide FreeSync range, and low input lag. That might change in the future as advanced displays come down in price, but today that's what best fits my needs.

But as you said most non-gaming applications don't need high refresh rates. Users who don't game will typically favor resolution, contrast, brightness, viewing angle, and color reproduction over refresh rate and input latency. If you have a sub-$300 budget like I do you often end up with a display that either favors gaming performance and features, or image quality and advanced colorspaces. Just because you favor a high-refresh gaming monitor doesn't mean you can't recognize uses for a non-gaming display.

Granted if you spend enough money you can get a display that doesn't compromise much and is fairly good at everything. Way out of my price range at this point, though.
 

alextheblue

Distinguished
Apr 3, 2001
2,883
24
20,795
2
To see HDR content, you’ll need a compatible player or computer with an HDMI 2.0/HDCP 2.2 output. The latest Ultra HD Blu-ray players feature this interface. You can also connect with the right video card. Fortunately, there are quite a few choices. On the Nvidia side is the GTX 950 up to the Titan X (Maxwell), or the GTX 1050 to Titan X (Pascal). AMD users can employ an R9 390X or RX 460, 470, or 480.
I thought anything with Polaris supported HDR10, including the Radeon 540/550 (Polaris 12). Maybe I'm misremembering. Also, on PC you have to use HDR10-compatible playback software to benefit.

On the console side of things, Xbox One S has supported Ultra HD (4K HDR10) BDs for some time. If I was looking for a dedicated box, it's a good choice even if you don't play console games. It's not much more than a decent dedicated 4K HDR10 player, and it has better support for apps. You can add a Kinect if you want voice control. If you don't use physical discs but want a dedicated box for 4K HDR streams, then I'd recommend a Roku Premiere+ or Ultra.
 

i-am-i-u-r-u

Commendable
Aug 22, 2016
7
0
1,510
0
Way overpriced, and phony marketing to call it HDR. The after-calibration black luminance of 0.2663 nits and contrast ratio of 761:1 are pitiful.
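For anyone who wants to sanity-check those figures: static contrast ratio is just white luminance divided by black luminance. A quick sketch in Python; note the ~203-nit white level is my assumption, back-computed from the numbers quoted above, not a figure taken from the review:

```python
# Static contrast ratio = white luminance / black luminance.
black_nits = 0.2663   # after-calibration black level quoted above
white_nits = 202.7    # assumed calibrated white level (back-computed)

contrast = white_nits / black_nits
print(f"{contrast:.0f}:1")  # roughly the 761:1 figure quoted above
```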
 


I don't get why they would give this relatively high-end monitor such a mediocre non-removable stand and no VESA mount. It might make the monitor look relatively nice when viewed from the back and side, but that's not likely to be relevant in most usage scenarios, where the back will be facing a wall. Making the monitor slightly thinner is largely pointless when it's at the expense of functionality.

Also, the monitor's main feature seems only half-implemented. It's supposedly an "HDR monitor," but it has a weak static contrast ratio. Maybe it will often look better than a standard IPS screen when fed HDR content, but a VA panel would probably look better overall, even without support for 10-bit input. This monitor reminds me of those standard-definition televisions from a decade ago that would accept an HD signal but then downscale it to the screen's SD resolution; only here, we're taking a high-dynamic-range image and displaying it on a screen with mediocre contrast. It might be a decent monitor for what it is, but it seems a bit overpriced considering what it offers.
 

bit_user

Splendid
Herald
I feel like I've been reading about >= 10-bit and HDR for like 10 years. HDMI has supported 30-bit (10 bits per channel) and 36-bit (12 bits per channel) "deep color" for about that long, and I thought there were supposed to be monitors that supported it.
 

jn77

Distinguished
Feb 14, 2007
580
0
18,990
2
My still cameras record 14-bit raw, and Photoshop and Capture One work with 12- and 14-bit files. My video cameras record 12-bit raw. But I don't own a business, and I'm not going to spend $6,000 on a computer monitor.

240 Hz 3D panels with 10-, 12-, and 14-bit color are way overpriced and need to come down. The same goes for OLED.

It's just a monitor, not even a TV. It's a dumb screen that displays whatever you throw at it.

Look at how TVs depreciate. The whole system is rigged to make you pay for stuff that's already obsolete and keep consumers on the hook for upgrades.
 

shrapnel_indie

Distinguished


Manufacturers all tend to use different formulas to calculate contrast ratio, and they frequently change their own formulas. That makes contrast ratios more or less irrelevant, unless this practice has changed and a universal formula has been agreed upon.
 

shrapnel_indie

Distinguished
Let's see...

IPS: good
GtG response: 6 ms - outside what's considered good for gaming (5 ms or less)
Refresh rate: 60 Hz - the bare minimum to consider for gaming, and only for really low-budget builds
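To put those numbers in perspective, here's a rough frame-time comparison in Python; the 5 ms and 60 Hz thresholds are the ones mentioned above, and nothing here comes from the review itself:

```python
# Frame interval in milliseconds for a given refresh rate.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# At 60 Hz each frame lasts ~16.67 ms, so a 6 ms GtG response fits
# comfortably inside one frame; at 144 Hz the interval shrinks to
# ~6.94 ms, which is why high-refresh gaming panels aim for 5 ms or less.
```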


No, not a gaming monitor.
No, not quite a developer's monitor.
Office monitor? Too pricey.
Executive's monitor? Now I think we have something: a monitor aimed at the hipster exec who wants to look good flaunting the latest tech.
 

ceberle

Contributing Editor
Editor
Dec 20, 2012
288
0
10,780
0
I tried to be clear in the review that this monitor correctly processes HDR signals but with its low native contrast and edge backlight, it doesn't deliver an optimal HDR experience. There are plenty of televisions that offer similar performance.

The only way an LCD panel will do justice to HDR is with a zone-dimming backlight. I've recently seen demos of upcoming screens from Asus that have this feature. They look stunning to say the least. I also have the UP2718Q from Dell that has a 384-zone backlight with 1000nits, 10-bit color, and Ultra HD. That review will appear soon.

I realize the S2718D is an early effort. It'll only get better from here!

Christian
 

JTWrenn

Distinguished
Aug 5, 2008
59
0
18,640
2
Why would anyone buy this when a 32" 4K HDR monitor from Samsung is right around the corner for $699? The UH850 seems like a much better deal.
 
This is a fraudulent marketing gimmick that attempts to mimic HDR. While there is no definitive IEEE-type standard for "HDR," the fact is that real HDR displays (HDTVs as well as PC monitors) use higher-caliber panels, starting with 10-bit support and extending to color gamut, brightness, and contrast-ratio expectations.

While there is no single official HDR standard, three formats are generally recognized: HDR10, Dolby Vision, and the newest, Samsung's HDR10+. HDR10 is the one backed by the UHD Alliance, which is pushing to make it the standard, much as Blu-ray won out over HD DVD, and puts its "Ultra HD Premium" stamp of approval on qualifying products.

This monitor fails on all counts and should not even have "HDR" in its description. Fraud, if you ask me, and I'm very disappointed as a long-time Dell panel buyer. Otherwise the design looks great, with a thin bezel and frame, but sacrificing height adjustment is inexcusable in this price range. They sacrificed function for style there, and that's a fatal flaw for many.
 
Yep, VERY DISAPPOINTED with the "HDR" label. The main point of HDR is excellent black levels, for which you'd frankly want at least 3000:1, plus 10-bit support (or higher) for the wider color range, which helps prevent things like banding.
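As a rough illustration of why bit depth matters for banding (my own example, not a figure from the thread): each extra bit doubles the number of code values per channel, so the steps between adjacent shades get smaller.

```python
# Distinct code values per color channel at common panel bit depths.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit -> {levels} levels per channel")

# 10-bit gives 1024 levels versus 256 at 8-bit: four times as many
# steps between black and white, so smooth gradients are far less
# likely to show visible banding.
```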
 