4k UHD vs. HDCP: A Netflix yak shaving story

May 6, 2021
Hi. Don't know if this is the right place for this story (apologies if it's not), but I really need to post it somewhere.

So it all began when I bought a 4k monitor for my PC.
Naturally, I wanted to watch videos in 4K, because some of them are breathtaking.

So I upgraded my Netflix subscription from Basic to the all-inclusive 4K one, downloaded the Windows 10 app, made sure the monitor was set to 60Hz, set the streaming quality to High and browsed the titles for "4k".
I was met with the following image under the title:
[screenshot: HD badge shown under the title]


So the HD badge clearly stated that the video was in HD, not the UHD 4K I expected.
After some head scratching, I contacted Netflix support. They directed me to the same list of prerequisites I had already checked and, after some interrogation, admitted that Netflix does not have a tool that would comprehensively tell me which boxes I still need to tick on top of the ones I've already ticked.
In other words: their (ultra sleek and simple) software won't tell me what it wants from me.

So I tried different configurations, unplugged stuff, plugged in other stuff, you know, the basics. I even carried my 20kg PC down to the large flat screen TV to discover that the TV had no issues playing 4K on Netflix, unlike my monitor.

Searching the internet, I came across some articles from people who had already walked this path and found out that I need HDCP.
What's HDCP? It's High-bandwidth Digital Content Protection. Both your monitor and your graphics card need to support it for Netflix to play UHD on your monitor.
My Nvidia control panel proudly stated that the graphics card and the display are HDCP capable:
[screenshot: Nvidia Control Panel reporting that the system is HDCP capable]


So after some more head scratching, I downloaded CyberLink's Ultra HD Blu-ray Advisor, and it said that actually, no, HDCP 2.2 (GPU/Display) is not available!
I asked for a second opinion from ArcSoft BD & 3D Assistant, which corroborated the story, stating that the connector type is not HDCP compliant.

After some more tinkering, unplugging stuff and plugging other stuff in, I found out that using a DisplayPort cable with my monitor (LG 43UN700-B) is a no-no, and only an HDMI cable on the third or fourth port will give me an HDCP-compliant 60Hz connection.
Of course the specification sheet for this monitor doesn't mention that (or if it does, I'm way too dumb to find it).

So I now have the monitor on HDMI, and the ArcSoft and CyberLink tools both agree that the connection is HDCP compliant.
So why does Netflix disagree? Why do I still only have the HD badge under my videos?

Enter even more tinkering, searching, unplugging stuff and plugging other stuff in, and I found out that the problem was that I have more than one monitor!
I bought a new monitor, but I've still got two old ones and an Nvidia 1080, so I'm not going to throw the others away, am I?
I suppose the whole setup is not HDCP compliant because I could start the video on one monitor and then drag the application onto a device with a splitter or a recorder attached, or something.

So there are two things I needed to do there:
  1. Unplug both side monitors.
  2. Restart the Netflix application.
Failing to do either results in the badge remaining in HD and Netflix refusing to stream 4k.
But when I only have the primary monitor plugged in and start Netflix fresh, I finally have the coveted UHD badge under some of the titles:
[screenshot: UHD badge shown under the title]


Now. Do I really want to crawl behind my computer every time I want to watch a movie? No, of course not; let's do it programmatically.
A quick Google search gave me this tip, which has a comment that says: use Win+P and select "PC screen only".
Great, right? I select "PC screen only", it turns off the secondary and tertiary monitors, and the setup is HDCP compliant!
Wrong. The "PC screen" is determined by the monitor enumeration. You know how your monitors get numbers assigned in your display settings?
[screenshot: Windows display settings with numbered monitors]


So if your primary monitor is 2, then the only thing you can do is rearrange your outputs so that your primary is the first one.
Except my Asus graphics card unfortunately has the following output hierarchy: DVI > DP > HDMI. And it only has 2 HDMI outputs. And I can't have the primary on the DP, since it fails the HDCP check.
Apparently you can't modify the monitor enumeration, and I'm not the only one who's tried, according to this thread; there was another thread on the Nvidia forums too, which I unfortunately can't find anymore.
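If you're curious what order Windows actually enumerates things in, here's a minimal Python sketch (my own, not from any of those threads) that calls the Win32 EnumDisplayDevices API through ctypes and lists the display adapters in enumeration order, flagging the primary:

```python
import ctypes
from ctypes import wintypes

# Win32 DISPLAY_DEVICE state flags (from wingdi.h).
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

def list_displays():
    user32 = ctypes.windll.user32
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        # Passing None as the first argument enumerates the adapters themselves.
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        flags = []
        if dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            flags.append("active")
        if dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE:
            flags.append("PRIMARY")
        print(f"{i}: {dev.DeviceName} ({dev.DeviceString}) {' '.join(flags)}")
        i += 1

if __name__ == "__main__":
    list_displays()
```

On my machine this is exactly the order the Win+P logic seems to follow, which is why "PC screen only" picks the wrong one.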

So I can forget about Windows' built-in solution for turning off monitors. Let's look at the Multi Monitor Tool from the tip I mentioned earlier.
After installation I found that it does indeed work for turning off my monitors. In principle. After turning them back on, the coordinates were all screwy and I needed some additional effort to fix them.
I'm not going to do that every time I want to watch a movie, am I? There is an option to save monitor profiles, though, which mostly works for this use case and can even be scripted (a sketch follows), BUT!
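Here's roughly what that scripted dance could look like. This is a sketch under a few assumptions: MultiMonitorTool's documented /disable and /LoadConfig command-line switches, made-up paths, and my monitor names; the Netflix AUMID is whatever your own system reports (check with Get-StartApps in PowerShell):

```python
import subprocess

# All paths and monitor names below are assumptions -- adjust for your setup.
MMT = r"C:\Tools\MultiMonitorTool.exe"
FULL_DESK = r"C:\Tools\all-monitors.cfg"  # saved once beforehand via /SaveConfig

def movie_mode():
    # Disable the two side monitors so only the HDCP-compliant primary stays on.
    # Monitor names come from MultiMonitorTool's own list; yours will differ.
    subprocess.run([MMT, "/disable", r"\\.\DISPLAY2", r"\\.\DISPLAY3"], check=True)
    # Netflix must be started *after* the display change, or the badge stays HD.
    # The AUMID is an assumption -- verify yours via Get-StartApps.
    subprocess.run(
        ["explorer.exe",
         r"shell:AppsFolder\4DF9E0F8.Netflix_mcm4njqhnhss8!Netflix.App"]
    )

def desk_mode():
    # Restore the full three-monitor layout from the saved profile.
    subprocess.run([MMT, "/LoadConfig", FULL_DESK], check=True)

if __name__ == "__main__":
    movie_mode()
```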

But what about the windows I have open? Both Windows' native method and the Multi Monitor Tool throw the open programs onto my primary monitor, and Windows won't bother to put them back when the side monitors are plugged back in.
Which means a good amount of additional manual work after I'm done with the movie.
Luckily, there are people who have dealt with similar situations when docking and undocking their laptops, and the solution is DisplayFusion Pro (wish I'd known about it six years ago). And it even kinda sorta works!
So I threw 20€ at them, they gave me a license, and it turns out it also has monitor profiles that can achieve the same thing as the Multi Monitor Tool, so I don't need two separate applications for this!

Fantastic, so now I'm set. I found a viable solution. And it only took me three days, 20 bucks and a whole lot of confusion. So what if I now need to manage two profiles for DisplayFusion, and so what if I get a light show every time I select either monitor profile? I got 4K videos on Netflix in return, right? And it only takes like half a minute for my displays to stop having a stroke after disabling and re-enabling any of them!
All of this because of HDCP. The protocol that accomplishes nothing and benefits no one. The protocol that will burn some energy and produce no valuable result. [redacted]

P.S.: Since we're on the topic of Netflix and their HD/UHD 4K badges: even when you finally have the 4K badge, it doesn't necessarily mean that all of the episodes in a series will actually be in 4K. Press CTRL+SHIFT+ALT+D and check whether "Playing bitrate" shows 3840x2160:
[screenshot: Netflix debug overlay showing the playing bitrate and resolution]

I found that the latest episodes of The Blacklist are in 1080p :)

P.P.S.: I just found out about Win+P and "Second screen only", which apparently does properly select my primary screen and disables both secondaries. The open-application problems are still there, though, so DisplayFusion stays.
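For anyone who'd rather script that than press Win+P: DisplaySwitch.exe ships with Windows 10 and, as far as I know, accepts /internal, /clone, /extend and /external as arguments, with "Second screen only" being /external. A rough sketch:

```python
import subprocess
import time

DISPLAY_SWITCH = r"C:\Windows\System32\DisplaySwitch.exe"

# "Second screen only" -- disables everything except the second display.
subprocess.run([DISPLAY_SWITCH, "/external"], check=True)
time.sleep(5)  # give the displays a moment to settle before starting Netflix

# ...after the movie, go back to the full extended desktop:
# subprocess.run([DISPLAY_SWITCH, "/extend"], check=True)
```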
 
Welcome to the forums, newcomer!

Out of everything that's been said and done, I'm curious to learn the make and model of your Nvidia GPU, your OS version, and the makes and models of all your panels.
 
GFX cards:
PCI Express 3.0 x16: Asus ROG Strix-GTX1080 Series (The top one, which the monitors are plugged into)
PCI Express 3.0 x16: Asus ROG Strix-GTX1080 Gaming (in SLI, bought used)

OS:
Edition Windows 10 Pro
Version 20H2
OS build 19042.928
Experience Windows Feature Experience Pack 120.2212.551.0

Monitors:
LG 43UN700-B (Main)
AOC U2879VF
Asus VA249HE

A general mishmash of stuff I got on discount.
I already tried this dance with the AOC panel, but gave up after a day, so the above is really an accumulation of experiences over a few months (on and off) and across two monitors.
 
@JakartaJames
I'm just curious, but are you on a CPU with an Intel iGPU that supports 4K 60Hz? If so, have you enabled multi-monitor display out in the BIOS and tried running your video playing programs on that for UHD 4K? It should be able to display out of the GTX 1080, but if not, you would still need to hook up another display cable to the iGPU output. It could be a potentially easier solution to the issue.

Open Windows 10 Settings > Display > Graphics settings, add your video players as either classic apps or Universal apps, and then set each one to use the iGPU as the power saving GPU instead of your main GTX 1080 as the high performance one.
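If you'd rather script that than click through Settings, the per-app preference appears to live in the registry under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, with the value name being the full path to the exe and the data being "GpuPreference=1;" for power saving (iGPU) or "GpuPreference=2;" for high performance. A sketch with a placeholder path:

```python
import winreg

# Placeholder path -- point this at the video player you actually use.
APP = r"C:\Program Files\VideoLAN\VLC\vlc.exe"

# 1 = power saving (iGPU), 2 = high performance (dGPU).
with winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
) as key:
    winreg.SetValueEx(key, APP, 0, winreg.REG_SZ, "GpuPreference=1;")
```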

Also, last I knew, you have to use Microsoft Edge for Netflix 4K on Windows 10 if you want to try it without the Netflix app.
 
This has been an ongoing issue with Netflix and other streaming services for a good while. A large portion of the incompatibility is directly related to efforts to keep people from ripping (pirating) video at that resolution via the PC they are using to stream. It's been ongoing for years, and each time they come out with a list of prerequisites, there is always some magic something standing in the way.
For many folks who 'think' they are streaming 4K Netflix (and other apps) from a PC to a monitor, it's really just scaling 1080p and fooling the TV into reporting 4K. I would suggest that if you are sitting farther than a couple of feet from your TV, you wouldn't know the difference anyway.
 
I am, yes, an i7-8700. I just tried hooking the main monitor up to the integrated graphics and applying the setting in the graphics preferences, but it's the same problem, just HD - probably because the rest of my stuff is still plugged in and enabled. I also want to play some games at some point, so I have the primary hooked up to both outputs, and switching between the two is a bit of a hassle; both are also active at the same time in Windows, which gives me four desktops, one of which I can't see 😀
I dunno, maybe it's possible to get it working that way with some more effort, but I'm just gonna stick with what I have for now.
The Windows 10 app is legit though; it's stated as supported in Netflix's instructions and 4K does work in it. The reason I'm not using Edge is that it sometimes cripples my system when playing Netflix: out of the blue the screens can stop working, or the keyboard stops responding and the video stops, so I stopped using it.