It's usually handled by the software, so the CPU, or the GPU if the software is hardware accelerated (to the best of my knowledge). You're only outputting 1080p anyway; the output resolution doesn't change to 4K to be downscaled by the display.
It should actually look a bit better than native 1080p, especially if we are talking about video from streaming services, which tends to be heavily compressed to save bandwidth. YouTube, for example, puts relatively heavy compression on their 1080p video, making it appear somewhat blurry when played full-screen on a 1080p display. Increase the resolution of the stream to 1440p on supported videos and it will look substantially sharper; set it to 4K and it can look a bit sharper still. Native 1080p on YouTube actually looks more like 720p in terms of detail, so it's usually worth watching videos at a higher-than-native resolution if you care about image quality and have bandwidth to burn.
For something like a Blu-ray, though, you are less likely to see much of an improvement, since Blu-rays are not compressed nearly as much. Even a 1080p Blu-ray typically offers a higher bitrate than a 4K stream on YouTube.
I have noticed this with blurry YouTube videos too.
- Do you have a clue why YouTube does this (the compression) if the videos look blurry afterwards? Don't they know it looks blurry?
- How is the YouTube video quality on phones? Can people watch 4K on phones too?
> It should actually look a bit better than native 1080p, especially if we are talking about video from streaming services, which tends to be heavily compressed to save bandwidth. YouTube, for example, puts relatively heavy compression on their 1080p video, making it appear somewhat blurry when played full-screen on a 1080p display.
But for full-res video with minimal compression, the difference is mostly illusory. When a studio downsamples a 4K video to 1080p for distribution, they do it with a high-quality encoder that takes on the order of 2-10 hours per hour of video, figuring out the best way to mash four pixels of data down into one.
When you play a 4K video on a 1080p screen, the image is downscaled in software or by a scaler built into the GPU. These are designed to work as quickly as possible, since they have to run in real time, and they use a generic algorithm that doesn't consider at all what the image is actually trying to show.
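To illustrate, here's a minimal sketch of the kind of generic, content-blind scaling a real-time path might use: a plain box filter that averages each 2x2 block into one pixel. The grayscale list-of-rows format and the function name are made up for the example.

```python
def box_downscale(img, factor=2):
    """Average each factor x factor block of pixels into one.

    This mimics the simplest kind of real-time scaler: a generic
    averaging filter with no knowledge of what the image shows.
    `img` is a list of rows of grayscale values (toy format).
    """
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 4x4 "image" reduced to 2x2: each output pixel is just the mean
# of its 2x2 block, regardless of what detail that block contained.
small = box_downscale([[0, 0, 8, 8],
                       [0, 0, 8, 8],
                       [4, 4, 2, 2],
                       [4, 4, 2, 2]])
print(small)  # [[0.0, 8.0], [4.0, 2.0]]
```

A studio encoder can spend hours deciding how to preserve detail across that same reduction; a real-time scaler just averages (or interpolates) and moves on.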
The reason it looks sharper is an optical illusion. Your eyes have special cells that sense edges (borders between light and dark areas). They trigger when they see a border, and your brain gets excited that it's seeing an edge. That's how unsharp masking works: it actually destroys information in the image, but exaggerating the borders between light and dark makes your brain think the image is sharper and thus better looking.
Downsampling a 4K video to 1080p with a simple scaler results in a slight false-sharpening effect. The image quality is actually slightly worse than 4K video downsampled to 1080p properly with a good encoder, but because of the false sharpening, the edge detectors in your eyes get your brain all excited into thinking it's seeing a better image. If you applied a slight unsharp mask to the studio-produced 1080p video, it would look slightly better than the 4K video downscaled to 1080p in real time. (This is why your TV has a sharpness setting. Video is transmitted in a format intended to convey the most information, which ends up looking slightly blurry, so the TV adds a bit of unsharp masking to make the picture "look good" even though it destroys a bit of info in the process.)
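Unsharp masking itself is simple to sketch: blur a copy of the signal, then add back a scaled difference between the original and the blur. The overshoot this creates on either side of an edge is exactly the exaggerated border your eyes respond to. Here's a toy 1-D version (the 3-tap box blur and the 0.5 amount are arbitrary choices, just for illustration):

```python
def unsharp_mask_1d(signal, amount=0.5):
    """Sharpen by adding back the difference between the signal and
    a blurred copy: out[i] = s[i] + amount * (s[i] - blur(s)[i]).
    A simple 3-tap box blur stands in for the Gaussian a TV would use.
    """
    n = len(signal)
    blurred = []
    for i in range(n):
        lo, hi = max(0, i - 1), min(n - 1, i + 1)
        window = signal[lo:hi + 1]
        blurred.append(sum(window) / len(window))
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A hard edge: the mask overshoots on both sides of the transition,
# dipping below 0 and rising above 10. That exaggerated border is
# what reads as "sharper", even though no detail was added.
edge = [0, 0, 0, 10, 10, 10]
sharpened = unsharp_mask_1d(edge)
```

Note that the flat regions away from the edge are untouched; all the "sharpening" is overshoot concentrated at the border, which is why it excites your edge detectors without actually adding information.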
Of course, if your video player doesn't let you add a bit of unsharp masking, you may in fact prefer the downscaled 4K video over a properly encoded 1080p version.
YouTube does it to save bandwidth. Last I checked, somewhere around 100 million hours of video are viewed on the site every day. They need a huge amount of server resources to stream all that, so they cut down on video quality to keep the amount of data they have to transfer manageable. As a mostly ad-supported service, it might not be profitable for them to raise the bitrate substantially for everyone by default. I don't believe the bitrate is much better on other streaming services like Netflix, either.
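Back-of-the-envelope, using that 100-million-hours figure and an assumed 5 Mbit/s average stream (my guess, just to show the scale):

```python
# Illustrative numbers only: 100 million hours watched per day (figure
# mentioned above) at an assumed average bitrate of 5 Mbit/s.
hours_per_day = 100_000_000
avg_mbit_per_s = 5

seconds_streamed = hours_per_day * 3600
total_mbit = avg_mbit_per_s * seconds_streamed
petabytes_per_day = total_mbit / 8 / 1e9  # Mbit -> megabytes -> petabytes

print(f"{petabytes_per_day:.0f} PB per day")  # 225 PB per day
```

At that scale, even a modest bump in average bitrate means tens of extra petabytes of transfer every day, which is why the compression is as aggressive as it is.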
Plus, keeping the bandwidth requirements lower is good for many viewers too. Those with slower Internet connections might not be able to stream video smoothly otherwise. And some people have data caps on their internet, where streaming video at a higher quality could cause them to hit that limit quickly.
As for streaming video at a higher-than-native resolution on phones, I'm not so sure. I think that might be limited in mobile web browsers, though some apps may allow it. The screen of a phone is a lot smaller, though, so the blurriness may not be as noticeable there.
Well yeah, that's why I mainly focused on streaming services and added that you won't see much of an improvement with something like a Blu-ray, where compression artifacts shouldn't be much of a problem. : P