I am running a Windows 7 based PC for video and photo editing. Among other things it has an Nvidia GTX 770 GPU to speed things up, and it is plugged into my 4K Samsung TV via an HDMI cable. I sort of assumed that I would get a proper 4K picture and video signal sent to the telly, and that I would be looking at all the data from the original image and video files.
Then I read that, even with a fancy GPU, Windows 7 still can't handle anything bigger than 1080p. So my guess is that my GPU is fed the 4K image data and re-samples it down to 1080p, that 1080p signal is passed along the HDMI cable to the telly, and the telly then re-samples it back up, guessing at the missing pixels, to fill its 4K screen. In the GPU control panel the telly is set to mirror what's on my 1080p PC monitor.
This of course gives an image of lower quality than the original. If I load the same videos or still images onto a USB flash drive and plug it into a USB port on the telly, the improvement in quality is obvious even to an untrained eye, since nothing is being thrown away and then guessed at to rebuild the full-screen image.
So do I understand this correctly, and is it Windows 7 that is preventing my PC from sending a full 4K signal to the telly? Or is there more to it than that, and is there some hidden setting on my GPU I can activate to get round this problem?
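In case it helps, here is a little Python sketch I could run on the PC to test the 1080p theory. It just calls the documented Win32 functions EnumDisplayDevicesW and EnumDisplaySettingsW through ctypes to print the resolution Windows is actually driving each output at, and whether a 3840x2160 mode is even listed for it. Treat it as a rough check I put together rather than anything definitive about my setup.

```python
# Rough check: what resolution is Windows actually driving each display at,
# and does the driver list a 3840x2160 mode at all? Windows only, Python 3.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32
ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently in use

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Flattened to the display variant of the union, which is what
    # EnumDisplaySettingsW fills in for monitors.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPosition", ctypes.c_long * 2),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

def modes(device_name):
    """Yield every display mode Windows lists for the given adapter output."""
    i = 0
    while True:
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(DEVMODEW)
        if not user32.EnumDisplaySettingsW(device_name, i, ctypes.byref(dm)):
            return
        yield dm
        i += 1

i = 0
while True:
    dd = DISPLAY_DEVICEW()
    dd.cb = ctypes.sizeof(DISPLAY_DEVICEW)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
        break
    i += 1
    cur = DEVMODEW()
    cur.dmSize = ctypes.sizeof(DEVMODEW)
    if not user32.EnumDisplaySettingsW(dd.DeviceName, ENUM_CURRENT_SETTINGS,
                                       ctypes.byref(cur)):
        continue  # output exists but is not active right now
    has_4k = any(m.dmPelsWidth >= 3840 and m.dmPelsHeight >= 2160
                 for m in modes(dd.DeviceName))
    print(f"{dd.DeviceName} ({dd.DeviceString}): currently "
          f"{cur.dmPelsWidth}x{cur.dmPelsHeight} @ {cur.dmDisplayFrequency} Hz, "
          f"3840x2160 mode listed: {has_4k}")
```

If that shows the telly's output stuck at 1920x1080 with no 2160p mode listed, then the downscale-then-upscale theory above would seem to fit; if a 2160p mode is listed but not in use, presumably it's just a settings issue rather than a Windows 7 limit.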