I'm using an LG 43UK6300PLB TV as a display for my PC, which is using a GTX 1070 card via HDMI. Originally I was having trouble getting any sort of 4k display from the PC (kept getting "no signal" on the TV) which I fixed by upgrading to a newer 4k-certified HDMI cable.
I'm still finding, however, that various games and applications suddenly either crash the display to black on startup (which locks the machine) or go to 'No Signal' again. For example, most 3DMark benchmarks fail to execute: Time Spy fails on the first test, Fire Strike on Graphics Test 2. If I force 3DMark to run in 1080p in its options, the benchmarks complete correctly. Likewise, if I connect the PC to my BenQ 1080p monitor, everything works fine.
What seems to be happening is that Windows 10 is detecting the TV simply as a generic PnP display, and doesn't seem to know which resolutions and refresh rates it can and can't handle. When selecting 1080p, for example, it often tries to set a 120 Hz refresh rate rather than 60 Hz. In the advanced display settings, the 'hide modes that this monitor cannot display' check-box is cleared and greyed out, so it can't be ticked. Then when apps try to run full-screen, some of them request refresh rates the TV can't support and cause the problem. I've already talked to LG, who say there's no driver for the TV since it's not primarily a PC display.
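For background on where the mode list comes from: a display advertises its supported modes to Windows in its EDID block, and if that data is missing or misread, Windows falls back to generic guesses. As a rough illustration (not read from this TV; the sample bytes below are made up), here's how the 2-byte "standard timing" descriptors in an EDID 1.x block encode a resolution and refresh rate:

```python
# Decode EDID 1.x "standard timing" descriptors (bytes 38-53 of the EDID).
# Each descriptor is 2 bytes: horizontal resolution, aspect ratio, refresh rate.
# The sample descriptors below are illustrative, not captured from a real TV.

ASPECT = {0b00: (16, 10), 0b01: (4, 3), 0b10: (5, 4), 0b11: (16, 9)}

def decode_standard_timing(b0: int, b1: int):
    """Return (width, height, refresh_hz) for one 2-byte descriptor."""
    if b0 == 0x01 and b1 == 0x01:
        return None  # 0x0101 marks an unused descriptor slot
    width = (b0 + 31) * 8                    # byte 0 stores (width/8) - 31
    ar_w, ar_h = ASPECT[(b1 >> 6) & 0b11]    # top two bits: aspect ratio
    height = width * ar_h // ar_w
    refresh = (b1 & 0b111111) + 60           # low six bits: refresh - 60
    return width, height, refresh

# Hypothetical descriptors: 1920x1080 @ 60 Hz and 1280x1024 @ 75 Hz
for b0, b1 in [(0xD1, 0xC0), (0x81, 0x8F)]:
    print(decode_standard_timing(b0, b1))
```

Tools that read and edit the real EDID (rather than these made-up bytes) are how third-party utilities pin down what a display actually supports when Windows treats it as generic.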
Is there some way to force the generic driver to lock out the unavailable resolutions, or some other known fix for this problem? At the moment, running anything that doesn't let me explicitly specify a refresh rate is a bit of a lottery.