Question Can a graphics card kill a monitor?

Sep 26, 2021
I got a brand new Intel NUC 11 performance kit (NUC11PAHi7) and added RAM and a drive. When I plugged it into my Asus monitor (VS247) via HDMI, I saw the NUC logo but did not get into the boot menu in time. So I restarted it, and the monitor would no longer pick up the signal from the NUC's HDMI port. I plugged the monitor back into the laptop it had been connected to before, and it still would not pick up a signal. I tried the same cable with an HDMI to DVI adapter and it worked with the old laptop. Then I tried that with the NUC and it would not pick up a signal. Then I tried a new cable: still nothing. Finally I went back to the old laptop with the HDMI to DVI adapter, and now no signal there either!

I have an identical monitor and tried the same sequence, and that one is now dead too. Two of the three inputs on both monitors now show no signal when connected to either the NUC or the old PC (where they had worked for years).

The NUC has an HDMI 2.0 port, and I also tried a USB-C to dual HDMI splitter; nothing works.

I have not been able to reset the monitors (power off/on, "reset all", etc.).

Any idea whether the video adapter in the NUC could really cause two monitors, on two ports each, to fail? It is hard to believe, but four ports, three cables, two monitors, and two machines later, the results were pretty repeatable. I'm just not sure if I missed something, or if there is a way to confirm these results. Has anyone else ever experienced this or anything similar?

Thanks for reading...


The only way I can imagine a video output frying monitors would be what kerb wrote: an abnormally high voltage on the port's +5V power pin, which is normally only used to power the EDID EEPROM and its I2C (DDC) interface. In that case, the monitor may be repairable by cloning the EDID from a surviving port's chip onto replacements for the fried ones.
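If you want to check whether a port's EDID chip still responds before attempting a clone, on Linux the connector's EDID is typically exposed at `/sys/class/drm/<connector>/edid` (the exact connector name varies by GPU). A minimal sketch, assuming you have such a dump: the 128-byte base block starts with a fixed 8-byte header pattern and its bytes must sum to 0 modulo 256, so a dead or corrupted EEPROM usually fails one of those checks. The synthetic block below is just for demonstration.

```python
# Sketch: validate a 128-byte EDID base block before cloning it onto a
# replacement EEPROM. The sysfs path below is illustrative; connector
# names differ per machine (e.g. card0-HDMI-A-1, card1-DP-2).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_is_valid(block: bytes) -> bool:
    """A base EDID block is 128 bytes, begins with the fixed header
    pattern, and all 128 bytes sum to 0 modulo 256 (checksum byte 127)."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

# Build a minimal synthetic block: header plus zero padding, with the
# final byte chosen so the whole block sums to 0 mod 256.
block = bytearray(128)
block[:8] = EDID_HEADER
block[127] = (256 - sum(block) % 256) % 256

print(edid_is_valid(bytes(block)))  # True

# On a real system you would read the dump instead, e.g.:
#   data = open("/sys/class/drm/card0-HDMI-A-1/edid", "rb").read()
#   print(edid_is_valid(data[:128]))
```

An empty or all-0xFF dump from a port that used to work is a strong hint the EEPROM (or the 5V supplying it) is gone, which fits the fried-EDID theory above.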

The differential data lanes are AC-coupled, which lets the two ends use different termination bias voltages without destroying each other, so they should easily survive a temporary fault of 20 VDC or more.