Question: HDMI-to-VGA converted display shows static after booting and has to be re-plugged to return to its normal state?

Dec 19, 2022
Hi. I need help figuring out which hardware is faulty. I have an MSI mobo with a Ryzen 5 processor and no dedicated GPU, just the integrated graphics. The issue is that every time the PC boots to the desktop, the screen suddenly fills with static, and the desktop can barely be seen behind the noise on my monitor. The static is always a single color: sometimes yellow, sometimes green, and other colors as well, but it never shows two or more colors at the same time. I have to unplug the VGA cable attached to the HDMI-to-VGA adapter (or unplug the adapter itself) and plug it back in to return the display to its normal state.

The static issue doesn't occur at the boot-loading stage, where the mobo branding is displayed (in my case the "Pro Series" loading screen with the dark background); it only starts once the PC has booted to the desktop successfully. Also, sometimes the monitor detects no signal at all, which is likewise fixed by re-plugging the display cable into the PC. Unplugging the cable at the monitor end doesn't fix it; it has to be unplugged from the output of the adapter or the output of the PC.

I tried an adapter of a different brand, and surprisingly the issue then doesn't occur right after a successful boot but instead appears randomly at any time. I also tried swapping to a different monitor (one that also takes VGA as input), and the same problem still occurs.


Preview 1: https://ibb.co/hFPr57H
Preview 2: https://ibb.co/SdwmGMq
 
The BIOS runs at a standardized VESA setting. Booting into an OS triggers an attempt to read the monitor's plug-and-play description (EDID data) in order to configure the display. This happens over the DDC wire of the HDMI connection. When you cut that wire by using a VGA adapter, you lose automatic plug-and-play configuration and must set everything blindly. Some people will tell you that VGA had EDID added, but if it exists, your monitor will also have HDMI, DisplayPort, or DVI-D (which carry that wire). The part many people leave out is that the original EDID protocol has long since been superseded by EDID2. In theory they are backwards compatible, but from what I can tell, modern graphics cards (or their drivers) do not actually support the original EDID standard. If they did, your operating system would have been able to configure itself automatically with settings that work.
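To make that EDID handshake concrete, here is a minimal sketch (Python, not from the original post) of what a driver does with the 128-byte EDID base block it reads over DDC: check the header magic, verify the checksum, and decode the monitor's preferred mode. The sysfs path is a Linux example and the connector name is hypothetical; when an adapter cuts the DDC wire, that file is typically empty, which is exactly the "no plug-and-play data" failure described above.

```python
# Minimal sketch, assuming a raw 128-byte EDID base block has been dumped
# somewhere readable (on Linux, often /sys/class/drm/<connector>/edid).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_edid(block: bytes) -> dict:
    """Validate an EDID base block and decode the preferred video mode."""
    if len(block) < 128:
        raise ValueError("no/short EDID -- DDC line probably not connected")
    if block[:8] != EDID_HEADER:
        raise ValueError("bad EDID header magic")
    if sum(block[:128]) % 256 != 0:
        raise ValueError("EDID checksum failed (corrupted transfer?)")

    # The first 18-byte detailed timing descriptor (bytes 54..71) normally
    # describes the monitor's preferred mode.
    d = block[54:72]
    pixel_clock_hz = int.from_bytes(d[0:2], "little") * 10_000
    h_active = d[2] | ((d[4] & 0xF0) << 4)   # low 8 bits + high 4 bits
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    refresh = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return {
        "edid_version": f"{block[18]}.{block[19]}",
        "preferred_mode": f"{h_active}x{v_active} @ {refresh:.1f} Hz",
    }

if __name__ == "__main__":
    # Hypothetical connector name; adjust for your system.
    with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
        print(parse_edid(f.read()))
```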

The old "driver disk" you would get with a VGA monitor was not really a driver. What it contained was a standardized database the driver could use to tell what settings are valid. Without that you can manually and blindly set up for using that setting. Usually that means first setting up on an HDMI monitor for some lower resolution setting, rebooting with the VGA monitor (knowing that standard setting can be used), and then manually setting the desired resolution and scan speed interactively.

VGA adapters basically suck. Don't use them if you don't have to. If you do require one, first set the lowest resolution you can run at while still being able to read the settings screens, then reboot with the VGA monitor attached.
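On Windows, that "drop to a safe mode first" step can be scripted. Here is an untested sketch using the third-party pywin32 package (my own illustration, not something from this thread); it switches the primary display to 1024x768 @ 60 Hz, a conservative mode most VGA monitors accept, before you reboot with the adapter.

```python
import win32api
import win32con

def set_safe_mode(width: int = 1024, height: int = 768, refresh: int = 60) -> None:
    """Switch the primary display to a conservative mode before rebooting
    with the VGA adapter attached."""
    # Start from the current DEVMODE and override only the geometry fields.
    dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
    dm.PelsWidth = width
    dm.PelsHeight = height
    dm.DisplayFrequency = refresh
    dm.Fields = (win32con.DM_PELSWIDTH
                 | win32con.DM_PELSHEIGHT
                 | win32con.DM_DISPLAYFREQUENCY)
    result = win32api.ChangeDisplaySettings(dm, 0)
    if result != win32con.DISP_CHANGE_SUCCESSFUL:
        raise RuntimeError(f"mode change rejected (code {result})")

set_safe_mode()
```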
 
Okay, that makes a lot of sense, thank you so much for the explanation :). Is it safe to say that my CPU and mobo aren't defective?

I couldn't say. Odds are your problems come from the VGA adapter preventing the driver from configuring automatically. There could still be another problem, but I doubt it. If you are able to use an HDMI- or DisplayPort-based monitor and it works, then figure the problem is the fault of the adapter. If HDMI or DisplayPort monitors also fail, then there is some other problem. It's about 99% likely the adapter-plus-VGA-monitor combination is at fault, but that isn't guaranteed to be the only problem.