[SOLVED] Monitor doesn't work without HDMI

Jun 25, 2019
So, my motherboard recently died, so I bought a new one. My dedicated GPU wouldn't work after that, or so I thought. I got a new monitor and kept running on my integrated GPU. I decided to try plugging my dedicated GPU back in, and everything was OK. Since the monitor is 144Hz, I unplugged my HDMI cable and plugged in a VGA cable with an HDMI adapter on top. I didn't get any picture. I tried again, this time with a DVI adapter, and it didn't work either. When I tried my adapters on the integrated GPU, everything worked just fine. But my dedicated GPU only works with the HDMI cable, and since it's an ordinary HDMI, I can't get the 144Hz refresh rate. Has anybody had the same problem? I tried everything: replaced the VGA cable, tried both adapters (VGA to DVI and VGA to HDMI), but it only works over plain HDMI.
 
Solution
Which GPU are you using? What monitor? What adapters?

Do you get 120Hz from the HDMI?

From the info available:
  • You have a monitor capable of 144Hz that has HDMI and DVI?
  • You have a GPU with VGA and HDMI?
So, your GPU has (at best) HDMI 1.4, which in 99% of implementations means 144Hz is not achievable. 120Hz should be possible, assuming the monitor is designed with HDMI in mind.
In all likelihood, DVI-D is what you'd need to use to achieve 144Hz.
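
To put rough numbers on that, here's a back-of-the-envelope sketch. The ~10% blanking overhead and the ~300 MHz "typical port" cap are my assumptions; 340 MHz for HDMI 1.4 and 2 x 165 MHz for dual-link DVI-D are the standard limits:

```python
# Approximate pixel clock needed for 1080p at each refresh rate,
# compared against common interface limits. Blanking overhead (~10%)
# and the "typical HDMI 1.4 port" cap (~300 MHz) are assumptions.
LIMITS_MHZ = {
    "HDMI 1.4 (spec max)": 340.0,
    "HDMI 1.4 (typical port)": 300.0,
    "dual-link DVI-D": 330.0,
}

def pixel_clock_mhz(width, height, refresh_hz, blanking=0.10):
    """Approximate pixel clock: active pixels x refresh, plus blanking."""
    return width * height * refresh_hz * (1 + blanking) / 1e6

for hz in (60, 120, 144):
    clk = pixel_clock_mhz(1920, 1080, hz)
    verdicts = ", ".join(f"{name}: {'ok' if clk <= lim else 'NO'}"
                         for name, lim in LIMITS_MHZ.items())
    print(f"1080p@{hz}Hz needs ~{clk:.0f} MHz -> {verdicts}")
```

With tight reduced-blanking timings, 1080p 144Hz can just squeeze under the 340 MHz spec ceiling, which is why the odd monitor does it over HDMI 1.4 - but most ports of that era top out lower, so dual-link DVI-D is the safe route.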

As for your adapters, they only (typically) work in one direction. A DVI-VGA adapter is not, by default, a VGA-DVI.

So, do you have a VGA to DVI, or DVI to VGA adapter? If it's merely passive (which it likely is), then it's DVI-I to VGA. DVI-I carries analog signals, so can be adapted to VGA. Not as simple going the other direction.
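
As a toy illustration of that one-way rule (the connector names are standard; the yes/no table is just my simplification of the reasoning above):

```python
# Passive DVI/HDMI-to-VGA plugs only rewire pins; they work only when
# the source connector itself carries an analog signal. This table is
# a simplification of the explanation above, not an exhaustive spec.
CARRIES_ANALOG = {
    "VGA": True,      # analog by definition
    "DVI-A": True,    # analog-only DVI
    "DVI-I": True,    # integrated: digital + analog pins
    "DVI-D": False,   # digital-only DVI
    "HDMI": False,    # digital-only
}

def passive_vga_adapter_works(source: str) -> bool:
    """A passive VGA adapter needs analog pins on the source side."""
    return CARRIES_ANALOG[source]

for conn in ("DVI-I", "DVI-D", "HDMI"):
    verdict = "works" if passive_vga_adapter_works(conn) else "needs an active converter"
    print(f"{conn} source -> VGA monitor: {verdict}")
```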

VGA to HDMI adapters don't exist AFAIK. HDMI to VGA does, but they need to be "active", actively converting the signal. VGA is analog, HDMI is digital. It's not simply a matter of modifying the physical connections, you need to convert the signal. Those converters are not cheap.
 
I'm using a 1050 Ti GPU.

The monitor is not the problem. I tried with 3 others: a BenQ, an Asus, and some old ViewSonic.
The result is the same. My GPU won't work when I put on the converters. They are both separate: I have one VGA/HDMI and one VGA/DVI, and I get a black screen when I turn on my monitor. It only works with an ordinary HDMI cable. I tried on my integrated GPU, and the converters work just fine since I get a picture. I'm confused about what is happening, because it seems that my GPU port is obviously working and my converters are working as well, but I still don't get a picture when I combine them.
 
What adapters are you using? Unless the signal is being converted (an "active" adapter), there is no analog signal present on a 10xx GPU.

The iGP still retains analog via DVI, I believe - so that makes sense.
But if you're converting the iGP via HDMI and it works too... then that is very, very strange.
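
If you want to double-check what modes the card is actually offering over a given cable, here's a small Windows-only sketch. EnumDisplaySettingsW is the real user32 API; the truncated DEVMODEW layout is the usual ctypes recipe, so treat this as a sketch rather than gospel:

```python
import ctypes
from ctypes import wintypes

# Truncated DEVMODEW: enough fields to reach dmDisplayFrequency.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

# Enumerate every graphics mode of the current display device and
# print each unique resolution/refresh combination once.
seen, i = set(), 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    key = (mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency)
    if key not in seen:
        seen.add(key)
        print(f"{key[0]}x{key[1]} @ {key[2]}Hz")
    i += 1
```

If 144Hz (or even 120Hz) never shows up in that list while a given connection is in use, the GPU simply isn't offering the mode over it.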