[SOLVED] DVI-D to VGA adapter not working on GPU

Aug 8, 2020
Hey everyone. Please forgive my grammar; English isn't my native tongue.

I recently bought an RX 580 GPU for my Ryzen 3 2200G, which has integrated graphics. I connected the display to the dedicated GPU with a VGA cable and a DVI-D to VGA adapter, but I'm not getting any signal on my monitor. So I tried enabling the onboard graphics in the BIOS, with the same cable still connected to the DVI-D port of the GPU, and it works, which confuses me.

So connecting via DVI-D to the dedicated graphics card works while onboard graphics is enabled in the BIOS, but when I turn that off, I get the "no signal" error again.

Does anyone know why this is, and how I can make it work using my dedicated GPU?

Hoping anyone can help me. Thank you!
 

kanewolf

Titan
Moderator
Please post a link to the adapter you are using and the motherboard you have.
 

DSzymborski

Titan
Moderator

It works when you connect to the motherboard because your CPU has integrated graphics. When the monitor is connected to the motherboard, you're not using your RX 580 at all.

DVI-D is digital only. DVI-I also carries an analog signal, but that has long since been phased out of general use. VGA is an analog signal, which means the adapter has to actively convert the DVI-D digital signal to analog VGA. In other words, your adapter has to be an active one; a passive one is not enough. And unfortunately, a lot of cheap junk calls itself active without actually being active.
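The rule above can be sketched in a few lines of Python. This is just an illustration of the logic, not anything from the thread; the port names and the analog/digital grouping are my own simplification:

```python
# Which source ports can drive a VGA monitor with a cheap passive adapter,
# and which need an active (powered) digital-to-analog converter.

ANALOG_CAPABLE = {"VGA", "DVI-I", "DVI-A"}      # connector carries an analog signal
DIGITAL_ONLY = {"DVI-D", "HDMI", "DisplayPort"}  # connector carries no analog signal

def adapter_needed(source_port: str) -> str:
    """Return the kind of adapter needed to drive a VGA monitor."""
    if source_port in ANALOG_CAPABLE:
        # Analog pins are already present: a passive pin re-wiring is enough.
        return "passive"
    if source_port in DIGITAL_ONLY:
        # No analog signal on the connector: the adapter itself must
        # actively convert digital to analog, and needs power to do it.
        return "active"
    raise ValueError(f"unknown port: {source_port}")

print(adapter_needed("DVI-I"))  # passive
print(adapter_needed("DVI-D"))  # active
```

This is why the same "DVI to VGA" plug can be a dumb wire in one case and has to contain a converter chip in the other.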
 

DSzymborski

Titan
Moderator
Hey, thanks for replying!

Link to Adapter:
https://www.lazada.com.ph/products/...ll&laz_token=0fa0e7c693851d937a98f164b08b6a1c

I'm using a Gigabyte A320M-S2H.

At that price ($4 US), it's unlikely to actually be an active adapter that works.

Something like this is more likely to be legitimate, though it's hard to say for sure as we have none of these brands in the United States. Hopefully someone with experience finding an active adapter sold in the Philippines will chime in at some point.

https://www.lazada.com.ph/products/...4l.searchlist.list.49.1b593ce1L6ebjW&search=1
 

kanewolf

Titan
Moderator
What I don't see on that adapter is a separate power input. To convert from digital to analog, you need an active (powered) adapter. The DVI spec does have a +5 VDC pin -- https://en.wikipedia.org/wiki/Digital_Visual_Interface#/media/File:DVI_pinout.svg
But the graphics card may not supply sufficient power through it for the adapter. Quality adapters have a separate USB power input, like the one @DSzymborski linked.
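As a back-of-envelope sketch of the power argument: the DVI 1.0 spec only guarantees a small current budget on the connector's +5 V pin (roughly 55 mA, by my reading of the spec), while an active DVI-D-to-VGA converter chip may draw more. The draw figure used below is hypothetical:

```python
# Compare a hypothetical active adapter's current draw against what the
# DVI +5 V pin is guaranteed to supply. Numbers are assumptions for
# illustration, not measurements of any specific adapter.

DVI_5V_GUARANTEED_MA = 55  # minimum the source must supply per DVI 1.0 (my reading)

def needs_external_power(adapter_draw_ma: int) -> bool:
    """True if the adapter's draw exceeds what the DVI +5 V pin guarantees."""
    return adapter_draw_ma > DVI_5V_GUARANTEED_MA

print(needs_external_power(150))  # True: a 150 mA converter would need USB power
print(needs_external_power(40))   # False: within the bus-power budget
```

A motherboard may happily supply more than the guaranteed minimum while a particular GPU does not, which would match the symptoms described here.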
 


I don't mean to sound like I'm countering your explanation, but I just want to know why it works when I'm using onboard graphics through the same GPU port. Is it as DSzymborski said, that there has to be an active conversion process for it to work? So is my motherboard doing that process while the GPU can't? I just want to understand, thank you!
 

kanewolf

Titan
Moderator
What I am saying is that the motherboard may be providing ENOUGH +5 VDC on those pins to power the converter. The GPU may not be.
 

So that's why. Thanks for the extra bit of knowledge about this. I guess it's time to buy a different adapter then! Thank you guys for your time! Cheers!
 

DSzymborski

Titan
Moderator

What kanewolf said.

Someone using integrated graphics is more likely to be using an ancient VGA monitor than someone with an RX 580, so the motherboard manufacturer has a lot more motivation to make sure it works.

Of course, there could just be something wrong with the GPU itself, but you won't know until you test with an adapter that's guaranteed to work, and you don't have that yet.
 
Solution