VGA Monitor with 1050 Ti?

LostEnggSoul

Prominent
Jun 24, 2017
So I built a Ryzen 5 system yesterday with a 1050 Ti and just realized that my monitor is VGA-only.

Will an HDMI-to-VGA converter work? Will there be any problems I'll experience using it this way?
 

Kenton82

Reputable
Power can be drawn from the port itself, so there's no need for a USB power cable.

https://www.monoprice.com/product?p_id=5135

It's not in stock for a few weeks, but it's a good example. There seems to be a huge quality difference between these adapters, with many simply not working.

*Be very careful that it's:
a) the right type, and
b) good quality (check the customer feedback)

You can probably find active adapters for HDMI, DP, and DVI.

The cheaper ones support up to 1920x1200 @ 60 Hz. The higher-resolution ones are so expensive you might as well just buy a different monitor.
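To put a rough number on that limit: the adapter's DAC has a maximum pixel clock, and cheap converter chips are typically in the same ~165 MHz class as single-link DVI (that figure is a typical spec, not something from this thread). A quick back-of-the-envelope estimate of pixel clocks shows why 1920x1200 @ 60 Hz is about where they top out; the blanking overheads below are rough reduced-blanking-style guesses, not an exact timing calculation.

    # Rough pixel-clock estimate (Python): active pixels plus approximate
    # blanking overhead, times refresh rate. Blanking values are ballpark.
    def approx_pixel_clock_mhz(h_active, v_active, refresh_hz,
                               h_blank=160, v_blank=35):
        return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

    for w, h in [(1920, 1080), (1920, 1200), (2560, 1440)]:
        mhz = approx_pixel_clock_mhz(w, h, 60)
        # assume a ~165 MHz DAC, typical of budget single-link-class converters
        verdict = "fits" if mhz <= 165 else "exceeds"
        print(f"{w}x{h}@60Hz ~ {mhz:.0f} MHz -> {verdict} a ~165 MHz DAC")

Running this gives roughly 139 MHz for 1920x1080, 154 MHz for 1920x1200, and 241 MHz for 2560x1440, which is why anything above 1920x1200 needs a much faster (and pricier) DAC.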

I'm looking for HDMI and DVI-D adapters and will post them below.
 
Just FYI:
In the past we had VGA; then DVI came along with flat-panel displays, which are driven digitally rather than by the analog signals that adjusted beam strength and position on CRT monitors.

When we began the transition, cards had both VGA and DVI-D as separate outputs.

Then it became convenient to put BOTH SETS of pins on the same connector. This was called DVI-I (integrated digital and analog). Since you can only connect via DVI or VGA at one time, and the VGA signal tapped off DVI shows the same image as the DVI output (just converted to analog), there's no point in using both at once, so DVI-I takes less space and avoids the CONFUSION.

(The alternative was to terminate the DVI signal so there's no output on it and use a separate DVI path with no DAC tapped for VGA, but that also adds to the complexity, and thus cost, of the card.)

The analog VGA signal was always created by a DAC (Digital-to-Analog Converter), whereas the digital signals don't need a DAC; they come straight from the GPU.

When you get a (passive) "VGA adapter", it connects only to the VGA pins coming from the DAC.
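If it helps to picture what the DAC actually does, here is a minimal sketch of the per-channel conversion, assuming the typical VGA signal level of about 0.7 V at full brightness and an 8-bit value per color channel (both are common figures, not specifics from this thread):

    # Sketch: a VGA DAC turns an 8-bit digital color value (0-255) into an
    # analog voltage on the VGA pin, with full white at roughly 0.7 V.
    def dac_voltage(value_8bit, full_scale_v=0.7):
        return (value_8bit / 255) * full_scale_v

    print(dac_voltage(0))    # 0.0 V   -> black
    print(dac_voltage(128))  # ~0.35 V -> mid grey
    print(dac_voltage(255))  # 0.7 V   -> full brightness

A passive adapter just routes those already-analog voltages to the VGA connector; an active adapter has to contain a DAC like this itself, because newer cards such as the 1050 Ti no longer include one.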

Now, very few monitors have ONLY a VGA input, so expect the cards that support VGA to disappear. Active adapters with DACs will hang around for a few years, then probably disappear too.
 
Solution

Kenton82

Reputable


Very informative! :vip: