Question: Old monitor doesn't have DVI-D support, or does it?

Jun 5, 2025
Sorry but I need quick help :/

So basically I have a ViewSonic VE500 monitor that unfortunately only supports VGA.
My only graphics card since I laid hands on my PC has been a Quadro K600, which fortunately has
DVI-I support, so I just had to use a converter for my VGA cable to connect to the GPU.

Now I have ordered an RX 550 GPU. It's old, I know. Anyway, it seems the RX 550 only supports HDMI and DVI-D. So what are my best options?

Should I cancel my order, or should I just get a converter for my monitor and hope it will be able to handle analog signals through the VGA converter?

Or should I get either a VGA-to-DVI or a DVI-to-VGA adapter?

Please help 🙁
 
Monitor: https://www.amazon.ca/ViewSonic-VE500-2-Monitor-Silver-Black/dp/B0000BVUYZ
GPU:
https://techarc.pk/sapphire-pulse-r...m8Zj3s5Yh5eVA1fZU3mwuZVySjSB5XPECy1W4T3Dgai5e

The converter I used to use:
[Image: DVI to VGA converter]


And I also wonder: would my monitor be able to understand analog signals at all, even if I use an HDMI cable and connect it to the monitor through a VGA converter at the monitor's end?
Or should I just get this, and would it work?
[Image: adapter]

Thanks <3
 
Your best option is to buy a new monitor, as even an RX 550 would be wasted on 1024x768.

Any money spent on an active DVI-D to VGA converter would be much better put toward a more modern monitor. Those things don't exactly have good image quality or reliability, and more modern GPUs don't even have DVI, so the converter wouldn't be useful in the future either.
 
Yes, but monitors are really pricey here, and I don't mind the 1024x768 display; it's really crystal clear. As for the converters, there are many available here for under a dollar. Are they fake?
These are under a dollar:
[Image: cheap adapters]
 
If they don't have a power input (which can be supplied with a USB cable or a power brick), then they are probably fake.

Pin 14 of DVI can supply only 55 mA at 5 V, which is just over a quarter watt. Unless they put the world's most efficient active conversion circuitry in there to sell for a dollar, or try to pull more than the rated power out of the card (which could damage it, as this power is only intended for reading the monitor's EDID), it is very unlikely to work.
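The power budget being argued here is simple arithmetic; a quick sketch (using the nominal spec figures quoted above) shows why an unpowered DVI adapter is so starved compared to a DisplayPort one:

```python
# Rough power-budget sketch using the nominal figures quoted above.
DVI_VOLTAGE_V = 5.0          # DVI pin 14 supplies 5 V
DVI_CURRENT_A = 0.055        # at up to 55 mA per the spec

dvi_power_w = DVI_VOLTAGE_V * DVI_CURRENT_A
print(f"DVI aux power: {dvi_power_w:.3f} W")          # ~0.275 W, just over 1/4 W

DP_POWER_W = 1.65            # DisplayPort DP_PWR pin: up to 1.65 W
print(f"DisplayPort supplies ~{DP_POWER_W / dvi_power_w:.1f}x as much")
```

That quarter watt has to run the whole digital-to-analog conversion circuit, which is why the adapters that actually work add an external power jack.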

Back when video cards sold for only $300, nVidia cards had terrible analog image quality, ATI was OK, and Matrox was excellent. And that was the analog circuitry in cards that sold for hundreds of dollars. How good can the image quality be on something designed to be built for 50 cents so it can be sold for a dollar?
 
That man tested the one most likely to work without a power supply, because, per the spec, DisplayPort can supply up to 1.65 W, which is 6x as much as DVI can. Many brands of adapters do have a power jack on the side of the box opposite the VGA port and will tell you to order a cable or power supply if it doesn't work, but those cost more than a dollar.

The 750 Ti is faster than the RX 550 and uses 10 W more. Be aware, though, that some overclocked models do require a 6-pin PCIe power connector.
 
So can't there be adapters that work without a power supply, like the one that man has? If there is no need for a power supply, why are there ones requiring it?
Also, the thing I wonder is whether my monitor understands analog signals at all, even though it only has a single VGA plug.
 
The good adapters include a power jack in case running without one doesn't work. The cheap adapters may well sometimes work by pulling more power from the card than the spec allows, and if one doesn't work on your particular card, well, you're only out a dollar, while they saved 10 cents by not putting in the jack and still have your money.

VGA is analog-only, but the LCD panel inside is digital-only. This means your old monitor has pretty much the opposite of one of these adapters inside of it, though a high-quality one, as it has lasted 22 years so far.
 
It is a digital display, but can only communicate with the outside world using an analog converter wired to the VGA port. At the time your monitor was new, the only monitors that had DVI were premium models such as Apple's Cinema Display, or CRTs such as IBM's P260 (which were of course inherently analog so used DVI-A). So it made sense for low-end LCDs to follow the standard of the time which was analog VGA.

Similarly, all sound cards are digital but the output standard has long been analog headphone jacks, with digital S/PDIF outputs through TOSLINK or digital coaxial being a rare feature found mostly in higher-end sound cards intended for connection to a stereo receiver that can decode those. If you don't have digital ports, then you can't get digital sound out of it.

HDMI didn't come out until 2003, and DisplayPort not until 2006, so the standard digital port of the time was DVI-I/DVI-D, and it remained so for some time afterwards. What finally killed DVI was 4K resolution becoming common, as not even dual-link could support that at 60 Hz.
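The 4K claim above checks out numerically. A sketch, using approximate reduced-blanking totals for 3840x2160 (the exact blanking figures are an assumption, not from this thread), shows the required pixel clock overshooting the dual-link DVI limit:

```python
# Why dual-link DVI can't carry 4K at 60 Hz.
# Total frame size includes blanking; ~4000x2222 is an assumed
# reduced-blanking total for a 3840x2160 active area.
h_total, v_total = 4000, 2222
refresh_hz = 60

required_pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
DUAL_LINK_DVI_LIMIT_MHZ = 330   # 2 links x 165 MHz per the DVI spec

print(f"4K60 needs roughly {required_pixel_clock_mhz:.0f} MHz pixel clock")
print(f"dual-link DVI tops out at {DUAL_LINK_DVI_LIMIT_MHZ} MHz")
```

Roughly 533 MHz needed versus a 330 MHz ceiling, so even dual-link DVI falls well short of 4K60.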
 
Thank you - I just canceled my order. Hopefully I'll get a GTX 750 Ti for my monitor instead <3