Question: VGA vs DVI-1 VGA

This is probably a silly question, but I am wondering if there is any difference between using a dedicated VGA graphics card vs a DVI-1 video card with a passive VGA adapter. IOW, is a DVI-1 card's VGA mode less robust than a similar graphics card with only a VGA plug?
 
Thanks for the response. In the same vein, I am currently stuck with an analog monitor. I've bought a new (to me) video card that has a DVI-1 and DP plug. I've read that an active adapter might give a better image than a passive adapter. Is there any truth to this, and if so, which socket would give the best image with an active adapter? Thanks.
 
My computer has onboard Intel HD Graphics 2500. I was looking for a low-cost PCIe graphics card and saw, under the Dell name, a Dell NVIDIA GeForce GT 640 discrete card. I realize it's an old card, but they can be had used for $25, so I bought one on eBay. When I installed the card and drivers, I immediately noticed the text was not as sharp, and I spend most of my time staring at the text on my monitor. I did a little research and found that some NVIDIA graphics boards have a reputation for slightly fuzzy text. Now I know the monitor has a lot to do with the quality of the text, but I'm only comparing the sharpness of the text from the onboard graphics vs the NVIDIA. At this point I'm looking at another legacy card, the Dell AMD Radeon R5 240 1G with DVI-1 and DP plugs. It's a cheap card but has Windows 10 drivers (from AMD); I just hope the text will be as clear as the onboard graphics.
 
It's DVI-I (letter i), not DVI-1 (one).

The extra pins on the DVI-I port carry the exact same analog video signals as VGA. So with a passive VGA adapter it'll be equivalent to using VGA. The adapter just rewires the extra DVI-I pins to the correct pins on a VGA plug. The whole point of the connector was to save space on the end of the video card, by allowing manufacturers to omit the little-used VGA port while retaining the ability to output a VGA signal.
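
If it helps to picture it, here is a rough sketch of the remap a typical passive DVI-I-to-VGA adapter does. The pin numbers below come from the commonly published DVI-I and VGA (DE-15) pinouts; a given adapter may route grounds and DDC slightly differently, so treat this as an illustration rather than a wiring guide.

```python
# Illustration only: a passive DVI-I -> VGA adapter is just wiring,
# no signal conversion happens. The card's DAC already put analog
# RGB and sync onto the DVI-I connector's extra pins.
DVI_I_TO_VGA = {
    "C1 (analog red)":     "1 (red)",
    "C2 (analog green)":   "2 (green)",
    "C3 (analog blue)":    "3 (blue)",
    "C4 (analog h-sync)":  "13 (h-sync)",
    "C5 (analog ground)":  "6/7/8 (R/G/B returns)",
    "8  (analog v-sync)":  "14 (v-sync)",
    "6  (DDC clock)":      "15 (DDC clock)",
    "7  (DDC data)":       "12 (DDC data)",
}

for dvi_pin, vga_pin in DVI_I_TO_VGA.items():
    print(f"DVI-I pin {dvi_pin:20} -> VGA pin {vga_pin}")
```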

Welcome to the world of analog video. The quality of the DAC (digital to analog converter) in the video card could make a difference in the quality of the output image. It's not uncommon for a cheap card (or cable) to result in a degraded image. Blurry text, wavy image, bright/dark lines, etc.
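
If you want a feel for why a weak DAC or a bad cable blurs text in particular, here's a toy sketch (made-up numbers, not a real video model): sharp black-to-white edges need a lot of analog bandwidth, and anything in the chain that can't keep up behaves roughly like a low-pass filter, smearing each edge into its neighbors.

```python
import numpy as np

# One scanline of "text": alternating 2-pixel black/white strokes.
line = np.tile([0.0, 0.0, 1.0, 1.0], 8)

def low_bandwidth(signal, taps=3):
    """Crude stand-in for a bandwidth-limited DAC/cable: a moving average."""
    kernel = np.ones(taps) / taps
    return np.convolve(signal, kernel, mode="same")

print("ideal :", line[:8])                              # crisp 0s and 1s
print("cheap :", np.round(low_bandwidth(line)[:8], 2))  # edges dragged to grey
```

The greys in the second line are what you perceive as fuzzy edges around each character.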

First thing to try is the "auto-adjust" option on your monitor. This synchronizes the monitor's sampling clock and phase with the VGA signal's timing, so the left and right edges and the top and bottom edges line up, and each analog pixel matches up with a monitor pixel.
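
To see why that matters, here's another toy sketch (again just an illustration of the idea, not how the monitor actually implements it): the LCD takes one sample per pixel from the analog waveform, and if its sampling phase is off by half a pixel, each sample ends up averaging two adjacent analog pixels.

```python
import numpy as np

pixels = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0])  # crisp 1-pixel checker
OVERSAMPLE = 100                                    # pretend "analog" waveform
analog = np.repeat(pixels, OVERSAMPLE)

def sample(analog, phase_offset):
    """Take one value per pixel, offset by a fraction of a pixel width."""
    out = []
    for i in range(len(analog) // OVERSAMPLE):
        start = int((i + phase_offset) * OVERSAMPLE)
        window = analog[start:start + OVERSAMPLE]
        if len(window):
            out.append(window.mean())
    return np.round(out, 2)

print("aligned   :", sample(analog, 0.0))  # [0. 1. 0. 1. 0. 1.] - sharp
print("half pixel:", sample(analog, 0.5))  # mostly 0.5 - neighbors blend
```

Auto-adjust nudges that phase until each sample lands squarely on one analog pixel, which is why it can turn soft text sharp without changing anything else on screen.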

If that fails and the video card also has a digital output (DVI-D, HDMI, or DisplayPort), this can usually be remedied with an active VGA adapter. Those have their own DAC to convert the digital video signal into analog, allowing you to bypass the video card's DAC. It's hard to believe that a card manufacturer would skimp on a part that cost less than $1 even more than a decade ago, but I've run across a few cheap analog video cards with the problem you describe.
 
Wow, lots of good advice. I see that active VGA adapters do the digital-to-analog conversion from a DVI-D (24+1) output, which a DVI-I (24+5) card apparently already does on board. I would assume a DVI-I 24+5 card would do as good a conversion to VGA as an active 24+1-to-VGA adapter, so I guess I'm left with trial and error on video cards until I find one with reasonably sharp text. I did not see any difference in screen alignment with either card.
 
The conversion from the digital picture to an analog picture is done by a DAC. Your blurry text is because NVIDIA cheaped out and put a poor-quality DAC in the video card. An active VGA adapter will have its own DAC, and hopefully it's of better quality and will give you clearer text.

You're not going to notice any alignment differences by eye. The monitor's auto-adjust function is making alignment changes as small as a fraction of a pixel. If you don't use it, the signal for an analog pixel can "spill over" into adjacent monitor pixels, resulting in blurry text. Overall the image will look the same - you might see a tiny bit more desktop along some of the edges. But text and graphics will become much sharper. Especially if you've got ClearType on (Windows' subpixel rendering).