Just FYI:
In the past we had VGA; then DVI came along with flat-panel displays, which were driven digitally rather than by the analog signals that CRT monitors used to adjust beam strength and position.
When the transition began, cards carried separate VGA and DVI-D connectors.
Then it became convenient to put BOTH sets of pins on the same connector. This was called DVI-I (integrated digital and analog). It avoided confusion, because you can only connect via DVI or VGA at one time anyway: the VGA pins tap off the same signal as the DVI pins (just converted to analog), so you could never use both simultaneously. DVI-I takes less space and makes that limitation obvious.
(Alternatively, the card could terminate that DVI signal so it carries no output, and route a separate DVI path with no DAC tap to the VGA connector, but that adds complexity and thus cost to the card.)
The analog VGA signal was always created by a DAC (Digital-to-Analog Converter), whereas the digital signals need no DAC; they come straight from the GPU.
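To make the DAC's job concrete, here is a minimal sketch of what the video DAC does for one color channel: map an 8-bit digital code to an output voltage. It assumes the common 0.7 V peak-to-peak VGA video level; a real RAMDAC also applies a palette/gamma lookup and runs at the pixel clock, which this toy function ignores.

```python
# Sketch of a video DAC's per-channel job (hypothetical helper, not a real driver API).
# Assumption: 0.7 V full-scale analog swing, the common VGA video level.

VGA_FULL_SCALE_V = 0.7  # typical analog video swing, in volts

def dac_voltage(code: int, bits: int = 8) -> float:
    """Convert a digital color code (0..2^bits - 1) to its analog output voltage."""
    if not 0 <= code < (1 << bits):
        raise ValueError("code out of range")
    # Linear mapping: max code -> full scale, 0 -> 0 V
    return VGA_FULL_SCALE_V * code / ((1 << bits) - 1)

print(round(dac_voltage(128), 3))  # mid-gray on one channel, ~0.351 V
```

The digital DVI path skips this step entirely: the 8-bit codes are serialized and sent as-is, which is why no DAC sits on that path.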
When you get a passive "VGA adapter," it connects only to the VGA pins fed by the DAC.
Now, very few monitors have ONLY a VGA input, so expect cards that support VGA to disappear. Active adapters with their own DACs will hang around for a few years, then probably disappear too.