With overscan/underscan corrected in your video card drivers, most screens will look about as good as they are going to.
In the rare cases where there is quality loss, it is very noticeable and you would want to rip your eyes out.
My best guess is that you are fine with HDMI once it is adjusted.
Now onto the DVI -> HDMI question. It depends on the monitor and video card setup. I actually use a DVI -> HDMI cable because my video card's mini HDMI connector does not seem very sturdy. The card still sees the display as an HDMI device, so it offers the overscan options; a native DVI screen does not have, or need, those options.
The option only exists because most TVs overscan to crop any garbage from the edges of the video (which in itself should not be needed anymore, since digital streams do not seem to have that issue).
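To put rough numbers on it, here is a quick sketch of what that cropping amounts to and how far a driver's underscan/resize slider has to shrink the picture to compensate. The 5%-per-edge figure is an assumption for illustration; real TVs vary.

```python
# Rough illustration of TV overscan on a 1920x1080 signal.
# OVERSCAN_PER_EDGE is an assumed value; actual TVs vary.
WIDTH, HEIGHT = 1920, 1080
OVERSCAN_PER_EDGE = 0.05  # fraction of the picture cropped from each edge

visible_w = WIDTH * (1 - 2 * OVERSCAN_PER_EDGE)
visible_h = HEIGHT * (1 - 2 * OVERSCAN_PER_EDGE)
print(f"Picture actually shown: {visible_w:.0f} x {visible_h:.0f}")  # ~1728 x 972

# A driver's underscan/resize slider compensates by rendering the desktop
# smaller by the same factor, so that after the TV crops the edges the
# whole desktop is visible again (at the cost of some scaling blur).
scale = 1 - 2 * OVERSCAN_PER_EDGE
print(f"Compensating scale factor: {scale:.2f}")  # ~0.90
```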
With a quality cable and a good analog-to-digital converter (ADC) in the monitor, VGA can look very close to DVI/HDMI/DP, though text may still be slightly less sharp. Since part of the quality lies within the monitor, every screen can be a bit different.
My SyncMaster 245T is one of those screens that HDMI looks awful on (it assumes HDMI will only ever carry a TV signal and so applies heavy overscan [fine for TV, I guess] that 99% of computer screens do not). On that screen VGA looks MUCH better than HDMI, and DVI is only a tiny bit sharper than VGA on text. In games and videos it would be hard to see any difference; it is that close.
In the end, one of the biggest downsides to VGA is its total lack of copy protection (no HDCP). This means Blu-ray and some other high-definition content may not play at full resolution, or at all in some cases. Any video without that protection requirement will work fine.
Add that to being at the mercy of the screen's analog-to-digital converter and it is hard to recommend VGA these days (not that it is actually that bad on the right hardware).
EDIT: I see another post has been made.
hjj1746, VGA can support more than 1920 x 1080 (I think it maxes out around 2048 x 1536 [not bad for such an old standard], but you need one hell of a cable, card, and screen). HDMI needed later versions with a higher maximum clock rate to pass 1920 x 1200 without dropping the refresh rate or resorting to reduced-blanking timings. With 1080p being the "standard" (thanks for the naming convention, Hollywood; computer screens have been progressive since the 80s [or earlier], so the "p" is kind of redundant), I do not think it will matter too much in this case.
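As a rough sanity check on the clock-rate point, here is the arithmetic using the standard CVT and CVT-RB (reduced blanking) totals for 1920 x 1200 @ 60 Hz; treat the exact figures as approximate, since the nominal timings round the refresh rate slightly.

```python
# Pixel-clock check for 1920x1200 @ 60 Hz against the 165 MHz TMDS limit
# shared by single-link DVI and early (1.0-1.2) HDMI.
LIMIT_MHZ = 165.0

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (active + blanking) * refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

cvt    = pixel_clock_mhz(2592, 1245, 60)  # normal CVT blanking -> ~193.6 MHz
cvt_rb = pixel_clock_mhz(2080, 1235, 60)  # reduced blanking    -> ~154.1 MHz

print(f"CVT:    {cvt:.1f} MHz -> {'fits' if cvt <= LIMIT_MHZ else 'exceeds'} the 165 MHz limit")
print(f"CVT-RB: {cvt_rb:.1f} MHz -> {'fits' if cvt_rb <= LIMIT_MHZ else 'exceeds'} the 165 MHz limit")
```

So with normal blanking the mode blows past the old 165 MHz ceiling, which is why early HDMI gear either dropped the refresh rate or relied on reduced-blanking timings; later HDMI versions simply raised the clock limit.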
Still agree that digital is the way to go.