holdingholder :
DVI presents a cleaner image than VGA, but it does depend on other factors. VGA is susceptible to interference from other electronics and is an analog signal, which can degrade over longer cables. DVI is limited to 60 frames per second, but who can notice that? In my personal experience, DVI gives me a cleaner, more vibrant image.
More misinformation....
DVI is not limited to 60 Hz. DVI is not immune to interference either; it's just designed to resist it better. But once the interference gets too high, the monitor will drop the signal entirely, whereas VGA may still be usable (probably not for text).
One of the problems with VGA on fixed-pixel (LCD) monitors is that, within a scanline, the monitor's sampling may not stay aligned with the source pixels from the start of the line to the end. Some sampled pixels can end up as averages of two adjacent source pixels. This is not a problem for games/videos, but it is for text, because VGA carries only one synchronization signal per line, with no per-pixel clock.
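To make the averaging effect concrete, here's a rough sketch (toy math, not real signal processing): if the monitor samples the analog line with a phase offset, each sampled value becomes a linear blend of two neighboring source pixels, which smears a 1-pixel text pattern into uniform gray.

```python
def sample_line(pixels, offset):
    """Toy model: sample an 'analog' line at positions i + offset,
    linearly interpolating between adjacent source pixels."""
    out = []
    for i in range(len(pixels) - 1):
        out.append((1 - offset) * pixels[i] + offset * pixels[i + 1])
    return out

line = [0, 255, 0, 255, 0, 255]   # 1-px black/white pattern, like thin text
print(sample_line(line, 0.0))     # perfect alignment: pattern preserved
print(sample_line(line, 0.5))     # half-pixel offset: every sample ~127.5 (gray mush)
```

A game or movie frame rarely has single-pixel detail, so this blur goes unnoticed there; sharp-edged text is where it shows.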
But with DVI, the pixel values arrive as digital numbers, one per pixel, with every bit synced to a clock signal. So DVI gives a sharper image that depends less on the monitor's electronics.
Also, VGA signals share a common ground, while DVI signals travel on differential pairs, which again gives better interference protection.
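The differential-pair advantage is simple arithmetic: noise that couples equally onto both wires of the pair cancels when the receiver takes their difference, while a single-ended (VGA-style) signal measured against ground sees the noise directly. A toy illustration with made-up voltage numbers:

```python
signal = 1.0   # intended level (arbitrary units)
noise = 0.3    # common-mode interference coupled onto the cable

# Single-ended: the signal is measured against ground, so noise adds in.
single_ended = signal + noise

# Differential: the two wires carry opposite halves of the signal plus
# the same noise; subtracting one from the other cancels the noise term.
wire_p = +signal / 2 + noise
wire_n = -signal / 2 + noise
differential = wire_p - wire_n

print(single_ended)   # corrupted by noise
print(differential)   # noise cancelled, original signal recovered
```

Real receivers aren't perfectly balanced, so the cancellation isn't total, but this is why twisted differential pairs tolerate far more interference than a single wire over common ground.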
Update:
What cables cannot do is improve performance or rendering quality. The GPU outputs those two identically whether the connection is VGA, DVI, HDMI, or DisplayPort. VGA vs. digital affects only pixel quality (like I said: alignment of intended vs. sampled pixels).
PS: If you use a DVI-to-VGA cable, it's still a VGA signal being passed. You need DVI connectors on both devices to get a digital link. DVI is a larger connector, with 3 rows of pins over about 2/3 of the connector's length and a "+"-shaped contact with a few more pins on the remaining part.