IMO
The myth that DVI (digital) never gets converted to analog is just that, a myth. On an analog system, the RAMDAC (the chip that generates the video signal on the video card, or VC) has been integrated into the graphics controller chip for years now. Adding DVI means adding a DVI transmitter chip to the VC and a DVI receiver chip in the monitor.
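For what it's worth, here is a minimal Python sketch of the conversion step a RAMDAC performs: mapping an 8-bit digital color code to an analog drive voltage. The 0.7 V full-scale figure is the usual analog video swing; the names and the rest of the numbers are illustrative assumptions, not any particular chip's behavior.

# Illustrative sketch of the RAMDAC's job: turning a digital pixel value
# into an analog voltage level. Values are hypothetical, chosen only to
# show the digital-to-analog step described above.

VREF = 0.7  # full-scale analog video level in volts (typical for VGA)

def ramdac_channel(code: int, bits: int = 8) -> float:
    """Map one digital color code (0 .. 2**bits - 1) to an analog voltage."""
    full_scale = (1 << bits) - 1
    return VREF * code / full_scale

# A mid-gray pixel: the same 8-bit code on all three color channels.
for channel, code in (("red", 128), ("green", 128), ("blue", 128)):
    print(f"{channel}: code {code} -> {ramdac_channel(code):.4f} V")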
To transmit the digital data from the VC in its native parallel form, the graphics chip would need digital outputs and the video cable would need a separate wire for each bit. Counting 24 color bits plus clock and sync signals, the cable would need more than 27 wires; you can imagine how thick it would be. Instead, DVI converts the parallel data to a number of digital serial channels. Depending on the link configuration (single-link or dual-link), the number of serial channels varies. The serial bit stream is then converted back to parallel on the monitor side, so you can argue that DVI actually increases the number of times the signal is processed.
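A rough Python sketch of that parallel-to-serial idea: a 24-bit pixel is split into three per-channel bytes, and each byte is clocked out one bit at a time. Real single-link DVI additionally encodes each 8-bit byte into a 10-bit TMDS symbol before serializing; that step is left out here for brevity, and the bit order shown is just one possible convention.

# Simplified sketch of DVI's parallel-to-serial conversion. The TMDS
# 8b/10b encoding used by real DVI links is omitted to keep this short.

def split_pixel(pixel: int) -> tuple[int, int, int]:
    """Split a 24-bit RGB pixel into its three per-channel bytes."""
    return (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF

def serialize(byte: int) -> list[int]:
    """Emit one channel's bits one at a time, as a serial link would."""
    return [(byte >> i) & 1 for i in range(8)]

pixel = 0x3A7FC2  # an arbitrary 24-bit color
for name, byte in zip(("red", "green", "blue"), split_pixel(pixel)):
    print(f"{name}: 0x{byte:02X} -> bits {serialize(byte)}")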
Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to analog in order to achieve the 16M colors. If an LCD were purely digital, each cell could only be fully on or fully off, and only two colors, black and white, would be achievable. To generate the 16M colors, each red, green, and blue cell must be capable of stepping through 256 shades, and that is an analog function. In fact, most LCDs maintain the video signal in analog form all the way through to the pixel drivers (NEC was the most notable producer of these).
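A quick sanity check on those numbers in Python (the 0-5 V column-driver swing below is a hypothetical value for illustration, not a spec):

# Back-of-the-envelope check of the 16M-color claim and of why the
# pixel drive is effectively analog.

shades_per_channel = 256                  # 8 bits per red/green/blue cell
total_colors = shades_per_channel ** 3
print(total_colors)                       # 16777216, the "16M" figure

# Each shade corresponds to a distinct drive level on the LCD cell,
# e.g. a hypothetical 0-5 V driver swing divided into 256 steps:
swing_volts = 5.0
step = swing_volts / (shades_per_channel - 1)
print(f"{step * 1000:.1f} mV per shade")  # roughly 19.6 mV between levels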
Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com
MonitorsDirect.com (http://www.monitorsdirect.com)