DVI (digital) on a CRT simply does not make sense.
Almost all VGA video cables these days are shielded, so outside noise is negligible and for the most part no longer an issue.
The limit per the DVI standard is a 165 MHz pixel clock on a single link. That is only high enough to support 1600 x 1200 at 60 Hz or 1280 x 1024 at 85 Hz (162 MHz happens to be the VESA-standard clock for 1600 x 1200 at 60 Hz on CRTs).
However, not all DVI implementations can support even the full 165 MHz clock.
The pixel clock limits the maximum resolution and refresh rate of the monitor. DVI would work fine for the lower-resolution 17- and 19-inch CRT monitors that top out at 1280 x 1024 at 85 Hz. Current DVI implementations cannot support the larger 21-inch or higher-end 19-inch monitors that run 1280 x 1024 at 100 Hz or 1600 x 1200 at 85 Hz.
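As a rough sanity check, the pixel clock a mode needs is simply total pixels per scan line x total scan lines x refresh rate, with blanking included. The short Python sketch below is mine, not part of any spec; the timing totals are taken from the VESA DMT tables and the 165 MHz ceiling is the DVI 1.0 single-link limit.

# Approximate pixel clock: h_total * v_total * refresh (blanking included).
# Timing totals from the VESA DMT tables; 165 MHz is the DVI single-link cap.
SINGLE_LINK_LIMIT_HZ = 165_000_000

# mode name -> (total pixels per line, total lines, refresh rate in Hz)
modes = {
    "1280 x 1024 @ 85 Hz": (1728, 1072, 85),
    "1600 x 1200 @ 60 Hz": (2160, 1250, 60),
    "1600 x 1200 @ 85 Hz": (2160, 1250, 85),
}

for name, (h_total, v_total, refresh) in modes.items():
    clock_hz = h_total * v_total * refresh
    verdict = "fits" if clock_hz <= SINGLE_LINK_LIMIT_HZ else "exceeds"
    print(f"{name}: {clock_hz / 1e6:.1f} MHz -> {verdict} single-link DVI")

The first two modes come in at roughly 157.5 MHz and 162 MHz, just under the limit; 1600 x 1200 at 85 Hz needs about 229.5 MHz, far beyond what a single DVI link can carry, which is exactly why the big tubes are out of reach.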
Higher-speed DVI chips are in development; however, to take advantage of them you will need to replace both the monitor and the video card.
On an analog system, the RAMDAC (the chip that generates the video signal on the video card) has been integrated into the graphics controller chip for years now. Adding a RAMDAC to the monitor is added cost where the benefit is so low it simply makes no sense from an economic point of view: the gain in signal quality is so small that in practice you could not tell the difference.
NEC and ViewSonic attempted to put DVI on some models and quickly dropped the idea because of many of these issues.
Jim Witkowski
Chief Hardware Engineer
MonitorsDirect.com
http://www.monitorsdirect.com