question for jim the monitor guy

Tommunist

Distinguished
Jun 14, 2002
I'm pretty sure I have heard of "digital" monitors; i.e., the signal going to the monitor is digital and is converted to analog at the monitor, as opposed to an analog signal being sent to the monitor. Is there an advantage to having a "digital" monitor?

It's always darkest just before it goes pitch black.
 
Absolutely. Noise picked up in the cable and signal degradation between the PC and the monitor have much less influence, or none at all, if the signal is digital.
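A toy sketch of why a digital link shrugs off cable noise (the noise level here is an assumption for illustration, not from the post): the digital receiver only has to decide 0 vs. 1, so any noise smaller than the decision margin is thresholded away, while on an analog link the same noise lands directly in the displayed intensity.

```python
import random

random.seed(0)
NOISE = 0.2  # assumed peak noise amplitude, as a fraction of full scale

bits = [1, 0, 1, 1, 0, 0, 1, 0]           # digital signal: one symbol per bit
sent = [float(b) for b in bits]

# The same noise is injected on both link types.
noise = [random.uniform(-NOISE, NOISE) for _ in sent]
received = [s + n for s, n in zip(sent, noise)]

# Digital receiver: threshold at 0.5 -> bits recovered exactly,
# because the noise (max 0.2) never crosses the 0.5 margin.
recovered = [1 if r > 0.5 else 0 for r in received]
print(recovered == bits)   # True: the noise is rejected

# Analog receiver: whatever noise arrives is simply displayed.
analog_error = max(abs(r - s) for r, s in zip(received, sent))
print(analog_error > 0)    # True: the error shows up in the picture
```

The point is the decision margin: as long as the noise stays inside it, the digital picture is bit-for-bit perfect, whereas the analog picture degrades continuously with the noise.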


<i><b>"I don't understand what it is! Let me kill it!" -- Worf</b></i>
 
Is there a noticeable difference? I have an analog monitor and have never noticed anything "wrong" with the picture I'm looking at. I think I understand what you are saying; is the digital signal similar to a network, where you have packets with CRCs on them?

It's always darkest just before it goes pitch black.
 
Well, I'm not sure. But the advantage is that an important component for image quality sits in the monitor, and hence under the monitor vendor's control. With a digital monitor, even a cheap graphics card can display a crisp, clear image.
I don't know what signaling is used or what the physical layer looks like.

<i><b>"I don't understand what it is! Let me kill it!" -- Worf</b></i>
<A HREF="http://www.btvillarin.com/phpBB/viewtopic.php?p=6062#6062" target="_new">My System</A>
 
Yes. The information is transmitted digitally from the GFX card to the monitor.

<i><b>"I don't understand what it is! Let me kill it!" -- Worf</b></i>
<A HREF="http://www.btvillarin.com/phpBB/viewtopic.php?p=6062#6062" target="_new">My System</A>
 
DVI (digital) on a CRT simply does not make sense.

Almost all VGA video cables these days are shielded, so outside noise has negligible influence and is, for the most part, no longer much of an issue.

The limitation per the DVI standard is a 162 MHz pixel clock. That's only high enough to support 1600 x 1200 at 60 Hz or 1280 x 1024 at 85 Hz (162 MHz happens to be the VESA-standard clock for that timing on CRTs).

However, not all DVI implementations can support the full 162 MHz clock.

The pixel clock limits the maximum resolution and refresh rate of the monitor. DVI would work fine for the lower-resolution 17" and 19" CRT monitors, which are only capable of 1280 x 1024 at 85 Hz. The current implementation of DVI cannot support the larger 21" or higher-end 19" monitors that run 1280 x 1024 at 100 Hz or 1600 x 1200 at 85 Hz.
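You can sanity-check these limits yourself: a mode's pixel clock is (horizontal total) x (vertical total) x (refresh rate), where the totals include the blanking intervals around the visible picture. The blanking figures below are the VESA DMT values for these modes, filled in here as assumptions rather than taken from the post.

```python
# Back-of-the-envelope check of the pixel clocks discussed above.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz: total pixels per frame times frames per second."""
    return h_total * v_total * refresh_hz / 1e6

# 1600 x 1200 @ 60 Hz: VESA DMT totals (with blanking) are 2160 x 1250.
clk_1600 = pixel_clock_mhz(2160, 1250, 60)   # -> 162.0 MHz, right at the DVI limit

# 1280 x 1024 @ 85 Hz: VESA DMT totals are 1728 x 1072.
clk_1280 = pixel_clock_mhz(1728, 1072, 85)   # -> ~157.5 MHz, just under the limit

print(clk_1600, clk_1280)
```

Anything faster, such as 1600 x 1200 at 85 Hz, pushes the required clock well past 162 MHz, which is why single-link DVI can't drive the high-end CRT modes.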

Higher-speed DVI chips are in development; however, to upgrade you will need to replace both the monitor and the video card.

On an analog system, the RAMDAC (the chip that generates the video signal on the video card) has been integrated into the graphics controller chip for years now. Adding a RAMDAC to a monitor is added cost where the benefit is so low it simply makes no sense from an economic point of view. The gain in signal quality is so small that in practice you could not tell the difference.

NEC and ViewSonic attempted to put DVI on some models and quickly dropped the idea based on many of these issues.

Jim Witkowski
Chief Hardware Engineer
MonitorsDirect.com

<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>