DVI and ANALOG? REALLY ANY VISIBLE DIFFERENCE?

nanyangview

Distinguished
Apr 17, 2002
38
0
18,530
Ok, this question is about DVI and analog 17-inch LCD monitors! Is there really any HIGHLY noticeable image quality difference between a DVI and an analog monitor?
Has anyone done a test before and noticed anything, like a comparison between an analog-only LCD and a DVI & analog hybrid? Thanks!
 

GoSharks

Distinguished
Feb 9, 2001
488
0
18,780
Guess we need to define “HIGHLY”. What may be a big difference to me may not be big to you. Will you see a difference? Yes. Will it be highly different? Maybe not. There are many things that come into play, such as the video quality of the video card sending the analog signal; some are better than others. Personally I see what I would describe as a small difference, and I have set up many of these experiments using a focus group of over twenty people. I did a blind test using a dual-head Matrox video card with the analog and DVI-I outputs connected to the same monitor (it has both analog and DVI inputs). 18 out of 20 could not perceive a difference. The two that saw a difference noted it as small.

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com

<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 

imo

Distinguished
Feb 2, 2002
109
0
18,680
Have you tried using an Nvidia-chipset graphics card in your blind test? Quite a few people think Nvidia's analog output is nowhere near the quality of Matrox's, and Nvidia is by far more popular than Matrox! Maybe that would be more telling?
 

GoSharks

Distinguished
Feb 9, 2001
488
0
18,780
I’m well aware of the issues with video cards that use Nvidia chipsets. That is specifically why I used a Matrox card in my testing. Most people would not understand that the video signal could be the problem and would blame the poor video quality on the monitor when in fact the video card is outputting a poor signal. If you are using a card with an Nvidia chipset, odds are you like to play games. In that case I would recommend a CRT monitor, not an LCD.

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com

<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 

AMD_Man

Splendid
Jul 3, 2001
7,376
2
25,780
Agreed. If you have a good RAMDAC and filters on your graphics card, you won't notice the difference between analog and digital output.

:wink: "A penny saved is a penny earned!" :wink:
 

imo

Distinguished
Feb 2, 2002
109
0
18,680
Agreed,
but then that rules out just about every pre-2002 Nvidia-based graphics card!

Matrox might have a great RAMDAC and filters, but its 3D performance is very much subpar compared to even last year's Nvidia/ATI chipsets!

Why have an extra layer of conversion if you can just go straight digital?
 

GoSharks

Distinguished
Feb 9, 2001
488
0
18,780
IMO

The myth that DVI (digital) does not need to be converted to analog is just that, a myth. On an analog system, the RAMDAC (the chip that generates the video signal on the video card) has been integrated into the graphics controller chip for years now. Adding DVI means adding a DVI transmitter chip to the video card and a DVI receiver chip in the monitor.

In order to transmit the digital data from the video card in true parallel digital form, the graphics chip would need parallel digital outputs and the video cable would need a separate wire for each bit. The cable would have to contain more than 27 wires, and you can imagine how thick it would be. Instead, DVI converts the parallel data into a number of digital serial channels; depending on whether the link is single or dual, the number of serial channels varies. The serial bit stream is then converted back on the monitor side, so you could argue that DVI actually increases the number of times the signal is processed.
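To put rough numbers on the parallel-versus-serial point above, here is a back-of-the-envelope sketch in Python. The timing totals are assumed (roughly VESA-style figures for 1280x1024 at 60 Hz), so treat the exact values as illustrative rather than a spec quote.

# Back-of-the-envelope numbers for the parallel-vs-serial argument above.
# Timing totals are assumed (roughly VESA-style 1280x1024@60); illustrative only.

BITS_PER_PIXEL = 24             # 8 bits each for R, G, B
H_TOTAL, V_TOTAL = 1688, 1066   # active + blanking totals (assumed)
REFRESH_HZ = 60

pixel_clock = H_TOTAL * V_TOTAL * REFRESH_HZ   # ~108 MHz
parallel_wires = BITS_PER_PIXEL + 3            # 24 data lines plus clock/sync lines (rough count)

# Single-link DVI serializes this over 3 TMDS data pairs plus 1 clock pair.
# TMDS encodes each 8-bit color value into a 10-bit symbol per pixel clock.
tmds_data_pairs = 3
bit_rate_per_pair = pixel_clock * 10           # ~1.08 Gbit/s on each pair

print(f"pixel clock       : {pixel_clock / 1e6:.1f} MHz")
print(f"parallel approach : more than {parallel_wires} signal wires in the cable")
print(f"DVI single link   : {tmds_data_pairs} data pairs + 1 clock pair")
print(f"per-pair bit rate : {bit_rate_per_pair / 1e9:.2f} Gbit/s")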

Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to analog in order to achieve the 16M colors. If the LCD were purely digital, only two colors, black and white, would be achievable. To generate the 16M colors, each red, green and blue cell must be capable of stepping through 256 shades, and that is an analog function. In fact, most LCDs keep the video signal in analog form through to the pixel drivers (NEC was the most notable producer of these).

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com

<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 

GoSharks

Distinguished
Feb 9, 2001
488
0
18,780
Flame

Define "better"! What looks better to you may not look better to me; everyone's eyes are different. It is like art: there is no right or wrong, good or bad. It's just your opinion, and everyone has one, and we have heard yours on many occasions.

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com



<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 

AMD_Man

Splendid
Jul 3, 2001
7,376
2
25,780
"Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to analog in order to achieve the 16M colors. If the LCD were purely digital, only two colors, black and white, would be achievable."

You don't have to store each pixel in just one bit. 2^24 = 16.7M, so you could use 24 bits to represent each pixel.

:wink: "A penny saved is a penny earned!" :wink:
 

GoSharks

Distinguished
Feb 9, 2001
488
0
18,780
Are you talking about the video card memory? I'm talking about how the LCD drivers must twist the crystals in 256 distinct steps; this has nothing to do with how many bits represent a pixel. This is a physical twisting of the LCD molecules. Think of it like a camera shutter: if it is closed it does not let light in, and fully open it lets in the maximum amount of light. A shutter half open lets in half the maximum amount of light. In an LCD there are 256 shutter openings for each sub-pixel. Three sub-pixels = one pixel, and 256 x 256 x 256 = 16.7 million colors.
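To make the shutter analogy concrete, here is a tiny idealized sketch in Python. The linear mapping is an assumption for illustration; real panels apply gamma and other response corrections.

# Idealized shutter analogy: each 8-bit sub-pixel code selects one of 256
# "shutter openings" (drive levels). Real panels are non-linear, so this is
# only an illustration of the counting argument, not a driver model.

LEVELS = 256  # distinct drive steps per sub-pixel

def opening_fraction(code):
    """Fraction of maximum light let through for an 8-bit sub-pixel code."""
    if not 0 <= code < LEVELS:
        raise ValueError("sub-pixel code must be 0..255")
    return code / (LEVELS - 1)

print(opening_fraction(128))   # roughly 0.5: the half-open shutter in the analogy
print(LEVELS ** 3)             # 16,777,216: three sub-pixels, 256 steps each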

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com



<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
<P ID="edit"><FONT SIZE=-1><EM>Edited by GoSharks on 04/29/02 02:51 PM.</EM></FONT></P>
 

GoSharks

Distinguished
Feb 9, 2001
488
0
18,780
The type of interface used on a monitor has nothing to do with how bright or vibrant the colors are, or with the contrast of the image shown on the screen.

What is vibrant to you may not be vibrant to me.

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com


<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 

imo

Distinguished
Feb 2, 2002
109
0
18,680
What is vibrant to you may not be vibrant to me???

That might be true in exceptional cases! But my guess is that if you did a blind test of 1,000 people between two different monitors with supposedly the same color temperature, you would get an easy-to-define group preference as to which monitor has the more vibrant color.

GoSharks, are you trying to tell us that, given a choice of DVI and VGA interfaces, there is absolutely no difference in the purity of the video signal between the two, no matter which graphics card you use, and that there are in fact cases where the VGA output is even more "pure", as in error-free, than the DVI output of the same card?

Are you in fact telling us that the DVI or digital interface is just pure voodoo "marketing" and has absolutely no positive aspect compared to our good old industry-standard VGA analog?
 

hammerhead

Distinguished
Mar 5, 2001
531
0
18,980
Obviously I can't speak for GoSharks, but it seems to me that he is merely debunking some of the myths regarding DVI's supposed inherent superiority.

Most panels I have tested looked just as good in analog mode as in digital. Some were superior over DVI but, as GoSharks explained, it is often down to the video card.

I think this goes some way to explaining why many high-end panels still don't bother with DVI.

Yes, some panels are considerably worse with an analog interface (Viewsonic, I'm looking at you) but that's down to the panel having a crappy analog interface.
 

Oni

Distinguished
Dec 31, 2007
880
0
18,980
On my TFT7020 with a GeForce 4 Ti4600 I notice a huge difference between the analog and digital connections. DVI has more vivid colors. I can easily tell the difference just by hooking up both the digital and analog cables and switching the connectors. Also, having the LCD next to a CRT, I can clearly see the color differences: my CRT seems washed out color-wise while the LCD has excellent contrast and vividity (is that a word?).
That could be because my CRT is old (Hitachi SuperScan Elite 751).

Gosh, I'm such a nerd sometimes, but then again aren't we all. :smile:
 

GoSharks

Distinguished
Feb 9, 2001
488
0
18,780
Hammerhead

Thanks, you are correct. There is so much marketing hype that people take as gospel truth. Personally I think it's funny how the marketing guys pull the wool over so many people's eyes.

Does DVI provide a better signal to the monitor? Absolutely, yes. Can many people tell the difference? From my focus group experiments, the answer is no. Personally I can see a difference (remember, it is my job to know what to look for), and in some cases I agree that the units with DVI have a clearer image. However, I have seen units with an analog interface that look better than units with a DVI interface.

Do people see colors differently? Yes! "Better" and "vibrant" are subjective terms that cannot be quantified.

Are LCD monitors with a DVI interface analog or digital? Yes and yes: digital up to the LCD drivers, analog from there on.

Does DVI have pitfalls, or is it the perfect interface? Yes and no. DVI is rather limited in its ability to support higher resolutions and refresh rates, and if you want to upgrade your monitor or video card at a later date, you may, unlike with units that have an analog interface, need to upgrade both at the same time. Nothing is perfect.
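For a concrete sense of that resolution/refresh ceiling: single-link DVI tops out at a 165 MHz TMDS pixel clock. A quick Python check of a few modes, using assumed (roughly VESA-style) blanking totals, shows where the limit bites.

# Rough check of video modes against the single-link DVI pixel-clock ceiling.
# Blanking totals are assumed (roughly VESA-style); real timings vary.

SINGLE_LINK_MAX_HZ = 165e6   # single-link DVI TMDS pixel-clock limit

def pixel_clock(h_total, v_total, refresh_hz):
    """Pixel clock in Hz for a mode, given total (active + blanking) timings."""
    return h_total * v_total * refresh_hz

modes = {
    "1280x1024@60 (assumed 1688x1066 total)": pixel_clock(1688, 1066, 60),
    "1600x1200@60 (assumed 2160x1250 total)": pixel_clock(2160, 1250, 60),
    "1600x1200@75 (assumed 2160x1250 total)": pixel_clock(2160, 1250, 75),
}

for name, clk in modes.items():
    verdict = "fits" if clk <= SINGLE_LINK_MAX_HZ else "exceeds a single link"
    print(f"{name}: {clk / 1e6:.0f} MHz -> {verdict}")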

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com

<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
Early LCD panels had what I think is called "pixel jitter" when the VGA interface is used. Has this been eliminated or at least successfully controlled in current panels?

TIA

I have so many cookies I now have a FAT problem!
 

GoSharks

Distinguished
Feb 9, 2001
488
0
18,780
The DVI interface is essentially a multi-channel serial bit stream, similar to the analog (single-channel) bit stream. Both need to be sampled at the right rate to sync up with the corresponding row and column drivers, and this sampling is a common cause of pixel jitter. Modern monitors are far less susceptible to pixel jitter than those made even a year ago.
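As a toy illustration of the sampling issue Jim describes, modeled on the analog case with purely synthetic numbers, here is a short Python sketch of how a phase error in the sampling clock smears values across neighboring pixels, which is what shows up as pixel jitter.

# Toy model: the monitor must sample the incoming scan line once per pixel.
# If the sampling clock/phase is slightly off, samples land between pixels,
# so recovered values drift toward their neighbors. Numbers are synthetic.

line = [0, 255, 0, 255, 0, 255, 0, 255]   # ideal alternating pixel values

def sample(line, phase_error):
    """Sample once per pixel with a constant phase error (in pixel widths)."""
    out = []
    for i in range(len(line)):
        t = i + phase_error                       # where the sample actually lands
        left = line[int(t) % len(line)]
        right = line[(int(t) + 1) % len(line)]
        frac = t - int(t)
        out.append(round(left * (1 - frac) + right * frac))  # crude interpolation
    return out

print(sample(line, 0.0))   # perfect phase: the alternating pattern is recovered exactly
print(sample(line, 0.4))   # 40% phase error: values smear toward their neighbors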

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com

<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I thought the analog interface was more susceptible to pixel jitter. My mistake.

I'm seeing many "bargain" panels, and I'm assuming many of them are older models. Would there be any reason to choose one with a DVI interface over one with just an analog interface? Do your panel testing results apply to older models?

Thanks again, Jim.

I have so many cookies I now have a FAT problem!