Why so many monitors without DVI

nikolajhendel

Distinguished
Nov 8, 2001
It seems that most low-budget TFT monitors come with analog input only. Is it really so expensive to include a DVI interface? Dual input could be a selling point - it's something that ranks high on my list.
GoSharks - you seem to know a thing or two about this ;) - so how much more expensive would your Cornerstone f1200 be if it included a DVI interface?

Nick
 

flamethrower205

Illustrious
Jun 26, 2001
That depends on what you're looking at - in the 15" world, the Compaq TFT5030 and Eizo L365 are low-budget LCDs (and also the best), and both include DVI and analog interfaces. In actuality, if a monitor were DVI only, it would be much cheaper, but the reason there are only hybrids or analog-only models is that many people do not have a DVI interface on their video card. Naturally, it costs more to have both DVI and analog.

NOS and a Ferrari can be fun! :cool:
 

GoSharks

Distinguished
Feb 9, 2001
Nick

The average home user / gamer always wants the latest and greatest. In this case DVI is the new technology, so everyone assumes it must be better. However, the industry looks beyond a single market when it designs products. I'll use my new f1200 product as an example. I specifically did not include DVI for a number of reasons.

Technical – DVI chips these days are slow. They are fine for LCDs running up to 1280 x 1024. Soon the 1600 x 1200 displays will flood the market, and the current generation of DVI chips will not be capable of the speeds required. Thus you would be required to upgrade both your video card and your monitor.
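
To put rough numbers on that, here is a quick Python sketch of the pixel clock each mode needs. The blanking totals are the standard VESA timings of the era; the 112 MHz transmitter rating mentioned in the comments is an assumed figure for an early DVI chip, not a spec for any particular part.

    # Pixel clock = total pixels per frame (incl. blanking) x refresh rate.
    modes = {
        "1280x1024@60": (1688, 1066, 60),  # htotal, vtotal, refresh (Hz)
        "1600x1200@60": (2160, 1250, 60),
    }
    for name, (htotal, vtotal, hz) in modes.items():
        clock_mhz = htotal * vtotal * hz / 1e6
        print(f"{name}: {clock_mhz:.0f} MHz pixel clock required")
    # ~108 MHz vs ~162 MHz. A first-generation TMDS transmitter rated
    # around 112 MHz (assumed figure) handles the first mode but not the
    # second; even the DVI single-link ceiling of 165 MHz leaves little
    # headroom at 1600 x 1200.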

Market – This depends on the company's focus. Cornerstone sells primarily to Fortune 500 corporations who purchase 20 – 500 units at a time. The IT managers at these companies have strict budgets, and they do not upgrade as often as a home user. They must amortize the cost over 3-5 years, depending on how their company does its accounting. For example, some of my customers are still using OS/2, Windows 3.1, or Windows 95. These companies require backward compatibility, thus they only require an analog interface. If I added DVI, the majority of my customers would not use it, and it would simply add cost to the product.

Performance – The studies I have done show very little improvement in video quality between the DVI and analog interfaces. Magazine reviewers like CNET and Myelabs tend to agree.
The primary reason is that the panel itself is an analog device. At the driver level, the control of each cell must step through at least 256 levels, and this cannot be done in a purely digital fashion. Even the DVI interface converts the signal from digital parallel to a serial bit stream across the video cable.
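
A minimal Python sketch of what that last stage looks like: an 8-bit gray code mapped onto an analog drive voltage. The 0-5 V swing and the linear mapping are assumptions for illustration; real column drivers apply a panel-specific gamma curve.

    V_MIN, V_MAX = 0.0, 5.0   # assumed drive voltage swing
    LEVELS = 256              # 8 bits per color channel

    def drive_voltage(gray: int) -> float:
        """Map an 8-bit gray code (0-255) to a cell drive voltage."""
        if not 0 <= gray < LEVELS:
            raise ValueError("gray code must be 0-255")
        return V_MIN + (V_MAX - V_MIN) * gray / (LEVELS - 1)

    # Whether the 8-bit code arrives over DVI or from the monitor's own
    # ADC sampling the VGA signal, this final step is analog either way.
    print(drive_voltage(128))  # -> ~2.51 V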

Also, until very recently the DVI standard was in a state of flux: multiple companies, multiple revisions and versions, some compatible, some not. I tend to be conservative and like to let a technology stabilize before incorporating it. Think of all those people who purchased Betamax video recorders. Arguably Betamax was the better technology, however VHS became the standard. Pick the wrong one and we go out of business. The analog interface is safe, proven technology and is supported by virtually every video card in the world today.

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com


Jim at http://www.monitorsdirect.com
 

lykele

Distinguished
Aug 26, 2001
GoSharks, I'm glad this question was asked and glad you responded since I was thinking of asking the same thing.

If I understand you correctly, what you are saying is: if I had a monitor capable of analog and DVI ... and ... if I have a video card capable of analog and DVI, I probably would not notice any appreciable difference regardless of the hookup? Since I am well beyond the age of wanting to look "cool" and I'm not a big "gamer", I might as well stick with the old 17-inch CRT I've had for 2 years?

I think my confusion comes from seeing the flat panel as a digital piece of hardware which God put on this earth to run via a digital interface. I'm probably getting it confused with digital video cameras and HDTV.
 

GoSharks

Distinguished
Feb 9, 2001
"if I have a video card capable of analog and DVI, I probably would not notice any appreciable difference regardless of the hookup?"

Whether you will see a difference depends on other factors as well. For example, some Nvidia-based video cards have filter circuits on the analog video output, which will degrade the image to some extent. All else being equal, it is similar to the Coke / Pepsi challenge: some people will say the image looks better on one than the other, however I attribute some of this to the placebo effect. Many find it difficult to tell in a blind test, where they do not know which interface is being used. The problem is that what may be an appreciable difference to me may not be to you.
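
For what it's worth, here is a minimal Python sketch of the kind of blind comparison I mean - the viewer scores images without knowing which interface produced them. The trial count and the scoring model are made up purely for illustration.

    import random

    def blind_trial(viewer_score, trials=20):
        """Randomize the interface each round; return mean score per interface."""
        scores = {"analog": [], "dvi": []}
        for _ in range(trials):
            interface = random.choice(["analog", "dvi"])
            scores[interface].append(viewer_score(interface))
        return {k: sum(v) / len(v) for k, v in scores.items() if v}

    # A viewer who cannot genuinely see a difference ends up scoring
    # both interfaces about the same once the label is hidden.
    print(blind_trial(lambda interface: random.gauss(7.0, 1.0)))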


Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com


Jim at http://www.monitorsdirect.com
 

flamethrower205

Illustrious
Jun 26, 2001
Thing is, the difference in quality between analog and digital depends on the LCD. For example, the Eizo L371 was the best a few months ago in digital mode, but certainly not in analog. It depends. Also, GoSharks, you said your company fears that DVI may not become the standard. Putting all other things aside (that a dual digital and analog interface costs more, etc.), why don't you consider it the standard? I see that a lot of companies use it. Also, what is the difference between the L375 and L365, or am I not looking hard enough?

NOS and a Ferrari can be fun! :cool:
 

GoSharks

Distinguished
Feb 9, 2001
Flame

I did not say DVI is not a standard. I said that it has only recently become stable enough for my company to take the risk of adding it. Maybe other companies feel the same, and that is why the original poster has noticed that there are many products without DVI. For example: there are multiple versions of DVI - DVI-A, DVI-I and DVI-D - with multiple revisions, some compatible, some not. And yes, I'm developing products with DVI as we speak.
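
To make the variants concrete, here is a small Python sketch of what each connector family carries and why mixing them can fail. The signal assignments follow the DVI spec; the helper function is just for illustration.

    DVI_SIGNALS = {
        "DVI-A": {"analog"},             # analog only
        "DVI-D": {"digital"},            # digital only
        "DVI-I": {"analog", "digital"},  # integrated: carries both
    }

    def common_signals(src: str, dst: str) -> set:
        """Signal types a source/monitor pair can actually carry."""
        return DVI_SIGNALS[src] & DVI_SIGNALS[dst]

    print(common_signals("DVI-I", "DVI-A"))  # {'analog'} - passive adapter works
    print(common_signals("DVI-D", "DVI-A"))  # set() - no common signal, no picture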

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com

Jim at http://www.monitorsdirect.com
 

nikolajhendel

Distinguished
Nov 8, 2001
GoSharks
Thanks for the reply - but I'm still wondering a bit. Part of my question was price related. Your point about keeping the analog input I can easily understand - but why not just add the DVI interface and have a monitor that accepts 2 inputs? I assume this is price related, so I would like an example of what your f1200 monitor would have cost if you had included a DVI interface.
The dual input is a big plus for me (but of course I'm not in the Fortune 500 ;) ) - but the retail companies seldom make these dual-input monitors.

Nick
 

GoSharks

Distinguished
Feb 9, 2001
Nick

The parts cost is low - less than $30, including the second video cable and a DVI-I to analog adapter - however, there are hidden costs, such as on the assembly line. You increase testing time by 100% because now you must test two inputs instead of one. Updating documentation and user guides, etc. Also, the marketing guys will always come and say “new feature, we can charge more”.

Also, it’s not always about money. If adding the DVI interface would have delayed the project two months, that is a huge amount of revenue that we miss - again, for a feature we feel most of our customers will not use. Time to market is sometimes much more important than features; there is always a trade-off no matter what you decide.
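
As a rough Python sketch of that trade-off - the $30 parts figure is from above, but the test cost, volume, margin, and delay numbers are made-up placeholders, just to show the shape of the calculation:

    PARTS_COST = 30.00          # extra parts per unit (figure from above)
    TEST_COST_PER_UNIT = 5.00   # assumed cost of testing one input
    UNITS_PER_MONTH = 2_000     # assumed production volume
    MARGIN_PER_UNIT = 100.00    # assumed gross margin per monitor
    DELAY_MONTHS = 2            # the two-month slip scenario

    # A second input adds one more input test per unit (testing time doubles).
    added_cost_per_unit = PARTS_COST + TEST_COST_PER_UNIT
    missed_margin = DELAY_MONTHS * UNITS_PER_MONTH * MARGIN_PER_UNIT

    print(f"Added cost per unit: ${added_cost_per_unit:.2f}")
    print(f"Margin missed by a {DELAY_MONTHS}-month slip: ${missed_margin:,.0f}")

Even with modest placeholder numbers, the missed margin from a schedule slip dwarfs the per-unit parts cost, which is the point.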

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com


Jim at http://www.monitorsdirect.com