DVI-D vs DB-15 comparison

Vinny

Distinguished
Jul 3, 2004
402
0
18,780
Are there any articles or comparisons between these two connections?

My brother and I tested both of our monitors (my 215TW and his Viewsonic WX2235WM, both native at 1680x1050, both HDTV capable) side-by-side using both connections with various games, HD movies, and still images... everything looks much better with DVI.

But a forum poster over at another board says that the differences aren't noticeable. Yet my brother and I both strongly agreed that, no matter which of our monitors we used, the one on the VGA connection was noticeably inferior (washed-out colors, lots of sync problems, and not as sharp). He claims that both of our monitors are faulty...

I'd like to read a professional comparison of the two to make sure we're not going nuts.


EDIT for new info: Both the monitors my brother and I tested are LCDs. Changed title to fix n00b mistake.
 

lewbaseball07

Distinguished
May 25, 2006
783
0
18,990
DVI is better. It should be; it was made to be. Noob.

That's like saying my 6200 LE is the same as my 7900 GTX in Oblivion at 1920x1080... wtf??? Learn some more about computers.
 

SP73

Distinguished
Jan 21, 2007
50
0
18,630
It depends.

Some people report a very noticeable difference; I, for one, find DVI delivers a much nicer image.

But others don't notice anything. I'm not sure why: it could be their eyes, or it could be that they were getting a good VGA signal. There are any number of other factors.

On a technical level, though, DVI is better.
 

lewbaseball07

Distinguished
May 25, 2006
783
0
18,990
Oh sorry, I'm freaking 13 and I've already built like 5 computers...

Sorry for not knowing what I'm talking about... I hooked my monitor up with VGA, then with DVI, and DVI gives a much crisper picture and more vivid colors. It's just plain clearer.
 

lewbaseball07

Distinguished
May 25, 2006
783
0
18,990
DVI is a digital signal and VGA is analog... people who know stuff know that... digital is always better than analog... the monitor has to take the time to convert the analog signal to digital to display it on the screen.
 

bydesign

Distinguished
Nov 2, 2006
724
0
18,980
I've performed the test myself, I do quite a bit of video and graphics work, and I see no perceivable difference on the PC. My HDMI connection makes a difference on my HD TV. Your video card may make a difference, but with the 8800 I see none.

Unless you connect two calibrated displays side by side, there's no way you could perform the video test. With static images you could photograph the settings with a camera in manual mode. There is no way you could do an accurate comparison from memory.

If it makes you feel better, go for it.
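For anyone who wants to actually try the camera method rather than trust memory, here's a rough sketch (assuming Python with Pillow and NumPy installed) of how the two photos could be compared numerically instead of by eye. The file names and crop box are placeholders for whatever you shoot.

```python
# Rough sketch of an objective side-by-side check: photograph the same static
# test image over each connection with the camera locked in manual mode, then
# compare the photos numerically instead of from memory.
# File names and the crop box below are hypothetical placeholders.
import numpy as np
from PIL import Image

def load_gray(path, crop=None):
    """Load a photo, optionally crop to the screen area, return float grayscale."""
    img = Image.open(path)
    if crop:
        img = img.crop(crop)  # (left, top, right, bottom) of the screen within the photo
    return np.asarray(img.convert("L"), dtype=np.float64)

def mean_brightness(gray):
    """Average pixel level, a crude stand-in for 'washed out vs not'."""
    return gray.mean()

def edge_strength(gray):
    """Crude sharpness proxy: mean magnitude of horizontal/vertical pixel differences."""
    dx = np.abs(np.diff(gray, axis=1)).mean()
    dy = np.abs(np.diff(gray, axis=0)).mean()
    return (dx + dy) / 2.0

dvi = load_gray("photo_dvi.jpg", crop=(400, 300, 3600, 2300))  # hypothetical crop
vga = load_gray("photo_vga.jpg", crop=(400, 300, 3600, 2300))

print(f"mean brightness  DVI={mean_brightness(dvi):.1f}  VGA={mean_brightness(vga):.1f}")
print(f"edge strength    DVI={edge_strength(dvi):.2f}  VGA={edge_strength(vga):.2f}")
```

Keep the exposure, focus, and white balance identical between shots (that's the point of manual mode); otherwise the numbers tell you more about the camera than about the connection.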
 

SP73

Distinguished
Jan 21, 2007
50
0
18,630
The GPU is a definite factor, since the quality of an analog VGA signal can vary heavily, from terrible to perfectly fine.

There are many factors, but DVI is undeniably the safer bet, and many people find it looks nicer.
 

kamel5547

Distinguished
Jan 4, 2006
585
0
18,990
Oh sorry, I'm freaking 13 and I've already built like 5 computers...

Sorry for not knowing what I'm talking about... I hooked my monitor up with VGA, then with DVI, and DVI gives a much crisper picture and more vivid colors. It's just plain clearer.

... There is no visible difference. I have all sorts of different connections and different graphics cards on my network, and they all look the same. This is across roughly 100 LCDs...

Maybe for certain monitors it makes a difference; honestly, I think it's because you knew which one was hooked up, and in your mind you made the DVI look better...
 

lewbaseball07

Distinguished
May 25, 2006
783
0
18,990
Maybe better monitors have a better DVI signal than VGA... maybe crappier monitors have a better DVI signal than their VGA... I don't know.

It could also be a psychological thing.


Beats me. Psychological or not, it's better on mine.
 

kmjohnso

Distinguished
Mar 14, 2006
190
0
18,680
FYI, digital is not always better, but it's almost always cheaper. There shouldn't be an appreciable difference if everything is set up properly, but I think VGA is more prone to errors, image variation, etc.
 

jt001

Distinguished
Dec 31, 2006
449
0
18,780
DVI is better. It should be; it was made to be. Noob.

That's like saying my 6200 LE is the same as my 7900 GTX in Oblivion at 1920x1080... wtf??? Learn some more about computers.

Complete apples-to-oranges comparison there, not to mention completely out of proportion.


Oh sorry, I'm freaking 13...
That's obvious from your immaturity and your complete lack of any sort of grammatical skill. You jumped on the guy for asking a question and were proven wrong. Way to go.

Now back on topic.


DVI is always better than VGA; whether it's noticeable or not is another story. On my 17" LCD I can't say I see a noticeable difference. On my 20", using VGA, the color doesn't seem quite as good and the picture seems blurred, to the point where it hurts my eyes to look at; the DVI looks a million times better. It really depends on the quality of the DAC and ADC in use. In most cases it won't matter much.


But simply put: if you have the option, always go DVI; if not, go VGA. In most situations it's not that much different.
 

clarkboyo

Distinguished
Dec 24, 2006
40
0
18,530
Of course DVI is better, but the thing is, it costs more, and the difference isn't as substantial unless you go up in size (i.e. 22-inchers).
Does DVI on a 17" or 19" justify spending an extra 30-40 dollars? If you have the money, by all means go for it, but if you're on a budget and only aiming for a good 19", I don't think you need to go DVI.
 
DVI is a digital signal and VGA is analog... people who know stuff know that...

STFU n00b!

"People who know stuff know that," huh?
Yeah, maybe that's the problem: they don't know as much as they think they do.

Ask yourself what makes the DVI look better. Is it because DVI setups often autodetect the panel and automatically adjust the settings to a better default level, so the n00bs are better off not having to play with colour temperatures, etc.? Or is it because you're looking at a digital panel that poorly converts an analogue signal? Is it because the card itself sucks?

digital is always better than analog... the monitor has to take the time to convert the analog signal to digital to display it on the screen.

Not always, and that statement proves you don't know jack. Try running the cable over a greater distance and then tell me which is better. VGA can give you usable cable lengths much longer than standard DVI, so in that case the loss from cable length or even a bad TMDS on the DVI side may be greater than the loss from conversion errors or analog signal noise on the VGA side.

And it's not just 'analog vs digital': BNC-connected analogue can pump out far higher resolutions and colour depths than even dual-link TMDS DVI, and over really large distances. Of course, if you have quality repeaters you can extend DVI, but that goes for VGA too.
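To put rough numbers on the single- vs dual-link part of that, here's a quick back-of-the-envelope sketch in Python. Single-link DVI tops out at a 165 MHz pixel clock, while dual link doubles the TMDS data pairs; the ~12% blanking overhead used here is an assumption roughly in line with reduced-blanking timings, so treat the figures as ballpark only.

```python
# Back-of-the-envelope pixel-clock check against the single-link DVI limit.
# The blanking overhead is an assumed round number, not an exact CVT timing.
SINGLE_LINK_MHZ = 165.0  # single-link DVI TMDS pixel-clock ceiling

def pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=0.12):
    """Approximate pixel clock: active pixels plus an assumed blanking overhead."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for w, h in [(1280, 1024), (1680, 1050), (1920, 1200), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h)
    verdict = "single link is enough" if clk <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{w}x{h}@60 ~ {clk:6.1f} MHz -> {verdict}")
```

By this rough estimate the 1680x1050 panels in this thread sit comfortably inside single-link territory, and it's only around 2560x1600 that dual link becomes unavoidable.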

Seriously, both are good and bad depending on their use; your own personal experience (though vast, with that handful of computers :roll: ) is not indicative of much beyond that. And considering the crappy monitors most people compare with at home anyway, I wouldn't trust anyone else's perception of good/bad.

Betcha the VGA and DVI-A images on my properly calibrated P260 here at work look crisper and more true than any DVI-connected image you put on your cheap panel, but it won't be the DVI/VGA alone that has the largest impact.

The only thing you said that makes sense is that the OP shouldn't care what people think, but calling him a n00b when you're obviously a FAQin' dumbA$$ is laughable. At 13 you should consider SingTFU and listening more than posting.

Seriously, go play in traffic kid!
 

tool_462

Distinguished
Jun 19, 2006
3,020
2
20,780
I believe DVI to be superior. Why else would the industry move to a digital signal?

I suppose SATA II is superior to SATA I? :p

PCI-e 8x < PCI-e 16x?

Same situation: yes, the newer tech can be better, but it often goes unnoticed that there is no actual performance gain.


I don't see any difference with DVI on my monitor; I'm sure some people do and some don't. For reference, I have a Westinghouse 22" widescreen, 5 ms response time, 1680x1050 native.
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
Actually, many people who are very knowledgeable on the subject will argue there is no difference. Learn more about computers, noob.

You should check your own facts before calling someone else names when they're right.

There is a massive difference. DVI is digital. RGB (what he's calling VGA) is analog.

A picture from an analog source on an LCD panel has gone through two lossy conversions: D-to-A by the video card and A-to-D in the monitor. It has also been degraded (twice) by quantization, suffers signal loss in the cable, and its timing signals are analog, so the H and V sync is less accurate.

A digital image has been through no conversions and suffers no loss, as both it and its timing are discrete signals.
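(As a toy illustration of that round trip, here's a rough Python sketch: 8-bit values are "sent" digitally untouched, versus converted to an analog level, disturbed by noise, and re-digitized. The noise level is an arbitrary assumption, not a measurement of any real DAC or cable.)

```python
# Toy model of the D-to-A / A-to-D round trip on the VGA path versus the
# straight-through digital path. The noise sigma is an arbitrary assumption
# chosen only to make the effect visible.
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=100_000)        # original 8-bit pixel values

# Digital (DVI-D) path: values arrive exactly as sent.
digital_out = pixels.copy()

# Analog (VGA) path: DAC to a 0..0.7 V level, add cable noise, ADC back to 8 bits.
volts = pixels / 255.0 * 0.7                       # DAC output
volts += rng.normal(0.0, 0.004, size=volts.shape)  # assumed cable/termination noise, in volts
analog_out = np.clip(np.round(volts / 0.7 * 255.0), 0, 255).astype(int)

print("digital path: max error =", np.abs(digital_out - pixels).max())
print("analog path : mean |error| = %.2f, max error = %d" % (
    np.abs(analog_out - pixels).mean(), np.abs(analog_out - pixels).max()))
```

The digital path reports zero error by construction; how large the analog error gets in practice depends entirely on the quality of the DAC, cable, and ADC, which is what the rest of this thread is arguing about.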

It's exactly like comparing the sound quality of a DVD to a cassette tape: one is digital, one is analog.

Anyone who says RGB looks as good as digital is probably using a CRT monitor (which by its nature is an analog device) or maybe a resolution that isn't the LCD panel's native resolution. In either case, they aren't getting the best because of basic mistakes, so they are technically clueless and most definitely wrong.
 

Valerarren

Distinguished
May 20, 2006
37
0
18,530
DVI will usually look better out of the box, but if you tweak the settings on the analog connection, you can typically make it appear to have the same quality.

If you are complaining that things look fuzzy on the analog connection, play around with the monitor's settings; the auto-optimize for analog on some LCDs is better than on others.
 
I believe DVI to be superior. Why else would the industry move to a digital signal?

I suppose SATA II is superior to SATA I? :p

OOOOooooohhh! Watch out, don't call it SATA II or the SATA-IO guys will getcha;
http://www.theinquirer.net/default.aspx?article=37254

Can't remember the thread, but there was one of those nutters here a few weeks back.

Of course they forget that it can still be a SATA II drive in the sense that it's a SATA II device-spec drive. :roll:

BTW, the iRAM drive from Gigabyte can actually saturate SATA 1/1.1 and might benefit from the SATA 2/II/3G... speed boost.
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
I've performed the test myself, I do quite a bit of video and graphics work, and I see no perceivable difference on the PC. My HDMI connection makes a difference on my HD TV. Your video card may make a difference, but with the 8800 I see none.

Unless you connect two calibrated displays side by side, there's no way you could perform the video test. With static images you could photograph the settings with a camera in manual mode. There is no way you could do an accurate comparison from memory.

If it makes you feel better, go for it.

Are you using an LCD panel or a CRT? I know lots of graphics pros still prefer CRTs because of the more natural colour gradients. If you're using a CRT then you'll hardly see any difference between RGB and DVI, as CRTs are analog by nature.

Also, if you're mostly looking at graphics and photos it's no wonder, as there are no sharp edges compared to, say, text.

If you're using an LCD: to see the problem everyone else sees, make sure you set the Windows screen resolution to your LCD's native resolution, turn off any crappy Microsoft font-blurring technology like ClearType, and boot up with only the DVI cable connected.

Now look at some small text; you should see a lovely sharp picture. Then reboot with just the RGB cable connected, and you will see lots of blurriness around the text by comparison.
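If you want a worst-case image for that test, a one-pixel grille at the panel's native resolution makes analog pixel-clock and phase errors obvious. Here's a quick sketch (assuming Python with Pillow; the 1680x1050 size is just the resolution mentioned in this thread, so change it to your own native resolution):

```python
# Generate a worst-case test pattern for the sharpness test described above:
# alternating 1-pixel black/white vertical lines at native resolution.
# On a clean digital link this shows crisp columns; over VGA, pixel-clock or
# phase errors show up as shimmer or blur.
from PIL import Image

def make_grille(width=1680, height=1050, path="grille.png"):
    """Save alternating 1-pixel black/white vertical lines as a lossless PNG."""
    img = Image.new("L", (width, height))
    img.putdata([255 if x % 2 else 0 for y in range(height) for x in range(width)])
    img.save(path)

make_grille()
# Display grille.png full-screen at native resolution with no scaling;
# keep it as PNG, since JPEG compression would soften the edges.
```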
 

niz

Distinguished
Feb 5, 2003
903
0
18,980
Betcha the VGA and DVI-A images on my properly calibrated P260 here at work look crisper and more true than any DVI-connected image you put on your cheap panel, but it won't be the DVI/VGA alone that has the largest impact.

More of grapeape's usual standard of crap.

The P260 is a CRT with 19.1 inches viewable and a current value of $39.95 on eBay, so it's nothing to boast about.
http://popular.ebay.com/ns/Computers-Networking/Ibm+P260.html

Of course you won't see a difference between DVI and analog on a CRT.

And if you think any CRT could ever be as sharp as even a cheap digital panel then you really are clueless.
 
VGA is the wrong term to be using, guys.


EVGA, SVGA, WSVGA: these are all standards. RGB input is what the OP is talking about.

Also, the 3-color, 5-cable BNC analog connection (with separate H/V sync inputs) looks far better to a graphics artist on a good analog monitor than a digital monitor that overhypes its colors and contrast.

It all depends on the application of the digital or analog signal. Both technologies have positives and negatives.


As for the 13-year-old snot:
What about an analog camera? Are you saying a digital camera can take better pictures than a regular camera? I don't think so. Sure, the brain inside a digital camera helps people take better pictures by setting the f-stop and shutter speed for them, but if used properly, a film SLR will look much better than a digital SLR in most cases.


Edit: though at smaller image sizes, digital has almost caught up, if it isn't nearly equal. However, performance is still poor.
 
You should check your own facts before calling someone else names when they're right.

Right according to what? And his name-calling was in response to the previous poster.

There is a massive difference. DVI is digital. RGB (what he's calling VGA) is analog.

Massive difference, eh?! What, like 2.7 x 10^9 or less?
Being digital or analogue in itself isn't the issue; it depends on the quality of the components and how they are being used. What's better, digital or analogue audio? (It depends on the quality and the rate, doesn't it?) The same goes for video. You can have digital artifacts just like analogue artifacts. And like I said, both have their weak links.

A picture from an analog source on an LCD panel has gone through two lossy conversions: D-to-A by the video card and A-to-D in the monitor. It has also been degraded (twice) by quantization, suffers signal loss in the cable, and its timing signals are analog, so the H and V sync is less accurate.

And yet you can still get a better image from a quality VGA cable and well-filtered output than from a poor TMDS on crap cables outside their range. The potential is better for DVI within a given range, but it doesn't always result in a better picture, which is what the discussion has been so far. So if someone came in saying their VGA was giving them a better picture than their DVI, your argument and LB7's would be that it's them, because DVI is better than VGA. Which isn't always the case, as many of us with experience of both know.

It's exactly like comparing the sound quality of a DVD to a cassette tape: one is digital, one is analog.

Obviously you've never really looked at all the options when authoring a DVD. Make the bit rate low enough and your DVD will look worse than a VHS tape (let alone Super Beta); it isn't just digital vs analogue alone.

Anyone who says RGB looks as good as digital is probably using a CRT monitor (which by its nature is an analog device) or maybe a resolution that isn't the LCD panel's native resolution. In either case, they aren't getting the best because of basic mistakes, so they are technically clueless and most definitely wrong.

Niz, you prove over and over that you just don't know what you're talking about. It's not because of 'mistakes' unless they are using the wrong hardware, and that can happen with both DVI and VGA. Seriously, if you knew what you were talking about, you'd take a more centrist position and recognize that the output image depends on a lot of factors, many of which are heavily influenced by the user (be it requirements or limitations).

More of grapeape's usual standard of crap.

Aww, poor baby. Just because I keep schooling you, niz, there's no need to assume everyone posts crap like you do.

The P260 is a CRT with 19.1 inches viewable and a current value of $39.95 on eBay, so it's nothing to boast about.
http://popular.ebay.com/ns/Computers-Networking/Ibm+P260.html

eBay?! C'mon, you're gonna argue eBay prices? We're talking quality, not prices, dumbA$$, or did you think this was an economics discussion I was schooling you in yet again? The P260 is among the best Trinitrons made, and no consumer-level LCD can compete with its quality. A good (and expensive) LED-backlit LCD panel can compete, as can the extremely expensive HDR panels, but then again, that's the panel, not the connection, like I said. Your inability to distinguish between the two shows your ignorance.

Of course you won't see a difference between DVI and analog on a CRT.

That's not the issue now, is it? Since there is no digital input on a CRT, DVI-A is still analogue, or did you not know that? What I was saying is that it's not always the interface that matters, so a DVI-I-connected LCD from 4-5 years ago isn't guaranteed a better picture; there's more involved.

And if you think any CRT could ever be as sharp as even a cheap digital panel then you really are clueless.

Well, based on your previous posts and the fact that I work with both, as well as with higher-end panels, I KNOW you're clueless; just look at all your suppositions and how many times you got them wrong.

You THINK I'm wrong; everyone else, including me, can clearly see you're wrong! Maybe it's the quality of the connection between your brain and your eyes/keyboard. :tongue:
 
VGA is the wrong term to be using, guys.


EVGA, SVGA, WSVGA: these are all standards. RGB input is what the OP is talking about.

Yeah, it would help to say DB-15, but few people know the term; VGA has become synonymous.

Also, the 3-color, 5-cable BNC analog connection (with separate H/V sync inputs) looks far better to a graphics artist on a good analog monitor than a digital monitor that overhypes its colors and contrast.

It all depends on the application of the digital or analog signal. Both technologies have positives and negatives.

EXACTLY!

Glad someone else understands the concept.
In the end, go with what works for you and your application.