HDMI vs. DVI


meodowla

Distinguished
Nov 23, 2008

Nowadays, monitors with built-in speakers are available. Many BenQ monitors come with speakers - 2 x 1W each.
The BenQ E2200HDA 21.5-inch Full HD LCD monitor is one of them.
 


Actually, it's specific cards.

Many cards come with a simple DVI-to-HDMI adapter for the video only; however, some video cards support audio over DVI to an HDMI adapter. All the HD 2xxx, HD 3xxx, and HD 4xxx cards support audio over DVI-to-HDMI via the internal protected path, but they require a specialized adapter. nVidia has some cards that offer an SPDIF input (of the three types I mentioned above) to blend the audio into the signal sent over HDMI: older models (the GF9 generation) require a specific adapter, while newer G2xx-based solutions do it differently, inserting the audio within the signal itself.
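To keep those cases straight, here's a quick summary of the routing described above as a small lookup table (just restating this post, not an exhaustive or authoritative compatibility list):

```python
# Summary sketch of the audio-over-DVI routing described above.
# Family labels and notes simply restate the post.
AUDIO_OVER_DVI = {
    "ATI HD 2xxx/3xxx/4xxx": "audio over DVI-to-HDMI via internal protected path; "
                             "needs the specialized adapter shipped with the card",
    "nVidia GF9 (older)":    "SPDIF input header on the card blends audio into HDMI; "
                             "requires a specific adapter",
    "nVidia G2xx (newer)":   "SPDIF input header; audio inserted within the signal itself",
}

for family, notes in AUDIO_OVER_DVI.items():
    print(f"{family}: {notes}")
```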

This is more for TVs and AV systems than for connecting to computer monitors, although, as mentioned above, there are now monitors (and more coming) that will support 2.1 audio through HDMI.

If you look at the GTX 285 picture here, you can clearly see the SPDIF header for the 2-pin/wire audio-card cable as an input right beside the power connectors:

http://images.bit-tech.net/content_images/2009/01/bfg-tech-geforce-gtx-285-ocx-1gb/3.jpg

Zotac and a few others offered early GF9xxx-series cards with a similar solution that used coax SPDIF and Toslink SPDIF inputs.
 


Understand, however, that what you are arguing drops the reason RC, SS (and I) mention quality: specifically, higher-stress situations like longer distances or increased frequency (1080p at 60 fps), where shielding, wire material, and diameter matter for carrying the signal cleanly over those distances. We're not talking about sub-15 ft cables carrying low-frequency, easily corrected signals.
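For a sense of the data rates at stake, here's a quick back-of-the-envelope calculation for 1080p60 over single-link TMDS, using the standard CEA-861 timing (a sketch of the arithmetic, not a cable spec):

```python
# TMDS bandwidth for 1080p60, using the CEA-861 timing:
# 2200 x 1125 total pixels per frame, including blanking.
h_total, v_total, refresh = 2200, 1125, 60

pixel_clock_hz = h_total * v_total * refresh      # 148.5 MHz
# TMDS encodes each 8-bit value as 10 bits, on 3 data channels (R, G, B).
bits_per_channel = pixel_clock_hz * 10            # 1.485 Gbit/s per channel
total_bits = bits_per_channel * 3                 # 4.455 Gbit/s aggregate

print(f"Pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")
print(f"Per TMDS channel: {bits_per_channel / 1e9:.3f} Gbit/s")
print(f"Aggregate: {total_bits / 1e9:.3f} Gbit/s")
```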

Like both SS and RC say, in general the medium-quality stuff will be more than fine, but don't assume the el-cheapo cables will do for every situation.
But the 3-6 ft Monster Cables really have little to no place for 99.9% of users; I'll agree on that.
 

505090

Distinguished
Sep 22, 2008


Those cards are using a proprietary variation of DVI which is then converted to HDMI, HDMI being the actual cable you are using. Sound is not part of the DVI standard and does not go over a DVI cable (or did I miss something?). Yes, they are using the connector on the card, but that is not the DVI standard. Just because I can run video to my TV using a piece of speaker wire does not make the piece of speaker wire an RCA cable.
 
The thing that seems to be the barrier here is that no one is saying audio over DVI is part of the official spec (just as many things on HDMI weren't until they became v1.1, 1.2, 1.3, etc.); however, you guys said:

505090:
"If you use spdif then the sound is not on the DVI cable and if you convert DVI to an HDMI cable then once again the sound is not on the DVI cable."

Which is plain WRONG. No one said it's wrong because of the spec; we said it's wrong because these cards USE the spec in their own way. Both the data pins and the blanking-interval data are supported by the spec, so how you exploit that is another story. Of course, audio is not specifically part of the DVI spec, but that's the same as 7.1 DTS-HD Master Audio not being part of the PCIe spec; you could still send it over the interface.
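To make the "interval data" point concrete: each video line includes a blanking gap after the active pixels, and extra packets can ride in that gap. A deliberately simplified toy sketch of the idea (not the real HDMI data-island format):

```python
# Toy model: a scan line has active video pixels plus a horizontal blanking
# period, and audio packets can be stuffed into the blanking gap.
H_ACTIVE, H_BLANK = 1920, 280          # CEA-861 1080p60 line: 2200 total

def pack_line(video_pixels, audio_packets):
    """Build one scan line: active video first, then audio packets
    in the blanking interval, padded with idle symbols."""
    assert len(video_pixels) == H_ACTIVE
    blank = list(audio_packets)[:H_BLANK]       # audio rides in blanking
    blank += ["IDLE"] * (H_BLANK - len(blank))  # pad the rest of the gap
    return video_pixels + blank

line = pack_line(["PX"] * H_ACTIVE, ["AUD0", "AUD1", "AUD2"])
print(len(line), line[H_ACTIVE:H_ACTIVE + 5])
# 2200 ['AUD0', 'AUD1', 'AUD2', 'IDLE', 'IDLE']
```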

And G, your error was similar in the statement regarding nVidia's updated solution we've discussed before, which you said requires the special adapters. I didn't disagree that ATi's is adapter-specific, and even nVidia's old implementation (really the AIBs' implementations) required specific hardware; however, that's not the universal case, nor does it change the fact that AUDIO can be carried over the DVI port/cable/etc.

That's the point: no one is saying it's part of the spec, which actually had nothing to do with the original discussion of audio or DVI in this or other threads. They are just saying it is possible, often in response to non-sequiturs about what can be done versus what was accounted for in the original spec.

That would be like saying general-purpose computing cannot be done on a graphics card because it is not accounted for in either the original OpenGL or DirectX specs. The idea of retasking hardware for non-traditional roles not provided for in the original spec is what gives us great solutions like audio over DVI for specialized setups.

Had either of you stuck to just the spec, that wouldn't have caused any issue; but saying it can't be done ignores that it IS already being done, and done rather well, too, on something never intended to carry audio, let alone 7.1 HD audio.
 

505090

Distinguished
Sep 22, 2008
Yes, the SPDIF comment was incorrect; I assumed a digital link, either Toslink or coaxial, was being used, but that is apparently not what they meant. My only point is that if I convert the DVI port on my computer to HDMI and then use an HDMI cable to go to my monitor, where it plugs into an HDMI port, I do not consider that to be a DVI connection. Just me, though.
 

bpogdowz

Distinguished
Oct 31, 2007
I posted this because I got a new LCD, smaller than my last one, and noticed that the graphics quality in games seemed more blocky, with the textures scrunched together in noticeable blocks. So I popped in the HDMI cable used with the old LCD and the blocky texture appearance is just gone. It's like night and day: HDMI > DVI.
 

Raviolissimo

Distinguished
Apr 29, 2006
http://en.wikipedia.org/wiki/HDMI

HDMI is running a 340 MHz digital signal, and it's a consumer connector (it needs to withstand minor consumer "oops" moments, e.g. you're positioning the TV and bump the cable into the wall with a really tight radius, stressing the connector and the cable).

That's not a trivial design, although given the manufacturing volume, you can make a high-quality connector in the $1-$5 range.
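To put that 340 MHz in perspective, here's a rough look at which modes fit under that clock, assuming ~20% blanking overhead (an approximation, not the exact CEA/CVT timings):

```python
# Approximate required pixel clock for a few modes vs. the 340 MHz TMDS limit.
# The 20% blanking overhead is a rough assumption; real timings vary.
MAX_CLOCK_MHZ = 340
BLANKING_OVERHEAD = 1.20

modes = [("1920x1080@60", 1920, 1080, 60),
         ("2560x1600@60", 2560, 1600, 60),
         ("2560x1600@75", 2560, 1600, 75)]

for name, w, h, hz in modes:
    clock_mhz = w * h * hz * BLANKING_OVERHEAD / 1e6
    verdict = "fits" if clock_mhz <= MAX_CLOCK_MHZ else "exceeds"
    print(f"{name}: ~{clock_mhz:.0f} MHz ({verdict} the 340 MHz limit)")
```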

Yes, Monster has mark-ups on their mark-ups. That they sue people for using the word 'monster' in a company name just adds insult to injury.

http://gizmodo.com/gadgets/hdmi-cable-battlemodo/the-truth-about-monster-cable-part-2-268788.php

There's an article about them; they did it right, with signal analyzers and so on.
 

team_a

Distinguished
May 11, 2009
Hi, I posted a similar question a few weeks ago. In theory, DVI = HDMI video-wise, with HDMI carrying audio as well.

I tested this with a graphics card I bought recently (a Sapphire ATI Radeon HD 4350):

1. DVI port connected to an HDMI port on my TV via a DVI-to-HDMI cable.

2. HDMI port connected to an HDMI port on my TV via an HDMI cable.

3. Both cables are from the same supplier and are gold-plated.

4. I switch channels to try to notice the difference.

I get a better picture with the DVI output, and the difference is noticeable.

I tried playing around with the resolutions again yesterday, but the HDMI picture does not improve. Any ideas?
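For reference, one way to sanity-check this is to compare the mode being sent against the native mode the TV reports in its EDID on the HDMI input. A rough Python sketch (the sysfs path below is just a Linux example; dump the EDID however your OS allows):

```python
# Parse the preferred (native) mode from a raw EDID dump by reading the
# first detailed timing descriptor (bytes 54-71 of the 128-byte base block).
def preferred_mode(edid: bytes):
    d = edid[54:72]                     # first detailed timing descriptor
    # Pixel clock is stored little-endian in units of 10 kHz.
    pixel_clock_mhz = int.from_bytes(d[0:2], "little") / 100
    # Active width/height: low 8 bits plus a high nibble in a shared byte.
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active, pixel_clock_mhz

# Example path only -- adjust to wherever your system exposes the EDID.
with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
    w, h, clk = preferred_mode(f.read())
print(f"Panel native mode: {w}x{h}, pixel clock {clk:.2f} MHz")
```

If the mode you're sending doesn't match that native mode, the TV is scaling the image, which can account for a soft or blocky picture on one input but not the other.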