Have a samsung 24" High Def TV/Monitor (1920 x 1200). Tried both, Used a DVI to HDMI cable and a DVI cable. Didn't notice any diff in picture quality. The only diff was that with the DVI -> HDMI cable was that sound was enabled on monitor speakers without using a audio cable.
 

505090

Distinguished
Sep 22, 2008
1,575
0
19,860
It's the same video signal on both. HDMI adds sound, and the new HDMI spec will add Ethernet when those cables come out. Unlike old analog cables, both of these carry digital signals, which means the cheap cables are just as good as the expensive ones.
 
^^ QFT

DVI and HDMI carry the same video signal.
HDMI can also carry sound (up to 7.1 lossless)
All signals are digital, so a $0.99 wire will give the same quality as a $49.99 one (I'm looking at you, Monster Cables)
 

AKM880

Distinguished
Apr 16, 2009
2,330
0
19,810
DVI can carry sound on some GPUs. If it's an Nvidia card you'll need an S/PDIF input on the GPU; if it's ATI you can just use the DVI-to-HDMI adapter.
 
Concur with jaguarskx on quality of parts.
I'd like to correct a couple of misconceptions.
1. "Cheap cables are as good as expensive cables." Unlike my son, who sent me a $120 HDMI cable to check out an HDMI problem, I tend to use mid-priced cables. There is a difference.

2. "A 99-cent piece of wire will work because it's digital" - simply not true.

A digital signal is made up of square waves. The leading and lagging edges represent very high frequencies (far higher than the 6 MHz bandwidth of analog TV) and are easily distorted. The factors that affect this, and the cost, are primarily shielding, which affects inter-wire capacitance, and the RLC value of the signal cable. As you move up from 480i to 1080p these factors become more critical.
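
For anyone who wants rough numbers behind that, here's a quick back-of-the-envelope sketch in Python. The 148.5 MHz pixel clock for 1080p60 and the 0.35/rise-time rule of thumb are standard figures; the 100 ps edge time is just an assumed, illustrative value, not a measured spec.

```python
# Back-of-the-envelope: why the edges of a DVI/HDMI signal are "very high frequency"
# compared to the 6 MHz bandwidth of analog TV.

pixel_clock_hz = 148.5e6                 # 1080p60 pixel clock
tmds_bit_rate = pixel_clock_hz * 10      # each TMDS channel sends 10 bits per pixel clock
bit_period_s = 1 / tmds_bit_rate

rise_time_s = 100e-12                    # assumed edge rise time (illustrative only)
knee_freq_hz = 0.35 / rise_time_s        # rule of thumb for the bandwidth an edge needs

print(f"TMDS bit rate per channel : {tmds_bit_rate / 1e9:.3f} Gbit/s")
print(f"Bit period                : {bit_period_s * 1e12:.0f} ps")
print(f"~Bandwidth needed by edges: {knee_freq_hz / 1e9:.1f} GHz (vs 6 MHz for analog TV)")
```

That works out to roughly 1.5 Gbit/s per channel with edge content in the GHz range, which is why cable capacitance and shielding matter at all.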

I repeat: I'm not an advocate for overpriced, expensive HDMI/DVI cables, but I do shy away from those "el-cheapo" ones.
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810


You'd have to have a ridiculously crappy cable to get bit loss from noise and distortion, but you are correct. Really though, if you buy your standard DVI cable from the computer store for 10 dollars, you won't notice a difference over a gold-plated one for 60. If you are using several unshielded wires you found on an old phone to go from pin to pin on the GPU to the monitor, you might have more than just a cabling problem in your life ;)

I think everyone knows there is a point where cheap is not ideal, even in digital. What really drives me nuts are places like Best Buy, where they tell people they shouldn't use the cables that come with the TV and should use the $150 Monster Cable HDMI instead, or else the picture quality won't be as good.
 

505090

Distinguished
Sep 22, 2008
1,575
0
19,860


If you use S/PDIF then the sound is not on the DVI cable, and if you convert DVI to an HDMI cable then once again the sound is not on the DVI cable. The reason you can convert DVI to HDMI is that it's the same video signal; HDMI is the upgraded DVI. HDMI added sound to DVI, and the next generation will include networking as well while using the current HDMI connector.

And yes, do not get the ghetto cable that cracks when you try to install it, but anything of decent build will suffice. I lost the link, but testing has been done and there was no difference between digital cables.
 


I heard that Monster Cable is notorious for ripping people off, and for suing any other company with the word "Monster" in their name...
 

505090

Distinguished
Sep 22, 2008
1,575
0
19,860
All of the cable companies rip people off. Wholesale on an $80-$120 cable is only $7-$20. I've done jobs where we made more money on the cables than on the 50-inch TV or the 7-channel receiver.

FYI
We are required to sell at their prices or we lose our contract and have to pay as much as the rest of the world.
 

eklipz330

Distinguished
Jul 7, 2008
3,033
19
20,795
Monster cables are mega garbage; it's almost like the Bose brand... they both perform only as well as their half-the-price counterparts.

their cables are strong though, but not worth the premium

Anyway, HDMI is basically DVI with the ability to transfer sound, and the funny thing is there are audio DVI cables out there.
 

edeawillrule

Distinguished
Dec 15, 2008
627
0
19,010
The gold-plated ones are supposedly resistant to the oxidation that usually takes place over time with plain-metal cables. The HDMI cables I usually use are the $32 6' gold-plated Vizio HDMI 1.3b ones from Walmart lol (that's a good price, isn't it?)
 


Not really. 0 or 1, high or low; those are your only choices.

[Image: digital signal waveform showing (1) low level, (2) high level, (3) rising edge, and (4) falling edge. Source: Wikipedia.org]

Sure, signal corruption can occur, but suffice it to say, it's not that likely, as there are only two possible states the signal can ever be in. An analog signal would have interference issues, though...


As for DVI audio, the audio is carried from the GPU using a custom DVI-to-HDMI converter; using a standard converter will not give you the audio signal. I'm guessing NVIDIA/ATI use a slightly different DVI pin layout (an extra pin or two) to carry the audio signal when one of their converters is used. Standard DVI implementations do not carry an audio signal, however, and most devices wouldn't know what to do with an audio signal over DVI even if you managed to carry one.
 

Hindesite

Distinguished
Apr 8, 2009
67
0
18,630
Like others said, they're basically the exact same quality. The only difference is that HDMI can carry an audio signal while DVI cannot. That's more for TVs, though, than it is for computer monitors.
 

Tell me about it. My monitor actually has an optical digital output (for audio over HDMI), but I found out after the fact that DD/DTS signals would not carry properly, so I needed an optical digital switch as well as an HDMI switch...
 
I knew I would get some negative feedback; I had planned on keeping my fingers away from the keyboard. He__, I'll probably catch more.

I am NOT saying cheap HDMI cables will not work, and I did not recommend the very expensive/overpriced cables. However, I stand by what I said: there is a difference, and as the old saying goes, you get what you pay for.

Gamerk316
You gave a very good example and explanation, and from a very good source. The waveform you illustrated would indeed be fine. The problem is that the distortion is not trapezoidal. The leading/lagging edges have a slope that looks like an RC charge/discharge curve, and your waveform does not show "no man's land" (old definition: roughly any voltage between 1 V and 4.75 V for TTL logic). What happens is that, depending on the pulse width (PW) and the RLC value, the rising edge may not rise high enough to register a "1", nor may the falling edge fall far enough to provide a "0". This occurs when the PW is <= 4 time constants (TCs).
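
To put some numbers on that "PW <= 4 TCs" point, here's a small sketch using the standard RC charging formula V(t) = Vcc * (1 - e^(-t/tau)) and the roughly 4.75 V "high" threshold mentioned above. The 5 V supply and the threshold come from the post; the rest is just illustration.

```python
import math

# RC charging curve: V(t) = Vcc * (1 - exp(-t / tau)), tau = one time constant (TC).
# Using the ~4.75 V "high" threshold from the post (old-style 5 V TTL levels).
VCC = 5.0
V_HIGH = 4.75   # voltage a rising edge must reach to register as a "1"

def edge_voltage(pulse_width_in_tc: float) -> float:
    """Voltage reached by a rising edge after pulse_width_in_tc time constants."""
    return VCC * (1 - math.exp(-pulse_width_in_tc))

for n_tc in (1, 2, 3, 4, 5):
    v = edge_voltage(n_tc)
    status = "registers as 1" if v >= V_HIGH else "stuck in no man's land"
    print(f"PW = {n_tc} TC: edge reaches {v:.2f} V -> {status}")
```

At one or two time constants the edge never gets out of no man's land, and around three to four it is only just clearing the threshold, which is exactly the margin being described here.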

edeawillrule
"The gold plated ones are supposedly resistant to oxidization..." Take out the word "supposedly" and you win the "gold" medal. The reason for using gold is NOT that it is a good conductor; in fact it is a third-rate conductor (Ag and Cu being better). But gold is very slow to oxidize, and metal oxides are insulators.
A side benefit of gold is that it is a "slippery" metal, with a low coefficient of friction compared to, say, copper on copper, which allows for easier mates/demates on tight connections.
Tinned contacts are just as good for 1 to 5 years, depending on environmental conditions and whether both contacts are of the same metal or dissimilar metals; but at some point they will oxidize and create problems when working with high frequencies/square waves.

My computer knowledge may be average, but my electronics background is a mite on the high side.
 




Dudes, you can carry audio over DVI, and have been able to for many years. The S/PDIF reference was not to an additional connector carrying the audio separately; it refers to the way ATi's and nV's chips handle the audio processing. The ATI chips use a protected internal path, while the nVidia solutions take an external S/PDIF input (either a 2-wire lead from the audio card header, coax, or TosLink) and add it to the TMDS signal output on the DVI connector, where it can then be sent either directly to an adapter or even carried over a DVI cable/connector and then to the adapter.

What matters is how people use the DVI standard, which has built-in flexibility for additional data channels, and both ATi and nVidia exploit different techniques to send the audio through DVI. ATI uses the data channels; nVidia inserts the audio signal between the video signal. Both methods are supported.

Seriously, have you guys missed the whole HD2K, 3K, and 4K generation of graphics cards, and nV's high-end G200 cards? This isn't new.

Speaking of not new.....



I told you this last time, nVidia does NOT need a special converter, only ATi does, because of the way they send the signal, and you can even do it over a standard cable and generic adapter. Have you ever even tried these solutions before commenting on them? You should research it before posting again saying "I'm guessing" just like so many times you post, like in the previous thread on the subject.

Anywhoo...... the most important difference between DVI and HDMI is the higher bit-depth support in the spec for HDMI 1.3 with "deep color" and its support for 12-bit and 16-bit-per-channel colour, whereas DVI is limited by spec to 8 and 10 bit (10-bit being the low end of deep color), both of which are also supported by HDMI.

Of course, just as ATi and nV tweaked the standard DVI interface, you can send 12-bit-per-channel colour over DVI, which is already done for dedicated hardware that supports it, like some of SONY's broadcast gear.
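
To give a feel for what that extra bit depth costs in raw bandwidth, here's a quick sketch for 1080p60 (148.5 MHz pixel clock). It only counts the raw pixel payload and ignores TMDS encoding overhead, blanking, audio, and control data.

```python
# Raw video payload for 1080p60 at different per-channel bit depths.
# Ignores TMDS encoding overhead, blanking, audio and control data.

pixel_clock_hz = 148.5e6   # 1080p60 pixel clock

for bits_per_channel in (8, 10, 12, 16):
    bits_per_pixel = bits_per_channel * 3          # R, G, B
    payload_bps = pixel_clock_hz * bits_per_pixel
    print(f"{bits_per_channel:>2} bpc -> {bits_per_pixel} bpp -> "
          f"{payload_bps / 1e9:.2f} Gbit/s raw video payload")
```

Going from 8 to 12 bits per channel is a 50% jump in data over the same link, which is why deep color sits in the newer HDMI spec rather than in plain DVI.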
 


Don't worry about Gk316, he gets more wrong than he gets right, and using Wiki as a justifier doesn't help.

Trying to explain the analogue properties of signal transmission would be lost on him because "it's digital, it's all digital, end to end", as if resistance weren't a physical property with a distinctly analogue effect on a signal. The same goes for propagation: regardless of the signal type or source, it travels through an analogue medium of some kind, so distance plays a major role, and the signal can be affected by the quality of the medium over that distance.

Like you I don't recommend high quality cables for the majority of people, but if they're sending HDMI over distances greater than 25ft, then it starts to make a difference, and over 50ft you don't just get small artifacts like sparkles, you'll often see major issues or even no picture due to poor signal quality.
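
A crude way to see why length matters: cable loss is usually quoted in dB per metre at the frequencies the TMDS signal needs, and it adds up with length. The 0.5 dB/m figure in the sketch below is purely an assumed, illustrative number, not the spec of any real cable.

```python
# Illustrative only: how cable loss scales with length.
# loss_db_per_m is an assumed placeholder value, not a real cable spec.

loss_db_per_m = 0.5   # assumed attenuation at TMDS frequencies (illustrative)

for length_ft in (6, 25, 50):
    length_m = length_ft * 0.3048
    total_loss_db = loss_db_per_m * length_m
    remaining_fraction = 10 ** (-total_loss_db / 20)   # fraction of the original signal swing
    print(f"{length_ft:>2} ft: {total_loss_db:4.1f} dB loss -> "
          f"{remaining_fraction * 100:3.0f}% of the original amplitude left")
```

Double the length and the dB loss doubles, which is why a run that is merely marginal at 25 ft can turn into sparkles or a blank screen at 50 ft.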
 


What you are arguing is that the cable itself could have a noticeable effect on a data stream consisting of nothing but 0's and 1's, which is not the case. Sure, beyond a certain distance, thanks in part to signal strength, data will end up getting lost (resulting in blank or unchanged pixels); that's part of resistance, and I'm not arguing that point. I am arguing that, distance restrictions aside, a cheap HDMI cable will almost never corrupt a signal to the point of noticeable effect. Assuming every HDMI cable is made to the HDMI spec, there is minimal chance data corruption will occur on a standard 25 ft cable (hence why most cables are limited to 25 ft).
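
As a rough sketch of what "noticeable" would even take: the bit-error rates below are assumed, illustrative values, and the 1080p60 payload figure just follows from the 148.5 MHz pixel clock at 24 bits per pixel.

```python
# How often would a flipped bit actually show up, for a couple of assumed bit-error rates?
# The BER values are illustrative, not measurements of any particular cable.

pixel_clock_hz = 148.5e6
payload_bps = pixel_clock_hz * 24            # 1080p60 at 8 bits per channel
pixels_per_second = 1920 * 1080 * 60

for ber in (1e-9, 1e-12):
    errors_per_second = payload_bps * ber
    if errors_per_second >= 1:
        print(f"BER {ber:.0e}: ~{errors_per_second:.1f} flipped bits per second "
              f"out of ~{pixels_per_second / 1e6:.0f} million pixels per second")
    else:
        print(f"BER {ber:.0e}: one flipped bit roughly every "
              f"{1 / errors_per_second:.0f} seconds")
```

Even the pessimistic rate only touches a handful of pixels per second out of well over a hundred million, which is why a sane cable at normal lengths looks perfect.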

If that were the case, with the setup I have (going through two separate HDMI switches, with signal boosters, obviously), I would be seeing/hearing plenty of artifacts, but that simply hasn't occurred.
 

505090

Distinguished
Sep 22, 2008
1,575
0
19,860


I could be confused, seeing as I haven't used a monitor with integrated speakers since my 486, but are you referring to those cards that come with a DVI-to-HDMI adapter?