505090 :
If you use S/PDIF, then the sound is not on the DVI cable, and if you convert DVI to an HDMI cable, then once again the sound is not on the DVI cable. The reason you can convert DVI to HDMI is that it's the same video signal; HDMI is the upgraded DVI. HDMI added sound to DVI, and the next generation will include networking as well while using the current HDMI connector.
Hindesite :
Like others said, they're basically the exact same quality. The only difference is that HDMI can carry an audio signal while DVI cannot. That's more relevant for TVs, though, than it is for computer monitors.
Dudes, you can and have been able to carry audio over DVI for many years. The S/PDIF reference was not to an additional connector that carries the audio separately; it refers to the way ATi's and nVidia's chips handle the audio processing. The ATi chips use a protected internal path, while the nVidia solutions take an external S/PDIF input (either a two-wire connection from the sound card header, coax, or TosLink) and add it to the TMDS signal output on the DVI connector, where it can then go either directly to an adapter or even travel over a standard DVI cable/connector and then on to the adapter.
What matters is how people use the DVI standard, which has built-in flexibility for additional data channels, and ATi and nVidia exploit different techniques to send the audio through DVI. ATi uses the data channels, while nVidia inserts the audio signal between the video data; both methods are supported.
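To put some rough numbers on why squeezing audio in alongside the video is no problem at all, here's a back-of-the-envelope sketch in Python (my own figures, not from anyone in this thread). The 1080p60 timing numbers are the standard CEA-861 ones; the "usable payload bits per blanking pixel clock" figure is a deliberately conservative assumption, not a spec value.

```python
# Rough sketch: how much spare capacity the blanking intervals of a 1080p60
# signal leave for audio packets, versus what typical audio streams need.

PIXEL_CLOCK_HZ = 148_500_000      # standard 1080p60 pixel clock
TOTAL_PIXELS   = 2200 * 1125      # total pixels per frame, incl. blanking
ACTIVE_PIXELS  = 1920 * 1080      # visible pixels per frame
FRAME_RATE     = 60

blanking_clocks_per_sec = (TOTAL_PIXELS - ACTIVE_PIXELS) * FRAME_RATE

# Assumption: ~4 usable payload bits per pixel clock during blanking, a
# conservative figure meant to discount packet headers, guard bands and ECC.
PAYLOAD_BITS_PER_CLOCK = 4
audio_budget_bps = blanking_clocks_per_sec * PAYLOAD_BITS_PER_CLOCK

# Typical audio streams for comparison
spdif_stereo_bps = 2 * 48_000 * 16     # 2ch 48 kHz 16-bit PCM payload
pcm_8ch_hd_bps   = 8 * 192_000 * 24    # 8ch 192 kHz 24-bit PCM

print(f"blanking budget : {audio_budget_bps / 1e6:6.1f} Mbit/s")
print(f"S/PDIF stereo   : {spdif_stereo_bps / 1e6:6.2f} Mbit/s")
print(f"8ch 192k/24 PCM : {pcm_8ch_hd_bps / 1e6:6.1f} Mbit/s")
```

Even with that conservative assumption the blanking budget works out to roughly 96 Mbit/s, which dwarfs a 1.5 Mbit/s S/PDIF stereo stream and still leaves plenty of headroom for 8-channel high-resolution PCM, so there's nothing exotic about carrying audio on the same link.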
Seriously, have you guys missed the whole HD2K, 3K, and 4K generation of graphics cards, and nV's high-end G200 cards? This isn't new.
Speaking of not new.....
gamerk316 :
As for DVI audio, the audio is carried from the GPU using a custom DVI-HDMI converter; using a standard converter will not give the audio signal.
I told you this last time: nVidia does NOT need a special converter, only ATi does, because of the way they send the signal, and you can even do it over a standard cable and a generic adapter. Have you ever even tried these solutions before commenting on them? You should research it before posting another "I'm guessing", as you've done so many times, including in the previous thread on this subject.
Anywhoo...... the most important difference between DVI and HDMI is the higher bit-depth support in the HDMI 1.3 spec: 'deep color' adds support for 12-bit and 16-bit per channel colour, whereas DVI is limited by spec to 8-bit and 10-bit (10-bit being the low end of deep color), both of which HDMI also supports.
Of course, just as ATi and nV tweaked the standard DVI interface, you can send 12-bit per channel colour over DVI; that's already done by dedicated hardware that supports it, like some of Sony's broadcast gear.
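To show why deep color is where the specs actually diverge, here's a quick Python sketch (again my own numbers, not anyone's post). It scales the standard 1080p60 pixel clock by bit depth and compares the result against the single-link DVI limit (165 MHz) and the HDMI 1.3 limit (340 MHz); dual-link DVI is ignored to keep it simple.

```python
# Rough sketch: required link clock at 1080p60 for various bit depths,
# compared against single-link DVI and HDMI 1.3 spec limits.

BASE_PIXEL_CLOCK_MHZ = 148.5   # 1080p60 at the 8 bits-per-channel baseline
SINGLE_LINK_DVI_MHZ  = 165.0   # single-link DVI spec maximum
HDMI_1_3_MHZ         = 340.0   # HDMI 1.3 spec maximum

for bpc in (8, 10, 12, 16):
    # Deeper color packs more bits per pixel, so the required link clock
    # scales proportionally against the 8 bpc baseline.
    required  = BASE_PIXEL_CLOCK_MHZ * bpc / 8
    fits_dvi  = "yes" if required <= SINGLE_LINK_DVI_MHZ else "no"
    fits_hdmi = "yes" if required <= HDMI_1_3_MHZ else "no"
    print(f"{bpc:2d} bpc -> {required:6.1f} MHz  "
          f"single-link DVI: {fits_dvi:3s}  HDMI 1.3: {fits_hdmi}")
```

At 1080p60, 8 bpc (148.5 MHz) fits comfortably on single-link DVI, but 10 bpc (~185.6 MHz), 12 bpc (~222.8 MHz), and 16 bpc (297 MHz) all blow past the 165 MHz single-link limit while still fitting within HDMI 1.3's 340 MHz, which is exactly why deep color is pitched as an HDMI feature rather than a DVI one.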