DVI cable yields "no signal", but DSub with DVI converter works?


Aug 21, 2013
Hey all.

I have a brand new BenQ XL2420T monitor and an MSI Twin Frozr III GTX 560 Ti 448 GPU. When I connect the monitor to the GPU using the DVI cable included with the monitor, I get a "no signal" message, even though the monitor detects the cable.

However, when I connect a DSub cable into the monitor and then put a DVI converter on the other end and plug it into my GPU, I get a signal.

What's going on here? And am I losing any quality having to use the DSub cable? Will I still benefit from the 120Hz of my new monitor?

Thank you.



Most GPUs include an interface known as Dual-Link DVI-I for maximum compatibility. This is a combination of DVI-A and Dual-Link DVI-D. DVI-A signalling is electrically identical to VGA (DE-15), differing only in connector design. A passive adapter is all that stands between DVI-I and VGA, which is why those adapters are so abundant and included with almost every graphics card out there.

DVI-D is altogether different and uses differential digital signalling rather than single-ended analog signalling. Dual-Link DVI-D has twice as many differential pairs as Single-Link DVI-D, which provides twice as much bandwidth at any given pixel clock. This is necessary for high-resolution displays, high refresh rates, or colour depths beyond 24bpp.
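To see why the extra bandwidth matters here, you can estimate the pixel clock a mode requires and compare it against the 165 MHz limit of a single DVI link. A rough sketch in Python (the blanking figures below are assumed, reduced-blanking-style values, not the monitor's actual EDID timings):

```python
# Rough pixel-clock estimate for a video mode.
# h_blank / v_blank are assumed reduced-blanking values,
# not taken from the XL2420T's real EDID timings.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=41):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

SINGLE_LINK_MAX_MHZ = 165  # per-link limit in the DVI spec

for hz in (60, 120):
    clk = pixel_clock_mhz(1920, 1080, hz)
    ok = "fits" if clk <= SINGLE_LINK_MAX_MHZ else "exceeds"
    print(f"1080p@{hz}: ~{clk:.0f} MHz -> {ok} single-link")
```

With these assumptions, 1080p60 comes in around 140 MHz (fine over a single link), while 1080p120 needs roughly 280 MHz, which is why 120 Hz at 1920x1080 requires Dual-Link DVI-D.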

So, that single passive DVI-A -> VGA adapter switches both the GPU and the display over to analog signalling. Clearly, one of the digital transceivers isn't working, but the digital-to-analog converter on the GPU is working, as is the analog-to-digital converter on the display.