Will HDMI to DVI-D adapter affect image quality?

Sparktown

Distinguished
Jan 28, 2015
I've been buying parts for 3 months, but I finally finished my first build. Unfortunately, I made one screw up with cable connections:

My Monitor (Gateway FHX2303L 1080p): DVI-D & VGA connections
My Motherboard (Asrock Z170 ProS): HDMI, DVI-D
My GPU (EVGA FTW Edition Nvidia 950 GTX ): DVI-I, HDMI (2.0), DisplayPort

Basically, I didn't realize the DVI port on my GPU was DVI-I, not DVI-D. It's too late to return anything now. So, I was thinking about getting an HDMI cable and an HDMI to DVI-D adapter from Monoprice. That way I could connect to my monitor through either my mobo or my GPU.

Will this be okay for my setup? Will I have any weird image quality issues? Will the image noticeably degrade at all?

What about an HDMI to DVI-D cable? Is that better, or just the same?

Alternatively, I suppose I could get a VGA to DVI-I adapter and use one of the VGA cords I have lying around. However, I get the sense that would give me worse image quality. Am I right about that?

Thanks!
 
From what I understand (though I've never used one), those adapters aren't as stable or as good as the old DVI-to-VGA adapters were, and they can be flaky.

It may be best to invest in a hard-wired cable like the one I use and be done with it.

Example of the cable:
http://www.amazon.com/Gefen-HDMI-Cable-feet-Male-Male/dp/B0002CZHN6?tag=viglink20237-20

Now, DVI-I supports an analog signal (it has 4 extra pins around the flat blade); DVI-D is digital only (no 4 pins around the flat blade).

A DVI-I female port will accept a DVI-D male connector, but not the other way around, and the analog signal is not carried.

Up to 1080p, I doubt you will see any image quality suffering.

 


Oh wow. So I can just get a DVI-D cable and plug one of the male DVI-D ends into my GPU's female DVI-I socket? That makes things a lot easier.

Is the image quality the same as it would be with a native DVI-D port?
 
DVI-I means DVI integrated - there are extra pins that carry an analog signal so that people can use one of those DVI->VGA passthrough adapters when connecting older monitors that only have a VGA port. DVI-I also carries a digital signal that is used by monitors that have DVI-D ports. So DVI-I can work with both VGA monitors (using a passthrough adapter) and also on DVI monitors (using a DVI-D cable). Your monitor has a DVI-D port and you want to use it because it is digital and therefore doesn't suffer from signal degradation that the older analog VGA ports suffer from. When you connect a DVI-D cable to your graphics card's DVI-I port, only the digital signal is used - and yes that will provide the ideal image quality for your monitor. So all you really need is a DVI-D cable (both ends of the cable are male) to plug into your female DVI-I port on the card and the female DVI-D port on the monitor. Does it make sense now?

EDIT: And BTW you might see "dual-link" DVI-D cables - those are for monitors with resolutions higher than 1080p. You only need a "single-link" DVI-D cable because your monitor's max resolution is 1080p.
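To put rough numbers on that single-link vs. dual-link point: single-link DVI tops out at a 165 MHz pixel clock, and 1080p at 60 Hz with the standard CEA-861 blanking intervals (2200 × 1125 total pixels per frame) only needs 148.5 MHz. A quick back-of-the-envelope check in Python (the timing figures are the published standard values, not something from this thread):

```python
# Single-link DVI carries at most a 165 MHz pixel clock per the DVI spec.
SINGLE_LINK_MAX_HZ = 165e6

# 1080p60 total frame size including blanking (CEA-861 standard timing).
h_total = 2200      # 1920 active pixels + horizontal blanking
v_total = 1125      # 1080 active lines + vertical blanking
refresh_hz = 60

pixel_clock_hz = h_total * v_total * refresh_hz
print(f"1080p60 pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")  # 148.5 MHz

# Comfortably within single-link DVI's limit.
assert pixel_clock_hz <= SINGLE_LINK_MAX_HZ
```

So a single-link cable has headroom to spare at 1080p60; dual-link only matters for 2560×1440 / 2560×1600 class monitors or high refresh rates.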
 
Solution


Thank you so much for the clear, in-depth explanation. It really helped! I ended up getting a 10ft dual-link DVI-D cable from Newegg for $7.99. 10ft is a bit much, but I need at least 6ft, and it was a better deal than the 6ft cable after shipping and taxes.

On a side question: do you know if a longer cable will adversely affect response time (especially for Wacom tablet drawing and gaming)?

I figured it wouldn't. It sounds like 15ft is where people start running into display problems. However, that was all from people talking about HD video playback, not keyboard/mouse/tablet responsiveness. So, I'd be interested to hear thoughts on this.
 
Electrical signals travel at the speed of their electromagnetic wave, which depends on the cable's materials and construction and varies a lot; with the cables we actually use, they propagate at somewhere between roughly 40% and 99% of the speed of light. In other words, you aren't going to notice any input lag from a 10ft DVI cable, because you're human and simply can't perceive a few nanoseconds of difference. There's a lot more going on than just an electrical signal, but it's really your computer system and the monitor itself that are the only meaningful factors in any perceived input lag, definitely not the DVI cable.

Here's some fun reading: https://en.wikipedia.org/wiki/Speed_of_electricity
And: https://en.wikipedia.org/wiki/Velocity_factor#Typical_velocity_factors
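To put a rough number on that, here's a minimal sketch of the propagation-delay arithmetic, assuming a typical velocity factor of about 0.7 for a copper cable (actual values vary with the cable's construction):

```python
SPEED_OF_LIGHT = 299_792_458     # m/s, in vacuum
VELOCITY_FACTOR = 0.7            # assumed typical value for copper cable
CABLE_LENGTH_M = 10 * 0.3048     # 10 ft converted to metres

# One-way propagation delay through the cable.
delay_s = CABLE_LENGTH_M / (VELOCITY_FACTOR * SPEED_OF_LIGHT)
print(f"Propagation delay: {delay_s * 1e9:.1f} ns")  # ~14.5 ns
```

Around 15 nanoseconds, versus monitor processing and pixel response that are measured in milliseconds, i.e. a factor of hundreds of thousands larger.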
 


Thanks. Another very clear, detailed and helpful response. I'll check out those links too.