Wow, you're really not having much luck getting answers to your posts.
This one is fairly straightforward though, so I'm happy to chip in. The only difference between "I" and "D" is that "I" connectors have four additional pins that carry an analogue signal. The diagrams under the "Connector" section on Wikipedia are helpful:
https://en.wikipedia.org/wiki/Digital_Visual_Interface
You can see the four extra pins that straddle the horizontal bar on the left of the connector in those diagrams.
The beauty of this design is that those four analogue pins can simply be wired straight to the corresponding pins on a VGA connector and everything just works; there's no signal conversion involved. That's what allows a simple, passive (which means cheap) adapter to turn a DVI-I port into a VGA output.
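To make that concrete, here's a rough sketch of the straight-through wiring such a passive adapter uses. The pin numbers come from the standard DVI-I and VGA (DE-15) pinouts (the Wikipedia page above has both), but treat this as an illustration rather than a wiring guide and double-check against an actual pinout before trusting it:

```python
# Approximate straight-through wiring of a passive DVI-I -> VGA adapter.
# Pin numbers are taken from the common published DVI-I and VGA (DE-15)
# pinouts; this is an illustration, not a definitive wiring reference.
DVI_I_TO_VGA = {
    "C1 (analogue red)":             "1 (red)",
    "C2 (analogue green)":           "2 (green)",
    "C3 (analogue blue)":            "3 (blue)",
    "C4 (analogue horizontal sync)": "13 (horizontal sync)",
    "8  (analogue vertical sync)":   "14 (vertical sync)",
    "C5 (analogue ground)":          "6/7/8 (R/G/B returns)",
    "6  (DDC clock)":                "15 (DDC clock)",
    "7  (DDC data)":                 "12 (DDC data)",
}

if __name__ == "__main__":
    # Print the mapping as a simple table -- every connection is just a
    # plain wire, which is why the adapter needs no electronics at all.
    for dvi_pin, vga_pin in DVI_I_TO_VGA.items():
        print(f"DVI-I pin {dvi_pin:<32} -> VGA pin {vga_pin}")
```

The sync and DDC lines are passed through in the same way, so the PC can still read the monitor's identification data; there's still no active circuitry anywhere in the adapter.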
I might be wrong here, but I don't actually believe there is such a thing as a "DVI-I" cable (either dual or single link). The whole point of the DVI-I output is that it's more flexible. It lets you send a digital signal over a DVI cable to a monitor that supports it. Or, if you have an older monitor or projector that only has a VGA input, you can use a cheap passive DVI-to-VGA adapter on the same port. In other words, it removes the need for separate VGA and DVI ports. My understanding is that this flexibility helped drive the adoption of DVI on computers and graphics cards: manufacturers could include the (then) new DVI port but bundle an extremely cheap adapter so that people without DVI-capable monitors could still connect their displays.
There isn't really any point in having a DVI-I cable anyway, as the monitor receiving the signal has no use for both a digital and an analogue signal at once. If it has a DVI input, it'll use the digital signal, which makes the analogue signal entirely redundant.