Question: Connecting a monitor with a DVI or VGA cable

Weathered

Reputable
Aug 2, 2017
I have a 24" 1080p monitor that I currently use on my main desktop, hooked up over HDMI. I was going to do a little switching around and hook it up to a different desktop. The monitor has DVI, VGA, and HDMI ports.

I tried hooking it up using DVI, and also VGA with a DVI adapter (the GPU doesn't have VGA), and it would not work; the monitor says it can't find an HDMI signal. It does work over HDMI. I'm trying to figure out why it won't work with anything other than HDMI. With it hooked up over HDMI, I went into the monitor menu and switched the input to DVI, but that didn't work either. And I can't get to the monitor menu unless I use HDMI and get the computer display to come up.

Am I wrong in thinking that monitors should auto-detect what you are connecting? The monitor is an Asus VH242H.
 

Ralston18

Titan
Moderator
Hopefully just a matter of getting into the monitor menus and selecting HDMI.

Asus User Manual Link

https://www.asus.com/me-en/Commercial-Monitors/VH242H/HelpDesk_Manual/

Zip File

Page 1-4 explains how to access the monitor menu.

I took a look and Section 3.1 provides instructions regarding the use of OSD (On Screen Display).

Page 3-3 explains input selection with a caveat "only for some models".

Note that the manual dates from October 2008.

Consider the connection to the second desktop as a completely new install. Start from the beginning.
 

Weathered

Reputable
Aug 2, 2017
Ralston18 said:
Hopefully just a matter of getting into the monitor menus and selecting HDMI. [...] Consider the connection to the second desktop as a completely new install. Start from the beginning.
On the second PC it works using HDMI, since that is what was being used on the first computer and HDMI is selected in the monitor menu.

The issue is not being able to use it with a VGA or DVI cable. It works when hooked up over HDMI, and that is the only way I can get into the monitor menu. I went into the monitor menu and selected DVI, unhooked HDMI and hooked up DVI, and got no display. But it instantly works with HDMI hooked back up.

I have read through that manual. I can access the monitor menu fine and know how to get to the input selection screen if I can get the desktop display to come up, but not otherwise. I'm not sure if I should be able to access the monitor menu without having a working PC display; I did not see anything in the manual about that.

Maybe I am going through this trouble for nothing, since it works on HDMI. I could just use HDMI and be done with it. But I am curious why it wouldn't work on VGA or DVI, in case something comes up where I can't use HDMI for whatever reason and have to use DVI or VGA.
 

Ralston18

Titan
Moderator
You should be able to directly access the VH242H menu screens via the monitor's control panel buttons.

Just to be able to change the video input port if using VGA or DVI becomes necessary.

(And remember that there are different types of DVI cables.)

There should also be a reset option to restore the monitor to factory settings - but I would leave that reset alone for the time being.

= = = =

You may need to have older drivers to go "back" to VGA or DVI.

Try downloading the applicable drivers via the Asus website:

https://www.asus.com/support

I made a quick search and found Windows 7 drivers (32 and 64 bit) for the monitor. Likely not applicable.

You may be more successful because you have more details about the VH242H available to help the search.
 

geofelt

Titan
Your monitor has a DVI-D (dual link) input.
They look like this:
https://commons.wikimedia.org/wiki/File:DVI_Connector_Types.svg

Look at your cable's source and target pin layout.
The ones with the four pins around the flat horizontal blade carry analog output.
It is also possible that the DVI cable is defective.

I think monitor drivers are mostly for documentation and have no functional value (I could be wrong here).
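Since the analog-pin question is the crux of the passive-adapter problem, here is a minimal sketch (the connector names are the standard DVI variants; the helper function itself is just for illustration) of which DVI types can feed a passive DVI-to-VGA adapter:

```python
# Which DVI connector variants carry an analog signal. A passive DVI->VGA
# adapter just rewires pins, so it only works when the source actually
# outputs analog on the DVI-I/DVI-A pins around the flat blade.
DVI_ANALOG = {
    "DVI-D single link": False,
    "DVI-D dual link": False,
    "DVI-I single link": True,
    "DVI-I dual link": True,
    "DVI-A": True,
}

def passive_vga_adapter_works(connector: str) -> bool:
    """True if a passive DVI-to-VGA adapter can get a picture from this port."""
    return DVI_ANALOG[connector]
```

So with a DVI-D cable (or a DVI-D-only GPU port), a passive VGA adapter can never work; an active converter would be needed.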
 

Satan-IR

Distinguished
Ambassador
Does it not work at all unless you connect with HDMI and change the source? I mean, if you connect through the DVI or VGA port on the monitor and boot/start Windows or whatever OS you have, it doesn't work either?

If it only doesn't work when it's HDMI first, and then you unhook that and connect DVI or VGA while it's on, it might be because there is sometimes a problem with hot plug detection on VGA. Hot plug detection is supported for DVI, HDMI, and DisplayPort. EDID communication problems also happen on some devices (displays), because of inconsistencies in the implementation of HPD (hot plug detection) signaling between devices from different manufacturers.

To answer the question in your first post: EDID is how the monitor/display/TV talks to the source. It is a standardized way for a display to communicate its capabilities to a source device. From version 1.3, EDID provides the monitor name, ID, model, serial number, display size, aspect ratio, horizontal scanning frequency limits, vertical frequency limits, maximum resolution, gamma, and supported video resolutions. If I'm not mistaken, it was actually first developed for use between analog computer-video devices with VGA ports; it's now implemented for HDMI, DVI, and DisplayPort.
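As a rough illustration of what the display sends over that channel, here is a sketch that decodes a few fixed fields of the VESA EDID base block (the offsets are from the standard layout; the sample bytes below are made up for the example, not dumped from a VH242H, and "AUS" is just an arbitrary manufacturer code here):

```python
def parse_edid_header(edid: bytes):
    """Decode a few fixed fields at the start of an EDID base block.

    Offsets follow the VESA E-EDID layout: an 8-byte header magic,
    the manufacturer ID packed as three 5-bit letters in bytes 8-9
    (big-endian), and the EDID version/revision in bytes 18-19.
    """
    magic = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
    assert edid[0:8] == magic, "not a valid EDID block"
    word = (edid[8] << 8) | edid[9]
    # Three 5-bit fields, each 1..26 mapping to 'A'..'Z'.
    mfg = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    version = f"{edid[18]}.{edid[19]}"
    return mfg, version

# Hypothetical 20-byte fragment: magic, 0x06B3 encodes the letters "AUS",
# then 8 unused bytes, then version 1, revision 3.
sample = (bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x06, 0xB3])
          + bytes(8) + bytes([0x01, 0x03]))
mfg, ver = parse_edid_header(sample)   # -> ("AUS", "1.3")
```

If the source never receives (or mis-reads) this block, it can refuse to drive the input at all, which matches the "no signal" behavior described above.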
 

Weathered

Reputable
Aug 2, 2017
geofelt said:
Look at your cable's source and target pin layout. The ones with the four pins around the flat horizontal blade carry analog output. It is also possible that the DVI cable is defective.
I have 2 DVI cables and both are DVI-D single link. Neither has the 4 pins around the horizontal blade.

Satan-IR said:
Does it not work at all unless you connect with HDMI and change the source? [...] EDID is how the monitor/display/TV talks to the source.
Even when I changed the input source on the monitor to DVI while it was hooked up through HDMI, it still didn't work when I switched the cable to DVI.
 

geofelt

Titan
Look at your monitor's DVI input.
Does it have 24 pins? In that case it requires a dual link DVI cable.
The graphics card outputs on all 24 pins, and I suspect that is what your monitor needs.
Try a DVI-D dual link cable with all the needed pins.
 

Satan-IR

Distinguished
Ambassador
Weathered said:
Even when I changed the input source on the monitor to DVI while it was hooked up through HDMI, it still didn't work when I switched the cable to DVI.
Well, that's what I meant to ask. What happens if you hook the PC to the DVI port of the monitor and then boot? From your posts it seems you only tried changing ports and cables from HDMI to DVI (or VGA?) while the PC was on. Although if the situation is a passive DVI-VGA adapter and no analog signal in the cable, that wouldn't make any difference.

Where does the adapter you mentioned go?

Weathered said:
I have 2 DVI cables and both are DVI-D single link. Neither has the 4 pins around the horizontal blade.

So it's a DVI port on the graphics card and a single link DVI-D cable. If it doesn't have the 4 analog pins, as geofelt said above, it wouldn't carry an analog signal, and a DVI to VGA adapter wouldn't work unless it was an active one. A dual link DVI-D port (on both ends) would theoretically work with a single link DVI-D cable too, unless the resolution and refresh rate are higher than a single link can handle.

Try setting the display settings on the PC to full HD at 60 Hz (if they are higher now) and see if your single link DVI-D cable works or not. If not, as said above, you might need to get a dual link DVI-D cable.
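For reference, the reason 1080p at 60 Hz should be fine on single link: single-link DVI tops out at a 165 MHz pixel clock, and the standard CEA-861 timing for 1920x1080 @ 60 Hz (2200x1125 total, including blanking) needs only 148.5 MHz. A quick back-of-the-envelope check:

```python
# Single-link DVI is limited to a 165 MHz pixel clock; a mode needs dual
# link only if its total pixel clock (active pixels + blanking) exceeds that.
SINGLE_LINK_MAX_HZ = 165_000_000

def pixel_clock_hz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock = total horizontal pixels * total lines * refresh rate."""
    return h_total * v_total * refresh_hz

# 1920x1080 @ 60 Hz uses 2200x1125 totals in the standard CEA-861 timing.
clk = pixel_clock_hz(2200, 1125, 60)          # 148,500,000 Hz = 148.5 MHz
needs_dual_link = clk > SINGLE_LINK_MAX_HZ    # False: single link is enough
```

So if a good single-link DVI-D cable still gives no picture at 1080p60, the cable's bandwidth is not the culprit.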
 
