Question: Displays only duplicate each other and Windows doesn't recognize that there are 2 displays.

foreksgotmail

I have 2 displays that I want to set to extended, but no matter what configuration I try, Windows doesn't seem to recognize that there are 2 displays plugged into the system. It is a Windows 10 machine. The GPU is an ATI Radeon HD 2400 XT and it has a standard SATA AHCI controller. Let me know if there is any more information about the machine that would be useful. Thanks.
 

foreksgotmail

At first the problem was that one monitor would not display anything when plugged into the GPU, no matter the cable. We swapped out the card and that seemed to fix it. Currently, one monitor is plugged into the card via a DVI cable and the second display is plugged into the motherboard via VGA. I have also tried plugging both monitors into a DVI adapter that goes into the one DVI slot on the GPU (which is how the monitors were set up and working originally). Other than that, I have tried several combinations of cables and placements of said cables. I don't know what else there is to do.
 
I asked those two questions for specific reasons. The first was to check that the monitors have independent inputs, which is what should be required. Using a splitter only allows for mirroring the display output, so if you managed to get an extended desktop working on a splitter then I am surprised. There might be a way to enable the onboard graphics to work alongside the graphics card, though I've only seen mention of it rather than knowing how to do it myself. I think this is more likely to give you success.

The second is about the maximum pixel count of your monitors, which may exceed what the graphics card can output. The only info I came across was for one of Gigabyte's cards: https://www.gigabyte.com/Graphics-Card/GV-RX24T256H#ov
Judging from those specs on output resolution, it wouldn't surprise me if the card can't handle two modern monitors. This is just a guess, as you don't mention the resolutions.
 

foreksgotmail

Yes, sorry, I completely forgot about the question on resolution. I had started writing that reply while at work yesterday and finished it this morning; that's my bad. This is no high-end PC by any means, as I'm sure you can tell from the specs I gave. It's just a lab machine for the library I work in.

The resolutions of the two displays I am currently using to test are set to 1024x768 and 1920x1080.

Does that mean it could be exceeding the pixel limit? I am still learning about hardware, so I am not well versed. I see in the link you provided that the card can do resolutions up to 2048x1536. Does that mean the amount of pixels collectively, or just that it can go up to that resolution per display?
 
The combined pixel count of your monitors doesn't exceed what the Gigabyte site mentions, so, working on the assumption that this information is true of your GPU, with the correct cables it should allow for both your displays.
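
Just to put rough numbers on that (and assuming the 2048x1536 figure on the Gigabyte page really is a combined ceiling for the card's output, which is my reading of it rather than anything official), here's a quick Python sanity check of the pixel counts:

# Pixel-count sanity check for the two displays mentioned above.
# Assumption: the 2048x1536 figure from the Gigabyte spec page is treated
# as a combined ceiling for the card's output.
monitors = [(1024, 768), (1920, 1080)]
combined = sum(width * height for width, height in monitors)
card_max = 2048 * 1536

print(combined)              # 2860032 pixels across both monitors
print(card_max)              # 3145728 pixels for 2048x1536
print(combined <= card_max)  # True, so raw pixel count shouldn't be the blocker

So on paper the two displays together come to about 2.86 million pixels against a ceiling of about 3.15 million, which is why I don't think resolution alone is what's stopping the second display from being detected.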

What outputs does the graphics card have?
What cables do you have to use?
What inputs do the two monitors have?

The idea is to work through the possible combinations, see what works and what doesn't, and find a solution from there.
 

VIVO-US

From what I've read about the HD 2400 series cards, the ones with multiple ports can support two monitors, but there are also versions that only have a single port and will only display on one monitor. The HD 2400 is a very old card (2007), so you may want to consider upgrading to a newer model for better multi-monitor support. A Radeon HD 5450 or GeForce GT 710 can be found for under $40, and I can confirm that the GT 710 will even support 3 monitors if you ever needed that many.
 

foreksgotmail

There are 2 cards I'm trying to get this to work with:

ATI Radeon HD 2400 XT which has only 1 DVI port
AMD Radeon HD 5450 which has 1 VGA and 1 HDMI
The port on the motherboard is VGA. Both monitors can use DVI and VGA, but only one can use HDMI.

I have many DVI, VGA, and HDMI cables available.
One adapter that goes from DVI to HDMI
One that goes from VGA to DVI
And another adapter that goes from 2 DVI to 1 DVI

Other machines we have here do use the HD 5450 and the dual displays work with no issue. Both monitors are able to display fine, but Windows is not recognizing that there are 2 displays, so it doesn't give me the option to change from duplicating the display to extending it. Could it be something to do with the motherboard or with Windows?
 
So with the HD 5450 machines, the monitors are connected by using the VGA and HDMI outputs, correct? And this works fine and extending the desktop isn't a problem, correct?

But with the HD 2400 XT, the monitors are connected to a splitter in the DVI output, correct? And this configuration doesn't allow for desktop extension, correct? And the main focus is on getting this PC configuration to support two monitors? If I'm not mistaken, this goes back to my earlier post about the splitter only allowing duplication. Think about it this way: the graphics card is only outputting one source through the DVI port at any one time. (Still intrigued, as what you wrote earlier implies extension previously worked.)

From a bit of research, whether the PC can use onboard/integrated graphics as a second display output alongside the card will depend on the motherboard itself. It will require going into the BIOS and checking what options are available for video display and the motherboard's display output.

The only thing I can think of as a possible workaround for the HD 2400 XT with the splitter is creating a custom resolution and fiddling with the monitors' positioning of the desktop to replicate the illusion of an extended desktop. (Not sure if that even makes any sense...)