Hello all,
I've scoured previous posts but haven't found anything that quite matches my dilemma. I hope someone here will recognize the issue. Thanks in advance.
Computer:
i5 chip running Windows 7 64-bit
SSD and HDD
6 GB RAM
Sapphire Radeon HD 5450 graphics card
3 monitors -- one HDMI, one DVI, one VGA
I am running three monitors on this for basic office use (no gaming or heavy graphics apps). The DVI and HDMI monitors are connected to the card. The VGA monitor is still connected to the mobo.
On Friday I replaced the VGA cable with a longer one for more desktop flexibility. All three monitors blinked but went back to the way they were after a couple of seconds. I shut the machine down (or so I thought) at the end of the day. When I got in today, there was a black screen saying I needed to hit Enter to complete the changes. Having no real choice, I did, and Windows started as usual, but only the HDMI and DVI monitors came on.
The graphics card is a light-duty item, and I knew it couldn't drive all three monitors on its legacy outputs alone. When I plug the VGA monitor into the card, that monitor works but the DVI one gets disabled; the Screen Resolution tool will only let me use one or the other.
I could be way off base, but I think whatever that black-screen change applied disabled the Intel graphics adapter built into the chip, leaving only the AMD card and its Catalyst driver. The AMD is the only display adapter showing in Device Manager now, though I can't swear the Intel one was listed before, so I can't be sure that's what happened. I had what could be a related problem removing the Intel graphics driver during an AMD install on my home computer, but that experience has only gotten me so far with this one.
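In case it helps with diagnosis, this is roughly what I plan to run in PowerShell to see which display adapters Windows actually enumerates -- I haven't run it yet, so treat it as a sketch rather than results:

    # List the video controllers WMI reports, with their status and error codes
    Get-WmiObject Win32_VideoController |
        Select-Object Name, Status, ConfigManagerErrorCode |
        Format-Table -AutoSize

If I understand it right, an adapter disabled in Windows should still show up here (with ConfigManagerErrorCode 22, which I've read means "disabled"), while one switched off in the BIOS wouldn't appear at all. That might tell me whether the Intel graphics got turned off in Windows or at the BIOS level.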
Long story short, if anyone recognizes the issue or has a better idea for running three monitors on this box, I'd love to hear them. Thanks, sh