Computer Won't Detect Monitor (but works fine with TV)


NaddaTroll
So I have been having this issue for the last six months, and I'm finally at my wit's end.

Six months ago, my computer and monitor worked fine together. One night, though, I turned on the computer and the monitor stayed dark. It had power--it had even been working fine earlier that day--and all the cords were hooked up correctly and securely (DVI-DVI). Worried that my graphics card had burned out, I plugged the computer into my TV (HDMI-HDMI) and everything worked fine.

Then I tried connecting my monitor to the computer with another DVI-DVI cable. It still didn't work. Finally, I tried the monitor with another computer (DVI-DVI), and that worked. I decided that my video card's DVI port was fried and went on with my life via HDMI on the TV.

Yesterday I bought an HDMI-DVI cable and plugged my monitor into the HDMI port on my computer (the one that works fine with my TV). Still nothing. The cable DOES work for connecting the monitor to my game consoles, though, so the cable itself is fine.

I've tried the following (plus the registry check sketched right after this list):

Updating the video drivers
Reseating the video card in the mobo
Holding down the power button on the monitor to "reset" it
Trying multiple cables
Plugging into every port on the video card
Resetting all the settings in Catalyst Control Center
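
One more thing that might narrow it down: Windows caches an EDID for every monitor it has ever enumerated, under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY in the registry. If the Samsung shows up there even though the screen stays dark, Windows is seeing the monitor and it smells like a driver or mode problem; if it never appears, the failure is upstream of Windows (cable, port, or card). Here's a minimal read-only sketch, assuming Python is installed (the key names under DISPLAY vary per machine):

Code:
# Minimal sketch: list the monitors Windows has ever enumerated, via the
# EDIDs it caches in the registry. Read-only; assumes Python on Windows.
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    # Yield the names of all subkeys of an open registry key.
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:  # no more subkeys
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for model in subkeys(display):  # Samsung panels show up as SAM + a code
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    with winreg.OpenKey(model_key,
                                        instance + r"\Device Parameters") as p:
                        edid, _ = winreg.QueryValueEx(p, "EDID")
                    print("%s: %d-byte EDID cached" % (model, len(edid)))
                except OSError:
                    print("%s: no EDID cached" % model)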

My Google-fu is usually sufficient for any computer issues I've had, but I can't find any instances like mine. And the ones that are similar to mine don't have solutions that work. Is it possible that my computer just randomly decided that it doesn't like the monitor, even though that monitor works with everything else?

Other possible relevant information:

Radeon 6890
Samsung 2253BW 21" (I think) monitor with 1680 x 1050 native resolution
Sony Bravia 46" TV with 1920 x 1080 native resolution
Windows 7 Ultimate

Any ideas? Any help would be ridiculously appreciated.
 
I would see if you have a local Micro Center, or see if a friend has a laptop. I'm thinking your monitor's backlight could have burnt out, or the control board in the monitor went bad. To test your system, you want to put the monitor on a known-working system; if it works fine there, then you want to put a known-good monitor onto your video card and see if you get the same problem. If you do, and a friend has a spare cheap video card, drop that card in and see whether the problem is still there or gone. If it's a backlight issue, shining a flashlight on a dead screen may light up the LCD/LED panel enough to read it.
 

NaddaTroll
I know that the monitor works because it works with my Xbox. And, as I said, I know that the computer is in working order because it works perfectly with my TV.

Update: I just tried booting into safe mode and then switching my TV out for the monitor. I was able to get an image on the monitor this way, but when I tried raising the resolution, the monitor started acting like it was receiving no signal again. That led me to believe it couldn't display higher resolutions, but it displays the Xbox at 1080p. Since it stopped working even in safe mode, though, I have a hard time believing it's software related...
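
If anyone wants to double-check what the panel itself claims to support, the native mode lives in the first detailed timing descriptor of the EDID (bytes 54-71 of the 128-byte block). A minimal sketch for decoding it, assuming Python and a raw EDID dumped to a file ("edid.bin" is just an example name; use whatever your EDID dump tool produced):

Code:
# Minimal sketch: decode the preferred (native) mode from a monitor's EDID.
def preferred_mode(edid):
    assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"
    d = edid[54:72]  # first detailed timing descriptor = preferred mode
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    h_total = h_active + (d[3] | ((d[4] & 0x0F) << 8))  # active + blanking
    v_total = v_active + (d[6] | ((d[7] & 0x0F) << 8))
    pixel_clock_hz = (d[0] | (d[1] << 8)) * 10000  # stored in 10 kHz units
    return h_active, v_active, pixel_clock_hz / (h_total * v_total)

with open("edid.bin", "rb") as f:
    w, h, hz = preferred_mode(f.read(128))
print("Preferred mode: %d x %d @ %.1f Hz" % (w, h, hz))

If that prints 1680 x 1050 @ 60 Hz, the panel is advertising itself correctly and the mismatch is on the card/driver side.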
 

NaddaTroll

Honorable
Aug 8, 2012
3
0
10,510
So I still haven't been able to get an image to appear on the monitor when it's plugged into my computer. I have, however, been able to get an image on every single other machine I've plugged it into. Likewise, the computer has been able to display an image on every other TV/monitor...

Any other ideas?
 
The monitor's max video setting per the documentation is 1680 x 1050 @ 60 Hz input. If you're using a VGA adapter or another adapter, it might be blocking the signal to the monitor. The other possibility is the video card driver trying to run the monitor in HD mode. I would go into the video control panel and make sure it's using both widescreen format and standard computer output.
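
If you want to check which modes the driver will offer and push it to the monitor's documented 1680 x 1050 @ 60 Hz yourself, the Win32 display API can do it directly. A minimal ctypes sketch, assuming Python on Windows; run it while the working TV is attached, since it changes the live mode:

Code:
# Minimal sketch: list the modes the driver exposes, then request the
# monitor's documented native mode via ChangeDisplaySettings.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODEW layout; dmSize tells the API how much we provide.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),  # display union member (POINTL)
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000
DISP_CHANGE_SUCCESSFUL = 0

user32 = ctypes.windll.user32
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)

i = 0  # walk every mode the driver exposes for the primary adapter
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    print("%d x %d @ %d Hz" % (dm.dmPelsWidth, dm.dmPelsHeight,
                               dm.dmDisplayFrequency))
    i += 1

# Request the Samsung's documented native mode.
dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency = 1680, 1050, 60
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY
result = user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0)
print("mode change:", "OK" if result == DISP_CHANGE_SUCCESSFUL
      else "failed (%d)" % result)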
 
Guest

Hi there, mate.

I'm glad I've been reading this. Within the past two days I've had exactly the same problem. My computer was working fine on my Samsung HD monitor; I'd been using it for about a month with no problems. Two days ago I unplugged it from the back because I was reorganising my desk, and when I plugged the computer back in, bam! The computer no longer displays on the monitor. But I put it on my TV via HDMI and it works fine. My Xbox works on the monitor via both VGA and HDMI.

Now, I've tried using an old PCI Express graphics card, which made no difference. I've tried using a DVI-to-HDMI cable, which still doesn't make a difference. I've reset the BIOS, which did nothing. I've uninstalled the drivers, still no success.

The strange thing is, when I have it working on the TV, I don't see the BIOS loading up, nor do I see the Windows 7 loading screen. It just stays blank until it reaches the Windows login screen.

My PC spec:
AMD Phenom X4 3.4 GHz
8 GB Corsair Vengeance DDR3
Gigabyte UD2 mobo, 990FX chipset
XFX 7950 Double D Black Edition 3 GB graphics card
500 GB SATA HDD
700 W OCZ modular PSU

If anyone has any suggestions on how to cure this, it would be most appreciated! And NaddaTroll, if you've found a solution, it would be great to hear from you.

Aaron
 

dfvx990mq321pl
I know this is a nearly year-old topic, but I'd like to post that I was also experiencing the same issue and got it fixed. The computer used to work with the monitor; then the computer stopped working with the monitor. The computer still worked with other displays and the monitor still worked with other devices, but the computer and monitor just wouldn't work together.

What fixed my problem was doing a clean uninstall and reinstall of the graphics drivers in safe mode. Hope this will help someone.
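
If anyone wants to verify the cleanup actually removed the old packages before reinstalling, leftover driver packages are visible through pnputil. A minimal sketch, assuming Python and an elevated prompt; pnputil's output format differs between Windows versions, so treat the parsing as approximate. It only prints suggested removal commands and deletes nothing itself.

Code:
# Minimal sketch: flag display-class driver packages still registered with
# Windows, so a "clean" reinstall really starts clean. Prints suggestions
# only; deletes nothing.
import subprocess

out = subprocess.run(["pnputil", "-e"], capture_output=True, text=True).stdout

# Legacy "pnputil -e" prints one blank-line-separated block per package.
for block in out.split("\n\n"):
    if "display" not in block.lower():
        continue
    for line in block.splitlines():
        if line.lower().startswith("published name"):
            name = line.split(":", 1)[1].strip()  # e.g. oem12.inf
            print("leftover display driver package:", name)
            print("  remove with: pnputil -d", name)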
 