DVI stopped working, VGA still works

azriul

Honorable
Feb 7, 2013
60
0
10,630
I made a thread here

http://www.tomshardware.co.uk/answers/id-2289292/black-screen-sound-dvi-stopped-working-vga-works.html

But no one answered. I don't know if I gave too much detail so people just went "TL;DR", or if it was in the wrong section, but I am at a complete loss and need to get my system up ASAP.

Short story: I was playing a game (one with many hours on it) when the screen went blank with no signal. After much testing with the monitor, leads, and numerous graphics cards, the conclusion is that DVI has simply stopped working on my main computer (the screen works over DVI on my old, crappy one, so it's not that). The problem is I'm stuck using a poor GPU, since it has the VGA slot, while my new GPU has only DVI and HDMI (HDMI works on the TV, but the monitor doesn't have HDMI).

Much more detail in the other thread, but perhaps too much.

Win 7
i5
8 GB RAM
HD 7750 GPU (also tested with a 6670)
 
Solution
Have a look at the box for your HD 6670, or look up the model on the manufacturer's web site. I bet it has a DVI-D output, not DVI-I. The VGA output works, doesn't it? Why would you ever have tried to use a DVI to VGA adapter if there is a VGA output?

I don't know how a VGA to HDMI adapter is possible. VGA is analog and HDMI is digital. This would have to be an active device with power.

Use the DVI input on your monitor.
If you can access a DVI to DVI cable, test this first.
If not, buy an HDMI to DVI cable, since we know the HDMI output works.
You have probably confused people with your use of the word slot. PCI-E, PCI, AGP and ISA are slots; HDMI, DVI and VGA are not.
From everything you have described, it sounds like the DVI output on your graphics card is faulty.
If the card is under warranty, get it replaced.
If not, you are up for a new card.
 
I've tried to read your post again. The trick is picking out which tests you have actually run from among the conclusions you have drawn.
Your new card does not have a D-SUB connector (what you are calling VGA) because it has a DVI-I connector. With the right cable or adapter, this DVI-I connection can output analog video for the D-SUB input of a monitor, or digital video for the DVI-D input of a monitor.

If you think the DVI output of the card is OK and the DVI input of the monitor is OK, that only leaves the cable.
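Separately, before buying anything, it is worth seeing what Windows itself reports while you are up over VGA. Here is a rough sketch, assuming Python 3 on the Windows box; it uses the standard Win32 EnumDisplayDevicesW call through ctypes, and the helper name enum_devices is just illustrative:

```python
import ctypes
from ctypes import wintypes

# DISPLAY_DEVICE structure from wingdi.h (wide-character version).
class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ACTIVE = 0x1  # output is part of the current desktop

user32 = ctypes.windll.user32

def enum_devices(parent=None):
    """Yield display adapters (parent=None), or the monitors attached to one adapter."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dev), 0):
            break
        yield dev
        i += 1

for adapter in enum_devices():
    state = "active" if adapter.StateFlags & DISPLAY_DEVICE_ACTIVE else "inactive"
    print(f"{adapter.DeviceName}  {adapter.DeviceString}  [{state}]")
    for monitor in enum_devices(adapter.DeviceName):
        print(f"    attached: {monitor.DeviceString}")
```

If the card enumerates fine but nothing ever shows as attached while the DVI lead is plugged in, that points the same way as your physical tests.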
 

azriul

Honorable
Feb 7, 2013
60
0
10,630
No, it's not the cable. This is why I put so much info in the last thread, yet too much info seems to put people off.

Computer has had the GPU in for a year this month. Screen goes black while playing a game and says no signal.

I try a different card using the VGA slot: it works. I try the SAME card with the DVI adapter and it doesn't work. It's the SAME lead, just with an adapter on it. I have two extra adapters; neither worked. Since it doesn't work on either card, yet the new card still works with HDMI on the TV and the older card works with VGA on the computer, it's not the card.

I used the DVI adapter with the monitor on my OLD irrelevant computer. It works. So it's not the cable, adapter or monitor.

I have updated drivers, just to make sure. No issue.

I have run integrated graphics over VGA; it works fine.

I have cleaned the dust from everything except the PSU.

Nothing has changed on my computer for quite some time. Nothing new has been installed, I haven't downloaded anything, or done any windows updates.

Heat is fine: 37°C on the GPU, 49°C on the CPU.

Done a virus scan, malware scan, used CCleaner, junkware remover.

Tested with 2 computers, 3 cards in total (2 in the newer computer, 1 in the old), 2 leads and 3 DVI adapters. The screen went blank a month ago, but after a restart it was fine, so I forgot about it. It went blank again a couple of days ago and slowly progressed: from going blank 15 minutes into Windows, to at boot-up, to at the Windows splash screen, to even in the BIOS, and now instantly when turning on. I've also had times when the computer would turn on and then shut itself right off again, only to turn itself back on a few seconds later (with the DVI plugged in).

It happened in a game, as I said, one which had many hours running on that card.

It's not the card, cables, monitor or adapters. I don't have a Windows disc to reinstall from. I don't mind buying something as a replacement, but I need to know what that something is. I want to think it's the PSU, since it's only 450 W and I'll need to replace it at some point anyway, but I can't afford to buy one thing and hope for the best, only to need to buy another thing and another thing after. I really don't want it to be the mobo, and I don't know why it would be, but the tech guy in the shop said he'd never heard of this and it could be the mobo.
 

azriul

Honorable
Feb 7, 2013
60
0
10,630


I really don't know what you're getting at. It is not one of those.

I use the VGA monitor cable with a DVI-I adapter. It is NOT the problem. Why? As I said, the very same cable with the very same monitor at the same resolution runs on my old computer. It no longer runs on my newer computer, where it had been working forever.

Regardless of that, I had tried a different cable and two different adapters before even making the post.

So, not the cable, adapter or monitor.

The GPU works on HDMI, so the card itself isn't faulty, but perhaps it could be the DVI? Nope, since putting in a different GPU (which also used to work with the DVI) doesn't work either, except it does with the VGA and also HDMI.
 


I don't know what else to say. There are three components involved, the graphics card, cable and monitor. There is nothing else. One of these must be your issue.

Check your testing.
Note that a DVI-I connector on the motherboard can be connected with adapters to a D-SUB input (VGA) or a DVI-D input, but these are completely different video outputs.
All connectors I mention are on the graphics card. You can't use the motherboard outputs if using a discrete graphics card.

Test DVI out from the old computer to the monitor's DVI-D input; if this works, the monitor and cable are good.
Use the same monitor input and cable to connect to your good computer via the DVI output of the graphics card.
If this doesn't work, it points to the DVI output of this card.
To prove it, plug the same output and cable into the DVI input of another monitor. Using an adapter and plugging into a D-SUB input is not a valid test. It is also worth confirming the exact mode Windows is driving while the screen is up over VGA; a sketch for that follows below.
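On that last point: because your blackouts have crept from in-game all the way back to the BIOS, a plain display-mode problem is unlikely, but it costs nothing to rule out while the screen is up over VGA. A rough sketch, assuming Python 3 on the affected machine; it calls the standard Win32 EnumDisplaySettingsW through ctypes with a truncated DEVMODEW covering only the display fields (the API honours dmSize):

```python
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently being driven

# Truncated DEVMODEW: only the fields up to dmDisplayFrequency are declared,
# which is a valid historical size of the structure (dmSize tells the API).
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),          # display half of the union
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),  # refresh rate in Hz
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(dm)):
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz, "
          f"{dm.dmBitsPerPel}-bit")
```

If the reported mode is outside what the monitor's spec sheet lists, a blackout on one input but not another becomes much less mysterious.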
 

azriul

Honorable
Feb 7, 2013
60
0
10,630


As I have said, numerous times now.

I've tested the monitor, its cable and the adapter on a completely different computer. It worked. So it's obviously not the monitor, its cable or the adapter.

So that would leave the GPU, and as I've said, numerous times:

The 7750, which had been in there almost a year, just decided to stop working through DVI, yet it still works through HDMI. So I CHANGED the card to a completely different one, which also doesn't work through DVI, yet works through VGA. Are you trying to imply that both cards, which had been working perfectly, just decided to break on the DVI-I port on the same day, at the same time, while both still work through other means?

 


I'm just trying to understand your problem. If you had a solution, you wouldn't be posting this. I can see you are getting frustrated, but you need to help me understand your issue if you want an answer. You seem to be saying you have ruled everything out, yet you still have an issue.

You mention an adapter. If you are connecting the DVI output of these two cards to the DVI input of a monitor, what is the adapter for?
 

azriul

Honorable
Feb 7, 2013
60
0
10,630


Okay, I'm going to start with something; please do not delve into it or we will be here forever. I was forced (as in, I had no other option at the time) to buy a pre-built computer. It's not something I have ever done before; I have always built my own for the last 25 or however many years I've had one. From the start it gave blue screens now and then (maybe once or twice a month), nothing too frequent, but it led me to believe the GPU wasn't getting enough voltage (STOP 0x101). I don't want to get into the ins and outs of why I bought a pre-built; I regretted it before I even did it, and even more so once I had. Anyway, I later bought the 7750 (better than the 6670 that was in there, but also needing a lot less power). The blue screens haven't happened since, though sometimes it will freeze (usually after playing a game; again, this is not common, but I mention it because it JUST happened).

Something I've needed to get for some time is a new PSU so I can properly upgrade it all; however, since this happened, everything is on hold, because I'm going to need to fix this first.

These aren't the exact things I have (mine are better quality ones), but as examples:

The VGA cable going from the monitor to the computer

http://www.amazon.co.uk/Monitor-Replacement-SVGA-Cable-Black/dp/B001MQGOLU/ref=sr_1_1?ie=UTF8&qid=1410175661&sr=8-1&keywords=vga+cable

The adapter (actually, I have 3 different ones):

http://www.amazon.co.uk/Monitor-Display-Adapter-Genuine-CPO/dp/B0021YGUFM/ref=sr_1_1?ie=UTF8&qid=1410175716&sr=8-1&keywords=dvi-i+adapter

The monitor is an AOC 2236 (I can't find the exact version; I think it's an AOC 2236Swa, but the site has no drivers for it).

The Radeon HD 7750 has only a DVI-I output, a DisplayPort output and an HDMI output.

The Radeon HD 6670 has a DVI output, a VGA output and an HDMI output.

In my old computer I have a Radeon HD 5750, which has a DVI-I output and an HDMI output.

I was playing Borderlands 2, which has around 60 hours of game time, when the monitor went off with the 7750 (DVI connected) inside. I restarted the machine, it loaded up, and 15 minutes in the black screen appeared. I believe it just said no signal (as though the frequency from the GPU were incorrect).

Obviously, because of the black screen I had to hold down the power button while it was still on, which is never really a good thing, but once or twice doesn't usually cause too many issues. On restarting it got to the Windows splash screen and went black, and it continued to go black every time I turned it on.

I removed the card, cleaned it, checked everything was seated correctly, etc., and tried again. Completely black.

I swapped the card over to the 6670, used the VGA, and it worked.

I swapped back to the 7750 and tried the HDMI, and it worked. However, this is not practical for me to use, because I needed a 3 m HDMI cable to do it.

I swapped back over yet again to the 6670, saw it had a DVI output, tried this, and it didn't work.

So now two GPUs, which had both worked for a great length of time, have suddenly stopped working over DVI, yet both work by other means.

I unplugged my monitor, switched it over to my old computer and tested the exact same setup that's not working on my newer computer: the monitor with the VGA cable and DVI adapter that have worked for years, plugged into the 5750's DVI output. It worked, ruling out the cable, adapter and monitor.

Also, somewhere in between that lot, I got a different on-screen message saying "Input Not Supported", which spammed up the monitor (it doesn't happen now, so I can't replicate it).

Somewhere in the mix of all that, I tested another cable anyway, along with the other adapters, with the same results.

I've also booted into safe mode, uninstalled all the drivers (both GPU and monitor) and installed the latest ones (but for the older 6670, since I have no easy way to use the 7750 because of the TV's location).

All temps are completely fine. (I do fear for the PSU, but I have no way of testing it; part of me thinks it can't supply enough power for the DVI, but that doesn't really make much sense.)

I don't have a copy of Windows 7 (again, more joys of pre-built computers, but again, I didn't have any choice at the time).

So while I was planning on buying a new PSU, that's on hold, because I'm likely going to need to replace something here first. If it's the GPU, then I'll need a new PSU anyway unless I get another 7750; but since both cards stopped working over DVI while both work by every other means, it's unlikely to be the cards. The one thing I'm dreading most is it being the mobo, but again, it doesn't make much sense why it would be.
 
It won't be the motherboard, CPU or power supply. You have proven this by testing with another video card.
It seems you must be using the D-SUB input (VGA) on your monitor, since the only cable you have listed is a VGA cable.
This means the graphics card is sending an analog video signal from the DVI-I output, through your adapter.
The monitor is a digital device and has to convert this analog signal to digital.

Even if you weren't having problems, you would be much better off using a digital video signal, as this is natively supported by your monitor.
To do this, simply use a DVI to DVI cable and no adapters.

You mention testing the HDMI output from this graphics card, but your monitor does not have an HDMI input. Were you using an HDMI to DVI adapter? That would still be a reasonable test, as the HDMI output can send a digital video signal compatible with DVI. Why was this not a practical solution?

This is what I believe your issue is:
The circuitry in your HD 7750 that produces an analog video signal for the DVI-I port has died. A DVI to DVI cable on the same DVI-I output may work fine.
The HD 6670 has outputs for DVI-D, D-SUB and HDMI. While DVI-I supports analog or digital video, DVI-D supports only digital. A DVI to VGA adapter will not work with a DVI-D output or input; it can never have worked. If you want analog video from this card, you need to connect to the D-SUB (VGA) output.

If you can get a DVI to DVI cable to test digital video from the DVI-I output of the HD 7750, that is probably best.
If not, then since you have already tested the HDMI output of this card, an HDMI to DVI adapter plus a DVI to DVI cable, or just an HDMI to DVI cable, may be your best bet.
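There is also a software-side way to check what the monitor actually claims to support. Windows caches the EDID block of every monitor it has ever seen under SYSTEM\CurrentControlSet\Enum\DISPLAY in the registry, and byte 20 of an EDID says whether the panel declares a digital or an analog input. A rough sketch, assuming Python 3 and nothing beyond the standard winreg module:

```python
import winreg

def input_type(edid: bytes) -> str:
    # EDID byte 20 is the video input definition byte:
    # bit 7 set = the monitor declares a digital input, clear = analog.
    return "digital" if edid[20] & 0x80 else "analog"

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    m = 0
    while True:
        try:
            model = winreg.EnumKey(display, m)
        except OSError:
            break  # no more monitor models
        m += 1
        with winreg.OpenKey(display, model) as model_key:
            i = 0
            while True:
                try:
                    instance = winreg.EnumKey(model_key, i)
                except OSError:
                    break  # no more instances of this model
                i += 1
                try:
                    with winreg.OpenKey(model_key,
                                        instance + r"\Device Parameters") as p:
                        edid, _ = winreg.QueryValueEx(p, "EDID")
                    if len(edid) > 20:
                        print(model, instance, "->", input_type(bytes(edid)))
                except OSError:
                    pass  # this instance has no cached EDID
```

If every cached entry for the AOC reports analog only, that fits a monitor with nothing but the D-SUB input.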
 

azriul

Honorable
Feb 7, 2013
60
0
10,630


No, I tested the HDMI on the 7750 with my TV, not the monitor, with a standard HDMI cable.

I have ALWAYS used that adapter, and in the exact same port on the 6670; I would most likely have used it when I first had that card.

There is only one thing left I can try, which is to put the 7750 in my old computer, but that's not connected to the internet, so changing the drivers will be a hassle. I've also bought a VGA to HDMI adapter, so I can try that; I was just hoping it would have arrived today.
 
Have a look at the box for your HD 6670, or look up the model on the manufacturer's web site. I bet it has a DVI-D output, not DVI-I. The VGA output works, doesn't it? Why would you ever have tried to use a DVI to VGA adapter if there is a VGA output?

I don't know how a VGA to HDMI adapter is possible. VGA is analog and HDMI is digital. This would have to be an active device with power.

Use the DVI input on your monitor.
If you can access a DVI to DVI cable, test this first.
If not, buy an HDMI to DVI cable, since we know the HDMI output works.
 
Solution

azriul

Honorable
Feb 7, 2013
60
0
10,630



Whoops, I didn't mean to hit "pick as solution". How do I cancel it, since it's not solved? :p

The VGA to HDMI is possible because I just bought an adapter for it.

The reason I would have used the DVI adapter is that when I got the computer I used the same monitor from my old computer, and I'm lazy, so I don't usually check things like that; I would have just pulled it out of the old machine and plugged it into the new, and I doubt I would have noticed there was a VGA output. I don't have the box, as I said; it was a pre-built machine, so you don't get any of that.

There is no DVI input on my monitor; there is ONLY a VGA input and a USB port (and something else I'm not sure about: it looks like an Ethernet port, but I don't know why it would have one).

I've done it all, many times, before even opening the thread: testing different cards, leads and computers, where everything works except the DVI on the GPUs (plural).
 
Here is a link for a Gigabyte HD 6670:
http://www.gigabyte.com.au/products/product-page.aspx?pid=3832#sp

Supported outputs:
HDMI * 1
DVI-D * 1
D-sub * 1

I'm sure you used the same cable, but it must have been plugged into the VGA output, not going through a DVI to VGA adapter.
In any case, this is just about explaining why the DVI output on the HD 6670 isn't working. You couldn't believe that the DVI outputs on two cards could have failed at the same time, and indeed they didn't. I believe the DVI output on the HD 6670 will work fine if you connect it to a DVI monitor input.

Is this not your monitor? http://us.aoc.com/monitor_displays/2236vw

I can see HDMI to VGA adapters online. They appear to be a small device housing the VGA connector and the conversion circuitry, with power drawn from the HDMI source. If your monitor really doesn't support DVI, this is probably your best option.

 

azriul

Honorable
Feb 7, 2013
60
0
10,630


Nope, it is DEFINITELY a DVI-I port, since that's my adapter and it wouldn't even fit into a DVI-D socket: it has the 4 analog pins, which wouldn't be able to go into a DVI-D (which is exactly the problem I'm having trying to test it on the integrated graphics).

You also linked the wrong card; mine only goes up to 800 MHz.

http://www.expertreviews.co.uk/graphics-cards/1284373/amd-radeon-hd-6670/specifications

http://www.amd.com/en-gb/products/graphics/desktop/6000/6670

"Dual-link DVI with HDCP
Max resolution: 2560x1600
VGA
Max resolution: 2048x1536"

So you can see what the DVI-D connector is:

http://www.playtool.com/pages/dvicompat/dvi.html

Now you can see how a DVI-I plug wouldn't be able to fit into a DVI-D socket, yet mine does fit.

And no, that is not my monitor; I already said something along those lines earlier. I say this because when I looked up the manual (way before any of this) I noticed that model listed way too many inputs.

http://www.manualslib.com/manual/204740/Aoc-2236vwa.html?page=13#manual

I'm pretty sure I told you that it ONLY has a VGA slot, a USB and what looks like an Ethernet port. I also think I said it was a 2236swa or something.
 


That is still a DVI-D output electrically. Some cards use a DVI-I style connector so that cables and adapters with the analog pins still plug in, but the port does not actually carry analog video (see the sketch below).
What other possible explanation do you have for why it doesn't work?

Here is a link for the Asus card with 800 MHz clock speed:
http://www.asus.com/Graphics_Cards/HD66702GD3/specifications/

Listed interfaces:
D-Sub Output : Yes x 1
DVI Output : Yes x 1 (DVI-D)
HDMI Output : Yes x 1
HDCP Support : Yes
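Since we keep going around on connector types, the compatibility rules are small enough to write down explicitly. A toy sketch; the DVI_CARRIES table and the helper function are just my own shorthand, not from any spec:

```python
# Which signals each DVI connector variant actually carries.
DVI_CARRIES = {
    "DVI-A": {"analog": True,  "digital": False},
    "DVI-D": {"analog": False, "digital": True},
    "DVI-I": {"analog": True,  "digital": True},
}

def passive_vga_adapter_works(port: str) -> bool:
    # A passive DVI-to-VGA adapter only breaks out the analog pins,
    # so it needs a port that really drives analog video.
    return DVI_CARRIES[port]["analog"]

for port in DVI_CARRIES:
    verdict = "OK" if passive_vga_adapter_works(port) else "will never work"
    print(f"{port}: passive VGA adapter {verdict}")
```

The catch in your case is mechanical versus electrical: a DVI-I shaped socket can accept a plug with the four analog pins while the card never actually drives them, which is exactly the DVI-D-behind-a-DVI-I-shell situation described above.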

 

azriul

Honorable
Feb 7, 2013
60
0
10,630


Sigh, and once again, no. I've now plugged them both into the old computer and both work. So, yet again, the GPU, monitor, cable and adapter are all fine.

What other explanation do I have? I don't know, obviously; that's why I posted this thread in the first place.
 

azriul

Honorable
Feb 7, 2013
60
0
10,630


Done repeating myself.
 


Your sentence "Ive now plugged them both into the old computer and both work." is not clear because it is not clear what "them" is or how you define "work".
Much of the confusion in this thread is because you didn't originally specify that you were using a DVI to VGA adapter rather than taking digital video from the DVI connector.

Maybe you have some obscure HD 6670 with a DVI-I output, unlike the plethora of cards that are DVI-D only, and unlike pretty much every other card ever manufactured with both a DVI and a D-SUB connector.
Maybe there is some issue that affects "DVI" from this computer, even though you have apparently ruled out all of the hardware as the cause.

I've tried to help you, but some people just don't want to be helped.