Can't get signal from DVI to LCD TV! TRIED FOR 2 MONTHS!

Status
Not open for further replies.

BoogeyMan1

Distinguished
Feb 11, 2008
13
0
18,510
0
I have a 32 in LCD TV (that is my primary monitor)

No matter what I've tried, I can't get it to accept a signal from ANYTHING other than VGA out. Can't use a DVI adapter. Guys, trust me, I've tried everything: DVI/VGA adapters, different cards, different drivers, low resolutions. Nothing. I see boot-up and the Windows logo, then "No input" from the screen as soon as Windows kicks in. It works before I install drivers; once the GeForce drivers (latest ones) are installed, "No input". I've tried two older drivers, same problem.
I'm currently running 1366x768 custom using the VGA out from the card. Works fine, but I'd love to fix this so I can move to a better card (8800 GT?). If I plug it into the DVI out, the screen goes black at Windows start. I can see the boot process fine. Once again, it works before I install drivers and boots into Windows at 1024x768.
I install drivers...blam...same problem. I even purchased an ATI card, wiped the NVIDIA drivers off the system, installed the latest ATI drivers... Boot up, and...and...NO SIGNAL. Same thing.

If you want to see everything I've tried, check this thread: http://forums.guru3d.com/showthread.php?p=2602304#post2602304

I'm currently using an eVGA 7600 GT. The TV is a Scott (Akai) LCT32SHA
(Windows sees it as "Plug N Play" Monitor)
NVIDIA control panel recognizes it as LCT32SHA correctly. Can anyone tell me what the deal is?
I've been trying to get this working for three weeks.
I really appreciate ANY help to get DVI working with this TV/monitor. THANKS!
 

TeraMedia

Distinguished
Jan 26, 2006
904
0
18,990
3
There are one or two "implied" resolutions for DVI. In other words, the DVI spec stipulates that if you are connected via DVI, the display device must support the implied resolutions. Check Wikipedia, and then the official DVI spec site, for details.

If your LCD doesn't support one of those, and it happens to be what the drivers default to when they start up in windows for the first time, then that could be the problem.

If you have another display device to connect to, I'd recommend connecting to that, setting a resolution you know your LCD will accept, and then switching over. Or, you could boot up in VGA mode and force 640 x 480 display if your LCD can display that (which it might, since it shows the boot sequence). Once in, you could use regedit to set / modify the default screen res in the registry (no, I don't know where it is... but I have a hunch it's in there somewhere), and then try doing a normal boot.
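The registry idea above can be sketched roughly as follows. On XP-era Windows, the per-adapter default mode is commonly stored under HKLM\SYSTEM\CurrentControlSet\Control\Video\{adapter-GUID}\0000 as the DWORD values DefaultSettings.XResolution and DefaultSettings.YResolution. The GUID varies per machine, and this Python sketch is untested against this particular setup, so treat it as a starting point rather than a fix:

```python
def default_mode(values: dict) -> tuple:
    """Pull the default resolution out of a dict of registry values."""
    return (values.get("DefaultSettings.XResolution"),
            values.get("DefaultSettings.YResolution"))

def read_adapter_modes():
    """Enumerate video adapters and yield (GUID, (width, height)).

    Requires Windows; winreg is in the standard library there.
    The key path is the one commonly seen on XP-era systems.
    """
    import winreg
    base = r"SYSTEM\CurrentControlSet\Control\Video"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as video:
        i = 0
        while True:
            try:
                guid = winreg.EnumKey(video, i)
            except OSError:
                break  # no more adapter subkeys
            i += 1
            try:
                with winreg.OpenKey(video, guid + r"\0000") as k:
                    vals = {}
                    j = 0
                    while True:
                        try:
                            name, data, _ = winreg.EnumValue(k, j)
                        except OSError:
                            break  # no more values under this key
                        vals[name] = data
                        j += 1
                    yield guid, default_mode(vals)
            except OSError:
                continue  # adapter subkey without a 0000 device key
```

Writing new values back with `winreg.SetValueEx` at the same path would be the "force the default res" step, but back up the key first; a bad value here can leave you booting blind again.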

Best of luck.
 

badgtx1969

Distinguished
Jul 11, 2007
263
0
18,780
0
Just some clarifications:

Post your system specs (motherboard, ram, OS, etc).

It works fine VGA-VGA, but not VGA-DVI or DVI-DVI?

Does every combination of output-input work in safe mode?

How did you do your driver removal? Hopefully you used a good driver removal tool.

Have you done an OS reinstall (XP 32-bit?)? It may be easier just to start from scratch and work up from there. Also make sure your chipset drivers are up to date and the mobo BIOS is correctly configured.

 

badgtx1969

Distinguished
Jul 11, 2007
263
0
18,780
0


Very true; this is why an LCD TV is not technically a monitor. Have you tried an LCD monitor that supports the native DVI resolutions?
 

BoogeyMan1

Distinguished
Feb 11, 2008
13
0
18,510
0
This may be the tell, Bad GTX...
...if I boot to a standard CRT monitor from the DVI out, it boots fine and looks great. I then swap the cable to my 32" LCD and it works fine at any resolution.

Does that tell you guys how I might fix this without swapping cables every time I boot up?
 

johnnyq1233

Distinguished
Aug 15, 2007
1,234
0
19,460
98
I'm not 100% sure on this, but my 7950GT/OC only came with a DVI-to-component adapter. As far as I know, these cards did not have HDMI capability.
If your TV has component input, you can run it that way and still get the 720p the TV is capable of.
Hope this helps.
 

rtfm

Distinguished
Feb 21, 2007
526
0
18,980
0
In the NVIDIA Control Panel, under Display -> Set up multiple displays, there is a question mark with "My display is not shown in the list". Clicking on that brings up a window with a "Force television detection at startup" tick box. Have you tried that?
 

BoogeyMan1

Distinguished
Feb 11, 2008
13
0
18,510
0
OOOhh no, haven't seen that! Oh BOY!
Wish me luck...here goes....


...nope, didn't work. I even had to boot into safe mode and remove the drivers to boot up again. :)
 

roadrunner197069

Splendid
Sep 3, 2007
4,416
0
22,780
0
You can go DVI to HDMI if your HDMI input is video-only. Most HDMI inputs are set up for video and audio; if that's the case, your speakers will squeal because there won't be any sound.

DVI is a digital signal and your VGA is analog, so it won't work.

2 options for you:
1. Use DVI to HDMI if your HDMI in is video-only.
2. Get a card with built-in S/PDIF so you can do DVI to HDMI with sound. This card or one like it would work: http://www.newegg.com/Product/Product.aspx?Item=N82E16814500006
 

niz

Distinguished
Feb 5, 2003
900
0
18,980
0
> This may be the tell, Bad GTX...
> ...if I boot to a standard CRT monitor from the DVI out, it boots fine and looks great. I then swap the cable to my 32" LCD and it works fine at any resolution.

Normally when a computer powers up, the video card sends a message on the DDC channel of each of its video connectors for any connected monitor to send EDID data back to the video card, which contains data that tells the video card what the specifications of the display are. The EDID includes manufacturer name, product type, phosphor or filter type, timings supported by the display, display size, luminance data and (for digital displays only) pixel mapping data.

When you power up, the video card needs to know which connector(s) to output video on, so if it doesn't detect any EDID data coming in on the DVI connector, it assumes nothing is plugged into it and doesn't enable output on it. Most video cards in this case fall back to sending output only to the VGA connector, so that monitors older than about 1994 that don't do EDID/DDC at all still get a signal; those monitors all predate the DVI standard anyway, so they must be connected via VGA.

Your post above strongly indicates that your TV is not doing EDID, which is perhaps not surprising as it's a computer-monitor standard, not a TV standard. Some older TVs, even with a DVI connector, don't do EDID. It may also just be that the DVI cable to your TV doesn't connect pins 6 and 7, which are the DDC clock and data lines that the EDID exchange happens on. So I'm guessing either you have a bad or cheap cable, or your TV is old enough that it doesn't do EDID.

The reason your workaround succeeds when you boot up with another computer monitor and then switch over is that the video card gets the initial EDID data from the monitor, which tells it something is plugged into the DVI connector, so it enables the signal output on it. Later you can plug in your TV and still get a signal because the video card happens not to notice/care that you've disconnected the monitor and plugged in something else. It still thinks it's sending the image to the monitor when in fact it's now going to the TV.
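To make the EDID exchange described above concrete: the base EDID block is 128 bytes, starting with a fixed 8-byte header, with a three-letter manufacturer ID packed into bytes 8-9 and a final checksum byte that makes the whole block sum to zero mod 256. A minimal Python sketch for sanity-checking a dumped block follows; the layout is the standard VESA EDID 1.x one, and the sample values in the test are illustrative, not from this thread:

```python
# The 8-byte magic header every EDID 1.x base block begins with.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def decode_manufacturer(edid: bytes) -> str:
    """Decode the 3-letter PnP manufacturer ID from bytes 8-9."""
    # Big-endian 16-bit word packing three 5-bit letters (1 = 'A').
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))

def edid_is_valid(edid: bytes) -> bool:
    """Check length, magic header, and checksum of an EDID base block."""
    return (len(edid) >= 128
            and edid[:8] == EDID_HEADER
            and sum(edid[:128]) % 256 == 0)  # checksum byte zeroes the sum
```

If a dump of the TV's DDC response fails these checks, or no bytes come back at all, that points at exactly the cable (pins 6/7) or missing-EDID scenarios described above.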
 

BoogeyMan1

Distinguished
Feb 11, 2008
13
0
18,510
0
Interesting.
Well, the LCD TV I'm using is current (still being sold). The EDID/DDC problem might be the cause, though.
So there's no workaround, I'm guessing? I'm really not willing to keep a CRT next to my desk to swap cables every time I boot. With me limited to VGA-out cards only, it looks like the fastest card I can use will be an 8600 GT. I've tried SLI with my current 7600 GT; it was a no-go too. Sigh.

Well, thanks for all the help, guys.
If anyone knows a faster card than an 8600 GT (w/ VGA out), let me know.
 

niz

Distinguished
Feb 5, 2003
900
0
18,980
0
If you're not using the same cable to connect your PC to your monitor as to connect your PC to your TV, try connecting your TV with the monitor cable as we know that cable works already. It may be the cable you're using to connect your TV is fubar for EDID.
 

fletch420

Distinguished
Mar 23, 2007
141
0
18,680
0
I had this issue in Q1 last year. I'm responding after only reading the first few posts, so sorry if I repeat others' comments.
1. Get your TV manual out and look at the accepted signals for each input.
2. Set your PC up on a regular monitor.
3. Set your PC to the resolution of the desired input on the TV.
4. Turn off the PC and swap the cable over to the TV.

This should solve it.
Also understand that lower-end TVs sometimes have very limited input resolutions (not implying yours does; I'm not sure) and can therefore be tricky.

good luck
 

TeraMedia

Distinguished
Jan 26, 2006
904
0
18,990
3
niz may be on to something. There is a feature in CCC to not read the EDID settings. Under Monitor Properties -> Attributes, there is a Monitor Attributes section with a check box "Use Extended Display Identification Data (EDID) or driver defaults". Below that is a pull-down for maximum resolution and maximum refresh rate. I suspect that if you uncheck this box (or the equivalent of this box in the NVidia world), you may be able to use your LCD without hassle. You'd have to set it using your other monitor, but I imagine that once it's set, you may be ok.
 

bliq

Distinguished
Jun 29, 2006
1,591
0
20,160
152
There are also some TVs that were EDTV, not HDTV. Those are 480p, which is the same resolution as DVD (720x480). Make sure your TV isn't EDTV.

Also, some "720p" HDTVs are actually 1024x768 native even though they are widescreen; there are a lot of plasmas like this. Sending 1366x768 would be out of spec for such TVs.
 