Very specific problem with Nvidia GeForce GTX 550 Ti

nmderush

Aug 22, 2013
This might be a little over-thorough, but I feel like I've covered every base and could use help with a confirmation.


I have an EVGA GeForce GTX 550 Ti installed in my system (specs below), which I built about a year ago.

After buying a 51" Samsung plasma screen TV (link), I hooked my video card up to it via a mini-hdmi converter to connect to it. Everything was fantastic for probably 6 months or so, then one day, I just lost the signal via HDMI.

I have no clue why this happened in the first place - I suspected the fraying mini-HDMI adapter or the 20' cable I ran across my living room for it.


A few months passed; I moved my TV next to my PC, used a brand-new adapter and HDMI cable, and connected - to no avail. Strangely enough, the HDMI output from my onboard Intel HD Graphics works just fine, but now I can't play any of the video games I was once able to.


Thus far I have:

Power cycled both the PC and TV
Physically removed the video card, removed all drivers and software, booted with onboard graphics enabled, shut down, re-installed hardware and software
Entered the television's service mode to attempt to reset or re-initiate the EDID handshake (not sure if that worked - see the EDID sketch after this list)
Connected the HDMI out to another Samsung TV (still no signal)
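
If anyone wants to check whether the TV's EDID ever actually made it across the cable, Windows keeps a cached copy of the last EDID each monitor reported in the registry. Here's a rough Python 3 sketch of how you could dump it - the path below is the standard one, but I can't promise every driver caches an EDID there:

import winreg

DISPLAY_KEY = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def decode_manufacturer(edid):
    # Bytes 8-9 of an EDID pack three letters, 5 bits each ("SAM" for Samsung).
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))

def subkeys(key):
    # Iterate the subkey names of an open registry key.
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_KEY) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # no cached EDID for this connection
                header_ok = edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00"
                print(model, instance, decode_manufacturer(edid),
                      "header OK" if header_ok else "header looks corrupt")

If the Samsung never shows up there with a sane header, the handshake presumably never completed on that output.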


The weirdest part of all of this is that when I connect the cable to the television, the sound confirmation on my computer acknowledges the connection, the previously greyed-out and unselectable input on the television activates, and both the Nvidia Control Panel and the Windows 'Screen Resolution' dialog see the display. Also, the control panel says that the Samsung TV is HDCP-ready.
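
Just to be concrete about what "sees the display" means: a quick ctypes script against plain user32 calls will list every adapter output Windows knows about and the monitor attached to each one. This is only a sketch - nothing in it is specific to my setup:

import ctypes
from ctypes import wintypes

user32 = ctypes.WinDLL("user32", use_last_error=True)

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", ctypes.c_wchar * 32),
        ("DeviceString", ctypes.c_wchar * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", ctypes.c_wchar * 128),
        ("DeviceKey", ctypes.c_wchar * 128),
    ]

DISPLAY_DEVICE_ACTIVE = 0x1  # output is attached to the desktop

def devices(parent=None):
    # parent=None lists adapters; an adapter name lists the monitors on it.
    i = 0
    while True:
        dd = DISPLAY_DEVICEW()
        dd.cb = ctypes.sizeof(dd)
        if not user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dd), 0):
            return
        yield dd
        i += 1

for adapter in devices():
    active = bool(adapter.StateFlags & DISPLAY_DEVICE_ACTIVE)
    print(adapter.DeviceName, "-", adapter.DeviceString,
          "(active)" if active else "(inactive)")
    for monitor in devices(adapter.DeviceName):
        print("    attached:", monitor.DeviceString, hex(monitor.StateFlags))

Since Screen Resolution detects the Samsung, it should show up in that list too - which is exactly why the "detected but no picture" behaviour is so strange.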

I am transmitting within the frequencies and parameters of the television (native 1024x768, 60 Hz) and have attempted to lower those parameters to force an image.
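
For anyone who wants to reproduce that "force a lower mode" step from a script rather than the control panel, this is roughly the Win32 call behind it. Sketch only: the DEVMODE definition is the standard display-only view of the structure, and \\.\DISPLAY2 is just a placeholder for whatever name the card's HDMI output gets in the listing above:

import ctypes
from ctypes import wintypes

user32 = ctypes.WinDLL("user32", use_last_error=True)

class DEVMODEW(ctypes.Structure):
    # Display-only flattening of the DEVMODE union; same size and offsets as the C struct.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

DM_PELSWIDTH        = 0x00080000
DM_PELSHEIGHT       = 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000
CDS_TEST            = 0x00000002
DISP_CHANGE_SUCCESSFUL = 0

DEVICE = r"\\.\DISPLAY2"  # placeholder: use the HDMI output's actual device name

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
dm.dmPelsWidth = 1024
dm.dmPelsHeight = 768
dm.dmDisplayFrequency = 60
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY

# CDS_TEST only asks the driver whether it would accept the mode;
# pass 0 instead to actually apply it.
result = user32.ChangeDisplaySettingsExW(DEVICE, ctypes.byref(dm), None, CDS_TEST, None)
print("driver accepts 1024x768 @ 60 Hz" if result == DISP_CHANGE_SUCCESSFUL
      else "mode rejected, code %d" % result)

The point of CDS_TEST is that the driver validates the mode without touching the screen, so you can tell whether the card itself refuses the timing or whether it accepts it and the signal just dies somewhere after that.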


Basically, it looks to me like my video card just crapped out, specifically the HDMI output. Both DVI outputs work fine with my dual-monitor setup. Does this all sound correct? I'm trying to make sure I don't throw my money away on a new GPU when I'd rather wait.


Thank you for reading and any insight!



SPECS:

Operating System: Windows 7 Ultimate 64-bit (6.1, Build 7600)
BIOS: Date 07/13/12 22:41:39, Ver: 04.06.05
Processor: Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz (4 CPUs), ~3.4GHz
Memory: 8192MB RAM
Available OS Memory: 8088MB RAM
Page File: 3902MB used, 12270MB available
DirectX Version: DirectX 11
 
Solution
If you've tried the HDMI output on your graphics card hooked up to a smaller screen with a different cable and it still doesn't work, yet the DVI outputs do, it all points to one thing: the HDMI output is knackered. If it's still under warranty, RMA it.




Yep, it was a smaller screen actually - a 32" Samsung LCD, different cable and all. The DVI outputs are sending a signal to the monitors I use, but the television doesn't support that connection (nor VGA).

I also have a home theater receiver attached to another HDMI input on the 51" - it and all of the HDMI devices that run through it (cable box, Xbox 360) work fine.

I figured some connection inside the card just failed (how else can you explain a sudden, permanent loss of signal?), so I'll have to start looking at inexpensive cards.

Again, thank you for the feedback :)
 
Does anyone know whether it would be a good idea (if it is in fact a card fault) to get another 550 Ti and run them in SLI? My motherboard supports it, but I've never done it. Can I use the outputs of one card and still get the power of the other? Or are middle-of-the-road cards far enough along now that I might as well just buy a newer model?