This might be a little over-thorough, but I feel like I've covered every base and could use help confirming my conclusion.
I have an EVGA GeForce GTX 550 Ti installed in my system (specs below), which I built about a year ago.
After buying a 51" Samsung plasma screen TV (link), I hooked my video card up to it via a mini-hdmi converter to connect to it. Everything was fantastic for probably 6 months or so, then one day, I just lost the signal via HDMI.
I have no clue why it happened in the first place; I suspected the fraying mini-HDMI adapter or the 20' cable I had run across my living room.
A few months passed; I moved the TV next to my PC, used a brand-new adapter and HDMI cable, and connected it, to no avail. Strangely enough, the HDMI out from my onboard Intel HD Graphics works just fine, but then I can't play any of the video games I once could.
Thus far I have:
Power cycled both the PC and TV
Physically removed the video card, removed all drivers and software, booted with onboard graphics enabled, shut down, re-installed hardware and software
Entered the television's service mode to attempt to reset or re-initiate the EDID handshake (not sure if that worked; the sketch after this list is roughly how one could check whether Windows still has a cached EDID for the TV)
Connected the HDMI out to another Samsung TV (still no signal)
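On the EDID point above, here's a rough Python sketch of how one could check whether Windows still holds a cached EDID for the TV. It just walks the monitor entries under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY; my assumption is that the Samsung would show up under a "SAM..." key, which may not match how this TV actually enumerates.

```python
# Rough sketch: list cached EDIDs under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY.
# Assumption: the Samsung appears under a "SAM...." monitor key; run as admin if needed.
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    """Yield the names of all subkeys of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for monitor in subkeys(display):           # e.g. "SAM0B2D", "DEL4026", ...
        with winreg.OpenKey(display, monitor) as mon_key:
            for instance in subkeys(mon_key):  # one entry per connection Windows has seen
                params_path = rf"{ROOT}\{monitor}\{instance}\Device Parameters"
                try:
                    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, params_path) as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        print(f"{monitor}\\{instance}: {len(edid)}-byte EDID cached")
                except OSError:
                    print(f"{monitor}\\{instance}: no EDID value")
```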
The weirdest part of all of this is that when I connect the cable to the television, the connection sound plays on my computer, the previously greyed-out and unselectable HDMI input on the television becomes selectable, and both the NVIDIA Control Panel and Windows' 'Screen Resolution' dialog see the display. The NVIDIA Control Panel also reports that the Samsung TV is HDCP-ready.
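If it helps, this is roughly how one could double-check that "Windows sees the display" part from the Win32 side: a rough ctypes sketch of EnumDisplayDevices (nothing NVIDIA-specific) that lists each adapter output and whatever monitor Windows thinks is attached to it.

```python
# Rough sketch: list each adapter output and the monitor (if any) attached to it,
# using the Win32 EnumDisplayDevices API via ctypes. Windows-only, Python 3.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ACTIVE = 0x01  # output currently has a desktop on it

user32 = ctypes.windll.user32

def devices(parent=None):
    """Yield DISPLAY_DEVICEW entries: adapters if parent is None, else that adapter's monitors."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dev), 0):
            return
        yield dev
        i += 1

for adapter in devices():
    state = "ACTIVE" if adapter.StateFlags & DISPLAY_DEVICE_ACTIVE else "inactive"
    print(f"{adapter.DeviceName}  {adapter.DeviceString}  [{state}]")
    for mon in devices(adapter.DeviceName):
        print(f"    -> {mon.DeviceString}")
```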
I am transmitting within the television's supported parameters (native 1024x768 at 60 Hz) and have also attempted to lower the resolution and refresh rate to force an image.
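As a sanity check on those parameters, here's a rough ctypes sketch of asking the driver (via EnumDisplaySettings) which modes it actually reports for the TV's output. The "\\.\DISPLAY3" device name is just a placeholder for whatever name the HDMI output gets in the enumeration above.

```python
# Rough sketch: list the display modes the driver reports for one output,
# via EnumDisplaySettingsW. "\\.\DISPLAY3" below is a placeholder device name.
import ctypes
from ctypes import wintypes

class POINTL(ctypes.Structure):
    _fields_ = [("x", wintypes.LONG), ("y", wintypes.LONG)]

class DEVMODEW(ctypes.Structure):
    # Display-variant layout of the DEVMODEW union (printer-only fields flattened out).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPosition", POINTL),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
DEVICE = "\\\\.\\DISPLAY3"  # placeholder: substitute the HDMI output's device name

i = 0
while True:
    mode = DEVMODEW()
    mode.dmSize = ctypes.sizeof(mode)
    if not user32.EnumDisplaySettingsW(DEVICE, i, ctypes.byref(mode)):
        break
    print(f"{mode.dmPelsWidth}x{mode.dmPelsHeight} @ {mode.dmDisplayFrequency} Hz, "
          f"{mode.dmBitsPerPel}-bit")
    i += 1
```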
Basically, it looks to me like my video card just crapped out, specifically the HDMI output; both DVI outputs still work fine with my dual-monitor setup. Does that conclusion sound right? I want to make sure I don't throw money away on a new GPU when I'd rather wait.
Thank you for reading and any insight!
SPECS:
Operating System: Windows 7 Ultimate 64-bit (6.1, Build 7600)
BIOS: BIOS Date: 07/13/12 22:41:39 Ver: 04.06.05
Processor: Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz (4 CPUs), ~3.4GHz
Memory: 8192MB RAM
Available OS Memory: 8088MB RAM
Page File: 3902MB used, 12270MB available
DirectX Version: DirectX 11