
[SOLVED] Monitor Loses Signal When Gaming (New GPU / HDMI to VGA Adapter)

Bach0

Distinguished
May 2, 2013
Hi, I upgraded my GPU from an AMD HD 7870 to a GTX 1650 Super, which uses less power than the AMD card but has only DisplayPort, HDMI, and DVI-D outputs. My 1080p 60 Hz monitor, a ViewSonic VA2349S, has only VGA and DVI inputs (I think this is the exact model: https://www.viewsonic.com/ph/products/lcd/VA2349S.php?prv=1). I used to connect the old AMD card with a passive DVI-I to VGA adapter, which worked fine. For the new GPU I bought a cheap (around $6) Chinese HDMI-to-VGA active adapter.

The problem is that when I start games, the monitor loses signal (it goes black, and I hear the Windows USB connect/disconnect sounds until I exit the game).

I know the problem has something to do with the monitor, because I tested the PC with an hour of heavy gaming on an HDTV (connected via HDMI cable) and everything went fine: no screen blackouts, the new GPU worked well, and all temperatures were normal.


I suspect either the adapter isn't high quality or the monitor can't handle the GPU. I really suspect the cause is simple but hard to find. You might think the monitor taps out when the GPU is under heavy load, but that's not the case, and here is why. I tested this with two games, Lineage 2 and GTA V. Lineage 2 is an old MMORPG that runs fine even on an onboard GPU; when I start it, I get screen blackouts on the login screen, but if I get past it I can play without problems. GTA V is the opposite: everything is fine in the main menu, but the blackouts start when I load Story Mode, for example.

I also found something strange: one time during the blackouts I unplugged the HDMI cable from the GPU, and Lineage 2 crashed with an error that looked like this:


Screenshot: https://imgur.com/w1ehHC7


After that I deleted the Option.ini file from the game directory, and since then I can start the game and reach the login screen without problems, but the graphics resolution was lowered. (If I correct the resolution on the login screen, the monitor starts blacking out again; but if I first log in and then change the resolution to 1080p in-game, everything works fine until I exit and restart the game.) This could be a hint for solving the problem.

I should also mention that I get the same screen blackouts during a GPU-Z stress test once the GPU reaches 60-65 °C, though less frequently, and the screen comes back very quickly. I have also tried other fixes, such as switching the monitor's refresh rate between 59 and 60 Hz in the Nvidia Control Panel; nothing helped. My PC is set to High Performance mode, and sleep, hibernate, and all other power-saving features are disabled.
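For reference, the power-saving settings mentioned above can be turned off with Windows' built-in `powercfg` tool. This is just a sketch of what "all power-saving things disabled" might look like from an elevated Command Prompt; it is not a fix for the blackouts themselves:

```shell
rem Windows-only configuration sketch (run from an elevated Command Prompt).
rem These match the power-saving tweaks described above; they will not
rem fix the signal loss on their own.

rem Never sleep or blank the display while on AC power:
powercfg /change standby-timeout-ac 0
powercfg /change monitor-timeout-ac 0

rem Disable hibernation entirely:
powercfg /hibernate off
```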

My PC specs are:

ASrock z75 PRO 3 mobo
Core i5 3570
16GB DDR3 RAM
Zotac GTX 1650 SUPER Twin fans
XFX 550 W 80+ Bronze certified PSU

The GPU requires only one 6-pin connector and a recommended 350 W PSU. The old GPU needed two 6-pin connectors and 450 W.

I know it's just a guessing game, but any help is appreciated.
 

kanewolf

Titan
Moderator
I would recommend you use an HDMI to DVI cable. That is an all-digital interface; an HDMI to VGA adapter requires a digital-to-analog conversion.
 

Bach0

Distinguished
May 2, 2013
kanewolf said: "I would recommend you use an HDMI to DVI cable. That is an all digital interface. An HDMI to VGA requires a digital to analog conversion."
Thank you for your suggestion; I was at a loss and you gave me hope. I'll try that, but first I have a small concern. Even though my monitor claims to have a DVI-D port, I inspected it visually. It indeed doesn't have the four small pins around the flat blade (like DVI-I has), but the port is a white-yellowish color while the port on my GPU is black (that may be nothing to worry about). However, I can confirm that the flat blade on the monitor's port is physically much shorter than the one on the GPU (both claim to be DVI-D ports). I'm 100% sure of that; I took pictures, and I hope you can see it as well. Why aren't they identical if they claim to be the same port?
Photos: https://imgur.com/a/vs51wXs
 

kanewolf

Titan
Moderator
Your monitor has a DVI-D port -- https://en.wikipedia.org/wiki/Digital_Visual_Interface#Connector
If your GPU has a DVI port then you could use a DVI-D to DVI-D cable.
 
