[SOLVED] Monitor shows no signal at 1280x1024 resolution (the default one)

bruh1

Reputable
Dec 7, 2019
23
0
4,510
So yesterday I had an old card; I used a DVI to VGA converter to connect it to my monitor, and it ran fine at 1280x1024.
I bought a new card today. The DVI converter doesn't work with it, so I bought an HDMI to VGA converter and installed new drivers. Everything is working fine, except whenever I set my resolution to 1280x1024 I get a "no signal" error. The recommended resolution is now listed as 1024x768. Any help would be appreciated.
Also, side note: if I set my resolution to 1440x900 it just works, though it looks weird and blurry.
Edit: Just ran a game and it works fine at 1280x1024. What is happening?
Edit 2: Ran Dota 2 at 1280x1024, still the "no signal" error. What's more, I'm not getting the performance I expected: according to official benchmarks I should get 90 fps at ultra settings in Dota 2, but I'm only getting 40 fps at low.
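One way to narrow this down: a "no signal" at a single resolution over a VGA converter is often a refresh-rate issue rather than a resolution issue, since the driver may offer 1280x1024 at 75 Hz while the converter only passes 60 Hz. That is an assumption, not something confirmed in this thread, but it is cheap to check. A minimal Python sketch using the standard Win32 EnumDisplaySettingsW call lists every 1280x1024 mode the driver actually exposes:

Code:
import ctypes

# Minimal DEVMODEW layout, truncated after dmDisplayFrequency. Windows only
# fills the struct up to dmSize, so the trailing fields can be omitted here.
class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
    ]

user32 = ctypes.windll.user32
dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)

# Walk every mode the display driver exposes and print the 1280x1024 ones.
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    if dm.dmPelsWidth == 1280 and dm.dmPelsHeight == 1024:
        print(f"1280x1024 @ {dm.dmDisplayFrequency} Hz, {dm.dmBitsPerPel}-bit")
    i += 1

If the only 1280x1024 entry is above 60 Hz, forcing 60 Hz in the AMD control panel (or in Windows advanced display settings) would be the first thing to try.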
 
Last edited:

4745454b

Titan
Moderator
There are three different DVI standards: DVI-A, DVI-I, and DVI-D. I've never seen DVI-A in the wild, only the plans/pinouts for it online, so I wouldn't worry about DVI-A. Most video cards used to use DVI-I. The I stands for "integrated," meaning it carries both a digital and an analog signal. (The A stands for analog; it carries only an analog signal.) Current-gen cards have moved entirely to digital, which means the ports are all DVI-D. There is no analog signal being passed through the DVI port anymore like there was with DVI-I, which means all those old simple DVI to VGA adapters we had so many of don't work. What you need is an ACTIVE DVI (or HDMI, if you want) to VGA adapter. They cost more, usually around $15-20.

What monitor are you using? If it's 1280x1024, you really need a new one. Adapters can get you up and going again, but with all the cool things digital gets you, you might as well upgrade. 1080p or better is where it's at now; I wouldn't bother with anything else. You can get a simple 1080p monitor for $80-$100. I got a good FreeSync monitor (FreeSync won't work over VGA) on sale for $105. If you are buying new GPUs and whatnot, don't hook them up to a 20-year-old monitor.
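As a side note on the analog-vs-digital point: a monitor's EDID block records which input type it declares. In the 128-byte base EDID, bit 7 of byte 0x14 is set for a digital input and clear for an analog one. A small sketch, assuming you have already dumped the EDID to a file with a tool of your choice (the filename below is made up):

Code:
# Checks whether a monitor's EDID declares a digital or analog input.
def edid_input_type(path: str) -> str:
    with open(path, "rb") as f:
        edid = f.read(128)
    # Every EDID base block starts with this fixed 8-byte header.
    if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID block")
    # Byte 0x14 (Video Input Definition): bit 7 set means digital input.
    return "digital" if edid[0x14] & 0x80 else "analog"

print(edid_input_type("monitor.edid"))  # hypothetical dump file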
 

bruh1

Reputable
Dec 7, 2019
23
0
4,510
4745454b said: "There are three different DVI standards: DVI-A, DVI-I, and DVI-D..."
I do plan on getting a new monitor, but for now I just have to make do with this one. The thing is, I am already using an HDMI to VGA adapter, which has to be active by definition, since HDMI carries only a digital signal. What is really happening is that I cannot switch to my monitor's native resolution, and I'm not getting the expected performance increase even though AMD says the right drivers are installed.
 
Last edited:

4745454b

Titan
Moderator
Adapters and HDMI can add an extra layer of headache. You might need a different/better cable to get the performance you were expecting. I have a 20' HDMI cable that connects my laptop to my TV. It works fine for basic Windows stuff, but if I try to watch a video, the whole picture turns green. That doesn't happen with my newer 6' cable. It could be a cable issue like mine was, or it could still be the adapter, or anything else. Limp along as best you can, but only an upgrade will fix everything.
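The cable/adapter angle is easy to sanity-check with arithmetic: the pixel clock a mode demands is roughly horizontal total x vertical total x refresh rate, and marginal cables or converters tend to drop out as the clock climbs. A rough sketch using the standard VESA DMT blanking totals (where exactly a given adapter gives up is a guess, not something measured here):

Code:
# Back-of-the-envelope pixel clock per mode. Totals include blanking,
# taken from the published VESA DMT timings for each mode.
modes = {
    # name: (h_total, v_total, refresh_hz)
    "1024x768@60":  (1344,  806, 60),
    "1280x1024@60": (1688, 1066, 60),
    "1280x1024@75": (1688, 1066, 75),
    "1440x900@60":  (1904,  934, 60),
}
for name, (ht, vt, hz) in modes.items():
    print(f"{name}: {ht * vt * hz / 1e6:.1f} MHz pixel clock")

Notably, 1440x900@60 (which worked, if blurrily) sits around 107 MHz, while 1280x1024 needs 108 MHz at 60 Hz and 135 MHz at 75 Hz, which would be consistent with a converter running out of steam somewhere in that range.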
 
Solution

bruh1

Reputable
Dec 7, 2019
23
0
4,510
4745454b said: "Adapters and HDMI can add an extra layer of headache..."
Welp, time to get a new active DVI-D adapter, I guess. Thanks for the help.