My graphics card detects 2 monitors when there is only one, and when I add a second monitor to use extend mode, it won't work

tmatt746

Reputable
Jun 27, 2014
I recently bought a graphics card for a few purposes. I have 2 different monitors, neither of which is HDMI, and I was hoping to run dual monitors in extended mode. At first I plugged in only one monitor. When I checked its resolution, the graphics card seemed to have detected 2 monitors, and the monitor I actually use was identified as monitor 2. The other, nonexistent monitor would change when I switched the projection mode to extend: its resolution would drop to 640x480 with the message "Your resolution is lower than 1024x768. Some items might not fit and apps might not open." That was the first issue. I thought plugging in my other monitor would fix it, but it stayed the same. I have no idea how to fix this, or whether it's even possible, but all help would be appreciated.
What graphics card are you using, and what native resolution does your GPU support? The reason you are seeing 2 monitors probably isn't a graphics card problem. It is an issue with Windows itself detecting both the dedicated GPU and the onboard graphics on the motherboard. I use an R9 270, and when I go to "Screen resolution" and click "Detect", Windows also detects a second monitor that is not attached. That "second monitor" is the VGA output on your motherboard, which is disabled by default when you use a PCIe card.

Ignore that second display and don't extend to it. If you are using DVI or HDMI, extending to it accomplishes nothing, since that VGA output doesn't support higher resolutions. That is why you are getting the error message.