I just bought a Gigabyte GTX 260, overclocked version, and it runs great. Mount & Blade runs at 100 fps instead of the usual 20 on my old 7950 GT, which died.
The problem is that since installing it I have four monitors listed in Device Manager: three "Default Monitor" entries and one "Plug and Play Monitor". The card itself has one DVI, one analog (VGA), and one HDMI output. Every time I power on, the computer decides to use a different output. I want it to stay connected to my digital display permanently and never fall back to the analog one. As it is, if there's no picture on the DVI, I have to move the plug to the VGA output, and so on, for whichever output it decides to use each time I turn the computer on. Uninstalling the extra displays in Device Manager doesn't help, as they just come back. Setting the primary monitor in Windows also doesn't stick for the next boot.
I read on the Gigabyte website that flashing the BIOS may be a fix, due to an incompatibility with certain monitor types:
"When using DVI output of the graphic card, sometimes there is no display on screen." - http://www.gigabyte.com.tw/Support/VGA/FAQ_List.aspx?FAQID=3330
But I don't know if flashing the VGA BIOS is a good solution; I try to avoid it where I can. I thought perhaps simply disabling the extra monitors in Device Manager might fix it instead.