Hello,
So I recently upgraded my graphics card to an HD 7850. I have an HP LCD monitor that's a few years old. Before the GPU upgrade I always used the VGA connector and never had any problems. But now that I try to use DVI, the monitor periodically flashes to black and briefly shows a message that the DVI signal input is out of range and that I should change it to 1280x1024@60Hz. I don't understand what the problem is, because the resolution is already set to exactly that. If I connect over VGA, everything works fine.
Yesterday at one point the flashing wasn't very frequent, maybe once every few minutes, and I even managed to test performance in some games. DVI also fixed a very hard to notice wave-like flickering I get in some games over VGA; that flicker becomes less noticeable if vsync is enabled.
Anyway, I would love some quick help!
Matiss