Hi everyone, this is my first post here, so I hope I explain everything as clearly as possible and don't make any mistakes.
I have a rather strange issue, in my opinion. I just hooked up a second display and the image on it seems to flicker. The best way I can describe it is that it looks like the monitor is being filmed by an old camera, not quite that intense, but something like that, and it's especially visible on darker colors.
The strange thing is that as soon as I unplug the main monitor, the image goes back to normal. Also, if I lower the resolution to 1280x1024 the flickering stops (the native resolution of this monitor is 1680x1050). I also noticed that at that point the refresh rate automatically changes from 60 Hz to 75 Hz, and if I manually set it back to 60 Hz it flickers again.
It works fine when the main monitor is not connected, even at 60 Hz, but as soon as I plug the main monitor back in, the flickering starts again.
I tried forcing it to 1680x1050 at 75 Hz in the NVIDIA Control Panel. During the test it works fine, the picture is sharp and there is no flicker, but that only lasts for the test. When I hit "OK" and it actually applies, things mess up again: it still reports the resolution as 1680x1050, but the image is not sharp anymore, it's almost distorted.
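In case it helps with diagnosing, here is a small Python sketch I can run to print what mode Windows actually reports for each active monitor, so I can check whether the 75 Hz mode is really being applied after I hit OK. This is just my own check, not from any guide, and it assumes the pywin32 package is installed; it only uses the standard Win32 display wrappers, nothing NVIDIA-specific.

# Sketch: print the mode Windows currently reports for each attached display.
# Assumes pywin32 (pip install pywin32).
import win32api
import win32con

i = 0
while True:
    try:
        device = win32api.EnumDisplayDevices(None, i)
    except win32api.error:
        break  # no more display outputs
    if device.StateFlags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        mode = win32api.EnumDisplaySettings(device.DeviceName,
                                            win32con.ENUM_CURRENT_SETTINGS)
        print("%s: %dx%d @ %d Hz" % (device.DeviceName, mode.PelsWidth,
                                     mode.PelsHeight, mode.DisplayFrequency))
    i += 1

That should at least tell me whether the driver is silently falling back to a different mode than what the control panel shows.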
I can't seem to find anything about this by googling; it seems to be a pretty unusual issue.
My main monitor is a Samsung SyncMaster T260 HD - this one causes no problems.
My second monitor is an Acer P223w.
My graphics card is a GeForce GTX 670.
Hope you guys can help me, thank you in advance!