Question: Everything on the monitor screen is shaky and vibrating

afratafri202

Honorable
Oct 19, 2017
Hello everyone!
I have the following PC:
HP Elite 8100 CMT
Core i5 650 3.2 GHz
4GB 1333MHz RAM
1x 250GB 5400RPM HDD
1x 160GB 7200RPM HDD
1x 320GB 5400RPM HDD
Sapphire R7 260X OC
I have an Acer AL1923 19-inch 1280 x 1024 60Hz monitor, which is fairly old.
The problem is that when I set the resolution to anything other than 1280 x 720, the screen shakes and vibrates (not physically), and the image looks fuzzy, which is really annoying.
I know the monitor itself isn't faulty, because the same thing happens on my 32-inch LED TV. The monitor's auto-adjust fixed it once, but it doesn't work anymore.
When I uninstalled the AMD drivers everything was fine, but it gets messed up again as soon as the drivers are installed. That suggests a driver issue, but I think it might be the video card.
Any help is appreciated.
 
If you use the DVI-D interface, the video card should be able to find the right settings automatically. If you use the VGA interface, you'll probably need some sort of "driver" disk to set it up. Are you using DVI-D or VGA? Note that even when set up correctly, VGA is a purely analog method and won't match the quality of digital. Going digital skips the analog-to-digital and digital-to-analog "RAMDAC" conversions (see https://en.wikipedia.org/wiki/RAMDAC), and a digital connection lets the monitor describe itself to the card instead of requiring that driver disk. Some people might tell you that VGA later added a DDC pass-through for auto configuration, but don't ever count on that being implemented; it is extremely likely VGA won't self-configure.

There are a lot of things that can go wrong and produce fuzzy video, but analog components working at their limits are always the number one suspect. Best to get rid of the analog components and go purely digital.
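For context on the "describe itself" part: over a digital link the monitor sends a 128-byte EDID block listing its identity and supported modes. Just as an illustration (not something you need to run), here's a rough sketch of how that block is laid out; the data below is made up, only the header, manufacturer-ID packing, and checksum rule are real EDID conventions:

```python
# Sketch of an EDID block's basic structure (what a digital monitor
# sends over DDC). The block built here is a minimal made-up example;
# a real monitor's EDID also carries full timing descriptors.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def pack_manufacturer_id(pnp_id):
    """Pack a 3-letter PnP ID ("ACR" = Acer) into EDID bytes 8-9:
    three 5-bit letters with 'A' = 1, stored big-endian."""
    a, b, c = (ord(ch) - 64 for ch in pnp_id)
    value = (a << 10) | (b << 5) | c
    return bytes([value >> 8, value & 0xFF])

def unpack_manufacturer_id(block):
    """Decode bytes 8-9 back into the 3-letter PnP ID."""
    value = (block[8] << 8) | block[9]
    return "".join(chr(((value >> shift) & 0x1F) + 64) for shift in (10, 5, 0))

def finish_block(body):
    """Pad to 127 bytes; the final byte makes the whole 128-byte
    block sum to 0 mod 256 (the EDID checksum rule)."""
    padded = body + bytes(127 - len(body))
    return padded + bytes([(-sum(padded)) % 256])

block = finish_block(EDID_HEADER + pack_manufacturer_id("ACR"))
print(len(block), sum(block) % 256, unpack_manufacturer_id(block))  # 128 0 ACR
```

The point is that all of this arrives automatically over the digital link, so the card knows exactly what the monitor is; over plain VGA without working DDC, the card has to guess.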
 

afratafri202
I use a DVI-I to VGA adapter, and I also have an HDMI to VGA one, because my monitor only supports VGA or DVI-I.
Thanks for the help. Does that mean the video card is not faulty?
 

It probably means both the monitor and the card are working, but the monitor's settings are not correctly matched in the video card. Unless you have a way to set the video card to the monitor's exact mode, you won't be able to do any better. On analog monitors a mode involves a lot more than the refresh rate, and the current settings are probably just some average. You will probably need to abandon VGA at some point, if you get the chance.
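To give a sense of what "exact mode" means: an analog mode is defined by its full timings, including the blanking intervals around the visible picture, not just the resolution and refresh rate. A quick sketch of the arithmetic; the 1688x1066 totals for 1280x1024@60 are the standard VESA DMT figures:

```python
# Sketch: the pixel clock an analog mode needs is the total pixels per
# frame (visible area plus horizontal/vertical blanking) times the
# refresh rate. Get any of these timings slightly wrong and the
# monitor samples the signal in the wrong places, which looks fuzzy.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a mode with the given total timings."""
    return h_total * v_total * refresh_hz / 1_000_000

# 1280x1024 @ 60 Hz uses 1688x1066 total pixels per VESA DMT,
# which is where its standard 108 MHz pixel clock comes from.
print(round(pixel_clock_mhz(1688, 1066, 60)))  # 108
```

The monitor's auto-adjust is essentially trying to recover those totals from the analog signal, which is exactly the guesswork a digital connection avoids.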