Noob question regarding VGA vs HDMI

psychoanalyst

Distinguished
Sep 8, 2012
I am a total noob in the Monitors department.

I currently have a Dell S2309W 23-inch Widescreen LCD monitor which I bought way back in 2009. It is a Full HD (1080p) monitor. It only takes VGA and does not have an HDMI port.

I have consistently seen that my pictures (I shoot quite a bit and have a high-end DSLR) look dull and almost soft compared to what I see on my camera's display, where they "pop" and look extremely sharp.

I had the same experience with 1080p GoPro videos. They almost look like low-res videos on my monitor, but they look nice and sharp on my laptop screen.

I am open to upgrading my monitor to a more modern one (it is high time anyway, and I am considering the Lenovo L24q-20), but I was hoping for an explanation of what about the monitor is causing this loss of resolution and clarity. Is it the VGA interface? After all, it is advertised as a Full HD monitor.

I am assuming that a VGA-to-HDMI converter will have no benefit at all? I do have a GeForce GTX 750 Ti graphics card.

Would appreciate any input and any thoughts on the replacement monitor I am considering.

Thanks!

Avi
 

Hardware Brad

Notable
Jul 24, 2017
Look into a monitor calibrator. Calibrating your monitor with something like a Spyder 5 will help a lot with your color quality, especially with photography.

To answer your question though, the big difference between VGA and HDMI is that VGA carries an analog signal while HDMI is digital. VGA tops out at roughly 2048x1536, whereas newer HDMI standards go up to 4K. At 1080p both carry the same number of pixels, but the VGA signal is converted from digital to analog in the graphics card and back to digital in the monitor, and it picks up noise and smearing along the way, especially over cheap cables. That conversion loss can produce exactly the dull, soft look you describe; a digital link delivers the pixels unchanged.

Buying a new monitor might make a difference; however, it still may not come perfectly calibrated from the factory, which is why I recommended the calibrator.
 

molletts

Distinguished
Jun 16, 2009
Firstly, double-check that your PC is actually set to 1920x1080 in case it's changed itself for some reason, such as a badly-behaved update. Then get a good high-contrast image up on the screen (an Explorer view of a large folder is good - lots of sharp, high-contrast text across the whole screen - some monitors need more reference data than others) and get the monitor to do auto image adjustment. If that doesn't improve the sharpness, perhaps try a better VGA cable - 1920x1080 is pushing a cheap, thin analogue VGA cable quite hard. (Re-seat the connectors on both ends in case they've got dirty/oxidised or something.)
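
To put a number on how hard 1080p drives an analogue cable: with the standard CEA-861 timing for 1080p60 (2200 total pixels per line and 1125 total lines, blanking included), the pixel clock works out to about 148.5 MHz. A quick sanity check of the arithmetic (just the calculation, nothing monitor-specific):

```c
#include <stdio.h>

int main(void)
{
    /* CEA-861 timing for 1080p60: 2200 total pixels per line and
       1125 total lines, blanking intervals included. */
    const long total_width  = 2200;
    const long total_height = 1125;
    const long refresh_hz   = 60;

    double pixel_clock_mhz =
        (double)total_width * total_height * refresh_hz / 1e6;
    printf("1080p60 pixel clock: %.1f MHz\n", pixel_clock_mhz); /* 148.5 */
    return 0;
}
```

A cheap cable that smears a signal changing at that rate will soften every pixel edge, which is consistent with what you're seeing.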

I've seen a few Dell monitors that lose their auto-image adjustment every time the system starts up (what they actually do is try to auto-adjust on the black screen that comes up before the login screen, where there are no reference points to adjust to, so they get it wrong), so you may need to either auto-adjust every time or run a script after you log in that switches the resolution to something else and back again, forcing the monitor to resync.
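
If you'd rather script that resync, here's a minimal sketch for Windows using the Win32 ChangeDisplaySettings API. It assumes a single primary display with a native mode of 1920x1080; the temporary 1280x720 mode and the two-second pause are arbitrary choices, and you'd run the result from a logon task.

```c
/* Force the monitor to resync by switching to a temporary mode and
   back to native resolution, so it re-runs auto-adjustment on a real
   image rather than the pre-login black screen. */
#include <windows.h>

static LONG set_mode(DWORD width, DWORD height)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    /* Start from the current mode so refresh rate etc. are preserved. */
    EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm);
    dm.dmPelsWidth  = width;
    dm.dmPelsHeight = height;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;
    return ChangeDisplaySettings(&dm, 0); /* 0 = apply now, don't persist */
}

int main(void)
{
    set_mode(1280, 720);    /* any valid non-native mode will do */
    Sleep(2000);            /* give the monitor a moment to resync */
    set_mode(1920, 1080);   /* back to native resolution */
    return 0;
}
```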
 

psychoanalyst

Distinguished
Sep 8, 2012
Yes! The monitor indeed has a DVI connection, and so does my GeForce card, so I just purchased a DVI cable. I'll see if that improves the quality.

Thanks for pointing that out!