Will 1600x1200 be displayed pixel perfect on a 1920x1200 monitor?

Apr 10, 2018
I want to play some older games from around 2005 that don't support widescreen, and I want to play them in the original 4:3 aspect ratio as they were intended, so no editing of config files.
They all seem to support 1600x1200, so my question is: if I set the resolution to 1600x1200, will it basically be displayed 1:1? I know it will have black bars on the sides due to the different aspect ratio.
 
Solution
Yes, provided you can set your monitor to not stretch the image horizontally to fill the screen. That means 4:3 or Normal, instead of any Wide, Full, or Zoom.
Nope. Assuming you have a "normal" LCD monitor, its pixels are "fixed" in place. The monitor can handle the 1200 lines of height directly, but it has to "guess" where to draw the 1600 horizontal pixels across its 1920-pixel width. It's called interpolation.

https://en.wikipedia.org/wiki/Interpolation

Because it's having to guess, the image can appear fuzzy. As with AA or refresh rate, not everyone will notice or be affected the same way. You might not even care, and it won't cost you anything to try it and see. Do you really have a 1920x1200 monitor? Most LCDs are 1920x1080.
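
Here's a rough Python sketch of that guessing, scaling one scanline from 1600 to 1920 pixels with linear interpolation (real scalers use fancier filters, but the idea is the same). Mapping the 1920 output pixels back onto the 1600 source pixels lands on fractional positions, so the scaler has to blend neighboring pixels, and a hard edge comes out grey:

```python
SRC_W, DST_W = 1600, 1920

def resample_line(line, dst_w=DST_W):
    """Linearly interpolate a row of pixel values out to dst_w pixels."""
    src_w = len(line)
    out = []
    for x in range(dst_w):
        pos = x * (src_w - 1) / (dst_w - 1)  # fractional source position
        left = int(pos)
        right = min(left + 1, src_w - 1)
        frac = pos - left
        # Blend the two nearest source pixels: this is the "guess"
        out.append(line[left] * (1 - frac) + line[right] * frac)
    return out

# A hard black-to-white edge in the source...
line = [0] * (SRC_W // 2) + [255] * (SRC_W // 2)
scaled = resample_line(line)
# ...comes out with in-between grey values around the edge. That's the fuzz.
print([round(v) for v in scaled[955:965]])  # e.g. [0, 0, 0, 0, 21, 234, 255, ...]
```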
 
There's no scaling involved. Either the monitor can accept and understand the signal or not. It won't guess and make up data that isn't there; rather, it starts drawing from the top left corner and then centers the image afterwards. I have an LCD that's so old and slow you can actually watch the image shift into place after a second.

I can run 640x480 on all of my fixed pixel displays, even the 4k OLED, and it just shows up as a tiny box in the center of the screen at a perfect 1:1.

Vintage console gamers, however, still hoard their CRTs, because that's truly the only way to play those games as they were intended. Even with all image processing turned off, an LCD still has slower response time than a CRT, so old games that rely on precise timing of button presses will just play differently. Plus, of course, light guns only work on a CRT.
 
I'm talking about which pixels to light up. It will have to use interpolation to "guess" which ones to light, and it does this every time it has to run a non-native res (unless it's a perfect subset of the native res, like 1080 is for 4K). Unlike on a CRT, the pixels are fixed and can't be moved.
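
To see what "perfect subset" means in numbers: if both scale factors work out to whole numbers, every source pixel maps to an exact block of panel pixels and nothing has to be guessed; a fractional factor is what forces interpolation. A quick Python check:

```python
def scale_factors(src, native):
    """Per-axis scale factor from a source resolution to the native grid."""
    return native[0] / src[0], native[1] / src[1]

print(scale_factors((1920, 1080), (3840, 2160)))  # (2.0, 2.0): clean 2x2 pixel blocks
print(scale_factors((1600, 1200), (1920, 1200)))  # (1.2, 1.0): fractional, so interpolation
```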
 
Playing a game at 1600x1200 on a 1920x1200 (16:10 aspect ratio) monitor will result in pixels being displayed 1:1, with black bars on the left and right sides of the screen... assuming the monitor does not automatically stretch the image to fill the entire screen. If it does, there should be a setting in the monitor to disable the auto-stretch.
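
The arithmetic behind those bars is simple enough to sketch:

```python
native_w, native_h = 1920, 1200
img_w, img_h = 1600, 1200

bar = (native_w - img_w) // 2  # unused columns split between the two sides
print(f"{bar} px black bar on each side; all {img_h} lines fill the full height")
# -> 160 px black bar on each side; all 1200 lines fill the full height
```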
 
Yup, isn't that exactly the point? By not rescaling, there is no need to remap the smaller resolution to try to fit a larger grid that may even have a different aspect ratio, and hence no guessing.

Stretching it would mean running a non-native resolution made to fit, whereas 640x480 in a tiny box in the middle of the screen is still running at the native pixel density, just with all of the extra pixels unused. That's why it's in a tiny box.

If the monitor's EDID has the resolution in it, then it should know what to do with it. And it should, given 1600x1200 is one of the EDID Additional standard timings.
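
For the curious, the eight standard timing entries sit at fixed offsets (bytes 38-53) of the 128-byte EDID base block, and decoding them only takes a few lines of Python. This is just a sketch of the format; the file path in the comment is a made-up example, since where the raw blob lives depends on your OS and GPU:

```python
ASPECT = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}  # EDID 1.3+ aspect codes

def standard_timings(edid: bytes):
    """Yield (width, height, refresh) from the EDID standard timing section."""
    for i in range(38, 54, 2):            # 8 two-byte entries at bytes 38-53
        b1, b2 = edid[i], edid[i + 1]
        if (b1, b2) == (0x01, 0x01):      # 0x0101 marks an unused slot
            continue
        h = (b1 + 31) * 8                 # horizontal pixels
        num, den = ASPECT[b2 >> 6]        # top two bits pick the aspect ratio
        v = h * den // num                # vertical follows from the aspect
        hz = (b2 & 0x3F) + 60             # low six bits encode refresh - 60
        yield h, v, hz

# Demo with a stub blob that only fills in the standard timing bytes;
# 1600x1200@60 encodes as the byte pair 0xA9 0x40:
stub = bytes(38) + bytes([0xA9, 0x40]) + bytes([0x01, 0x01] * 7)
print(list(standard_timings(stub)))  # -> [(1600, 1200, 60)]

# On real hardware you'd read the blob from wherever your OS exposes it,
# e.g. somewhere under /sys/class/drm/ on Linux (exact path varies per setup).
```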