VGA adapter for GPU

May 29, 2017
I recently bought a GTX 1050 Ti but have a VGA monitor. I bought a DisplayPort to VGA adapter with it, but it has a max resolution of 1280x1024, while my monitor's native resolution should be 1600x900. Now my monitor's native resolution is detected as 1280x1024, which is causing a bunch of problems. I just need advice on what to buy next time: can I just buy a DisplayPort to VGA adapter (or DVI to VGA, or HDMI to VGA) that supports resolutions up to 1920x1080? Will it then detect my monitor's native resolution as 1600x900, or am I completely off track?
 
VGA is analog, so it doesn't work in terms of pixels. It works in terms of horizontal and vertical scan rates. The vertical scan rate is how many times per second the signal sweeps vertically (basically the refresh rate). The horizontal scan rate is how many times per second it sweeps horizontally. Divide the horizontal scan rate by the vertical scan rate and you get how many horizontal sweeps there are per vertical sweep. That's how many lines it can draw per refresh (screen redraw). So if the max horizontal scan rate is 900x the desired refresh rate, the adapter can put out at most 900 lines of pixels. Horizontal pixels are encoded as signal modulation within a single horizontal scan.
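
If you want to sanity-check that with numbers, the back-of-the-envelope version looks like this (the 60 kHz horizontal rate is just an assumed example, and a real signal also spends some of its sweeps on blanking, so treat it as an upper bound):

    # Lines per refresh = horizontal scan rate / vertical scan rate (blanking ignored).
    h_scan_rate_hz = 60_000   # assumed example: a 60 kHz max horizontal scan rate
    v_refresh_hz = 60         # desired refresh rate
    print(h_scan_rate_hz / v_refresh_hz)  # 1000.0 -> at most ~1000 lines per redraw at 60 Hz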

In theory, the 1280x1024 adapter should be able to do 1600x900 over VGA. Since its horizontal scan rate can handle 1024 lines of resolution, it should be able to handle 900, which is fewer. The horizontal modulation, aka pixel clock (for the horizontal pixels), is higher, but the total bandwidth (in pixels per second) is lower if you drop the refresh rate down to 50 Hz. The manufacturer must simply not have programmed the adapter to support a mode like 1600x900 @ 50 Hz. Or maybe you need to finagle Windows into outputting 1600x900 @ 50 Hz over DisplayPort before the adapter will do it over VGA.
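
Counting only the visible pixels (a real signal also spends bandwidth on blanking, so these are ballpark figures), the arithmetic behind that looks like this:

    # Ballpark visible-pixel throughput (blanking ignored).
    print(1280 * 1024 * 60 / 1e6)  # ~78.6 Mpix/s: what a 1280x1024 @ 60 Hz mode pushes
    print(1600 * 900 * 60 / 1e6)   # ~86.4 Mpix/s: 1600x900 @ 60 Hz needs a bit more
    print(1600 * 900 * 50 / 1e6)   # ~72.0 Mpix/s: at 50 Hz it drops back under the adapter's rating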

Buying an adapter which supports 1920x1080 over VGA will most likely work. But it depends on the horizontal and vertical scan rates the adapter supports on the VGA side. If the adapter you get only supports 1920x1080 @ 30 Hz, then it's not going to have enough bandwidth to push 1600x900 @ 60 Hz. Even if it has enough bandwidth (e.g. supports 1920x1080 @ 60 Hz), you're going to have to gamble that the manufacturer programmed in support for 1600x900. So there's a lot here which can go wrong even if the adapter you get exceeds the specs required for 1600x900 @ 60 Hz. I suggest buying from some place which allows free returns (with no restock fee) if it doesn't work.
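
The same ballpark check (visible pixels only, blanking ignored) shows why the refresh rate on the adapter's spec sheet matters as much as the resolution:

    # Same ballpark check for the adapter specs in question (blanking ignored).
    print(1920 * 1080 * 30 / 1e6)  # ~62.2 Mpix/s: a 1080p @ 30 Hz-only adapter falls short
    print(1600 * 900 * 60 / 1e6)   # ~86.4 Mpix/s: what 1600x900 @ 60 Hz needs
    print(1920 * 1080 * 60 / 1e6)  # ~124.4 Mpix/s: a 1080p @ 60 Hz adapter has headroom to spare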

Yes, analog video is complicated. That's why we switched over to digital video.
 
May 29, 2017


Thanks for the reply! But can I get a bit of a walkthrough? I just bought an HDMI to VGA adapter that supports 1920x1200 with a 60 Hz refresh rate, but it works worse than the previous one. With the previous one I was able to use CRU to add a 1600x900 @ 60 Hz resolution, which works fine, but since the native resolution is messed up, many games are not playable in 1600x900 (without black bars) and, surprisingly, no games are playable at 1280x720 (without black bars). I honestly have no idea what to do and I'm really helpless. Even if you could just tell me where to go for help, I'd be glad. Thank you.


UPDATE: Also, I might have figured out the issue, but I really need some confirmation. My old adapter supported 4:3 resolutions, while this new one (1920x1200) is 16:10. My monitor's native resolution is supposedly 16:9. Maybe I should buy a 16:9 VGA adapter?

 

When in 1600x900 @ 60 Hz, hit the auto-adjust button on your monitor. (Sometimes it's buried in the monitor's menu options.) This forces your monitor to measure the input signal and "learn" how to synchronize to this exact signal, so the pixels and borders line up right and any black bars are eliminated.

Repeat when you're displaying 1280x720. You have to do this at each resolution and refresh rate you intend to use for the monitor to "learn" how to map the incoming analog signal to the screen.

Repeat it again if you encounter similar problems with a full-screen game. Sometimes the 1600x900 @ 60 Hz a game uses is not exactly the same as the 1600x900 @ 60 Hz that Windows uses, and your monitor needs to learn the game's variant as well.

As I said, analog video is very complicated. The auto-adjust button handles about 4 different settings you used to have to adjust manually by turning knobs back in the old days.

UPDATE: Also, I might have figured out the issue, but I really need some confirmation. My old adapter supported 4:3 resolutions, while this new one (1920x1200) is 16:10. My monitor's native resolution is supposedly 16:9. Maybe I should buy a 16:9 VGA adapter?
It's not the aspect ratio. The adapter manufacturer has to program in support for the exact resolution and refresh rate (horizontal and vertical scan rates and pixel clocks). If they didn't program in 1600x900 @ 60 Hz, the adapter won't be able to do it.
 
May 29, 2017


Alright, thanks for all the info. I've actually tried the auto-adjust, but with no luck. I just have one last question and then I'll stop bothering you: how is the native resolution of a monitor determined? My monitor's native resolution is supposed to be 1600x900 @ 60 Hz (16:9), but with the old adapter my native resolution was 1280x1024 (4:3), and with the new one it's 16:10. Which adapter do I buy to make sure my native resolution stays 1600x900?

 
The monitor's native resolution is how many pixels its panel has. So your monitor being 1600x900 native means the screen panel physically has 1600x900 pixels.

With analog VGA, there's no way for the GPU (or adapter in your case) to determine the monitor's native resolution. A few monitors report their current resolution to the GPU, but those are relatively rare (Apple pushed for this back in the 1990s). What's more common (on VGA monitors) is for the monitor to report its max horizontal and vertical sync speeds. My guess would be your monitor is actually capable of 1600x900 @ 70 Hz, and the adapter misinterpreted that to mean 1050 lines of vertical resolution @ 60 Hz (900*70/60 = 1050). And the closest matches to that vertical resolution were 1280x1024 (5:4) and 1680x1050 (16:10). Both were old standard resolutions.
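
That guess is just the monitor's reported line capability restated at a different refresh rate, roughly:

    # Sketch of the guess above: the same horizontal-scan budget restated at 60 Hz.
    print(900 * 70 / 60)  # 1050.0 -> nearest old standard modes: 1280x1024 (5:4), 1680x1050 (16:10)
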

The exact resolution is generated by going through that whole complicated vertical sync speed, horizontal sync speed, pixel clock thing. The adapters were probably making their best guess, and counting on Windows to offer and you to select different resolutions. If Windows doesn't offer 1600x900 as an option, adding it with CRU is pretty much the only way (it's possible to add it by hand, but it's a huge headache - I've done it before - and can destroy your monitor if you use the wrong settings. CRU us much safer). If the game doesn't offer it, then I dunno what else you can do. Some games let you add a custom resolution by manually editing some config files.