Out of range error from monitor

Apr 3, 2018
Hi guys,

So I just bought this new system:

Asus H110M-R
i3-7100
8 GB DDR4
GTX 1050 Ti

I used to get a resolution of 1600×900 with my old HD 5450, but now the max resolution I can get is 1440×900. When I set the resolution to 1600×900, the screen turns off and says:
Out of range

Note: My monitor does not have an HDMI port, so I used an HDMI-to-VGA converter.
 
What is the native resolution of the monitor?

Most HDMI-to-VGA adapters need to be powered: HDMI is a digital signal and VGA is analog, so the signal has to be actively converted, not just passed through on different pins the way DVI-I to VGA can be. Unfortunately, the GTX 10xx series cards dropped DVI-I support and are purely digital, offering only HDMI, DisplayPort, and DVI-D now.

So if the converter is not powered, or it's expecting a digital input and getting analog instead (you can't run a VGA>HDMI adapter backwards; only an HDMI>VGA adapter will work here), it'll mess things up.
 
Can't say. Looking online, there are multiple adapters of both kinds, HDMI>VGA and VGA>HDMI, and you need to be certain you actually have an HDMI>VGA one, since that's the direction of signal flow from card to monitor.
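If you do have a proper HDMI>VGA adapter, one other thing worth checking is its maximum pixel clock: many such converters are rated for up to 1920×1080 at 60 Hz (roughly a 165 MHz pixel clock). As a rough sanity check (the blanking figures below are generic CVT-style ballpark values, not real timings from your monitor's EDID), 1600×900 at 60 Hz should sit well under that ceiling:

```python
# Rough pixel-clock estimate for a display mode, using generic
# CVT-style blanking overhead (an approximation for illustration,
# not actual EDID timings from any specific monitor).
def approx_pixel_clock_hz(width, height, refresh_hz,
                          h_blank_frac=0.30, v_blank_lines=34):
    """Estimate the pixel clock a mode needs, in Hz.

    h_blank_frac and v_blank_lines are assumed ballpark values for
    horizontal and vertical blanking.
    """
    h_total = width * (1 + h_blank_frac)   # active pixels + horizontal blanking
    v_total = height + v_blank_lines       # active lines + vertical blanking
    return h_total * v_total * refresh_hz

clk = approx_pixel_clock_hz(1600, 900, 60)
print(f"1600x900@60 needs roughly {clk / 1e6:.0f} MHz")
```

That lands around 115-120 MHz, comfortably below what a 1080p-rated converter should handle, so raw bandwidth probably isn't the blocker; more likely the adapter or driver simply isn't advertising that mode to Windows.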

Also, if the monitor came with a driver disc, or you can find the drivers online, that might help: your new PC may not have a specific driver entry for that resolution, since the card is digital-only rather than digital/analog.

I'd not "just live with it" until you've exhausted every possibility. Sometimes there's no quick and easy fix; it just takes some time and frustration.