Can't Go Above 1024 x 768 Over HDMI on Windows 10


IanS95

Reputable
Nov 5, 2015
Hello all, I have a strange issue I'm hoping you can help with.

I just bought an LG 27MP48HQ-P 27" monitor, which has VGA and HDMI inputs and a maximum resolution of 1920 x 1080. I connected it to my Windows 10 desktop via the HDMI port of my Gigabyte Windforce GTX 970 graphics card. The resolution was stuck at 1024 x 768, but I figured I could install drivers for it. The drivers supplied by LG on the CD did not work: the installer said the driver was installed, but Device Manager still showed the Generic Non-PnP Monitor driver. I also tried downloading the driver from LG's website for that exact model, with the same result. It's a strange driver setup: you run an executable and pick the model, then a popup just says "driver installation successful."

I've tried manually updating in Device Manager by browsing to the driver file and installing it, but that makes no difference. Rebooting, searching online through Device Manager, and uninstalling and reinstalling the driver in Device Manager don't work either. I'm getting really ticked at this; it should be simple, but it isn't. Any help would be greatly appreciated! :)
 
Hello... Monitors don't really use drivers; they're passive devices. Typically the "driver" just adds a name to the PnP monitor listing in Device Manager, but it can also provide additional screen resolution options to the OS. Any time you load something like this in Windows, it's best to completely power down the machine and reboot before the changes take effect.

Basically, the master display-settings app for the OS is: right-click the Desktop > Screen resolution > Resolution.

Typically all you need to do is set the Nvidia Control Panel app for performance; no other settings changes are needed.

 

IanS95

Reputable
Nov 5, 2015
Well, I did try rebooting and "cold booting" with the monitor connected via HDMI, with stock and updated drivers, and it only listed 1024 x 768 and 800 x 600 as resolution options.
 
Hello... OK, now click the blue "Advanced settings" link, then Adapter > List All Modes. Can you see and set 1920 x 1080, True Color (32 bit), 60 Hertz there?

This is in the "Change the appearance of your displays" window I showed you in the previous post: right-click the Desktop > Screen resolution.
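If the dialog still tops out at 1024 x 768, it can also help to dump the raw mode list the adapter reports. Here's a rough Python sketch (standard-library ctypes, Windows only) that prints the same list as the "List All Modes" button; if 1920 x 1080 never appears here, the OS simply isn't being offered that mode:

```python
# Rough sketch: enumerate every display mode Windows exposes for the
# adapter/monitor pair driving the current desktop (the same list as
# the "List All Modes" button in Advanced display settings).
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName",         wintypes.WCHAR * 32),
        ("dmSpecVersion",        wintypes.WORD),
        ("dmDriverVersion",      wintypes.WORD),
        ("dmSize",               wintypes.WORD),
        ("dmDriverExtra",        wintypes.WORD),
        ("dmFields",             wintypes.DWORD),
        ("dmPositionX",          wintypes.LONG),   # display half of the union
        ("dmPositionY",          wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor",              ctypes.c_short),
        ("dmDuplex",             ctypes.c_short),
        ("dmYResolution",        ctypes.c_short),
        ("dmTTOption",           ctypes.c_short),
        ("dmCollate",            ctypes.c_short),
        ("dmFormName",           wintypes.WCHAR * 32),
        ("dmLogPixels",          wintypes.WORD),
        ("dmBitsPerPel",         wintypes.DWORD),
        ("dmPelsWidth",          wintypes.DWORD),
        ("dmPelsHeight",         wintypes.DWORD),
        ("dmDisplayFlags",       wintypes.DWORD),
        ("dmDisplayFrequency",   wintypes.DWORD),
        ("dmICMMethod",          wintypes.DWORD),
        ("dmICMIntent",          wintypes.DWORD),
        ("dmMediaType",          wintypes.DWORD),
        ("dmDitherType",         wintypes.DWORD),
        ("dmReserved1",          wintypes.DWORD),
        ("dmReserved2",          wintypes.DWORD),
        ("dmPanningWidth",       wintypes.DWORD),
        ("dmPanningHeight",      wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

# Passing None as the device name means "the display device the
# calling thread's desktop is on".
i = 0
modes = set()
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    modes.add((mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w} x {h} @ {hz} Hz")
```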
 

IanS95

Reputable
Nov 5, 2015


I have not tried a different HDMI cable yet; I will have to get my hands on one. It was just an AmazonBasics one I picked up for $2.99. Right now I'm using a DVI-to-VGA adapter and running VGA into the monitor, and that DOES let me select 1080p. The resolution is at least nice, but the colors are quite drab, and I know I'm probably missing a lot of quality this way.
 

IanS95

Reputable
Nov 5, 2015


This is my result; the highest resolution listed is still only 1024 x 768: http://imgur.com/a/hBquC

 
Hello... Make sure your HDMI cable's jacket is labeled HDMI 2.0, as spec'd/listed for your monitor. Cables can look the same on the outside, but there are internal manufacturing and electrical/communication differences. It could be that simple: the OS is not getting the proper communication from the monitor to see it. That $2.99 HDMI cable (what version is it?) could be causing all of your current frustration.

It seems your monitor "driver/info" file didn't unlock the OS adapter modes. Since you are on Windows 10, I would suggest checking the suggested hardware updates in Windows Update to see whether the OS wants something from Microsoft to unlock or finish the driver install.

Have you tried installing the "driver/info" file from inside Device Manager, on the PnP monitor entry that's listed, so the OS associates it with that hardware device?
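If the Device Manager wizard keeps claiming success without changing anything, you can also stage the .INF from an elevated prompt with pnputil, which ships with Windows 10. A minimal sketch; the path and file name below are placeholders, not LG's actual file names:

```python
# Minimal sketch: stage and install a monitor .INF with pnputil instead
# of the Device Manager wizard. Run from an elevated (administrator)
# console session.
import subprocess

# Placeholder path: point this at whatever .inf LG's download extracts.
inf_path = r"C:\Drivers\LG\LG_27MP48HQ.inf"

# /add-driver stages the package in the driver store; /install also
# installs it on any matching devices in one step.
subprocess.run(["pnputil", "/add-driver", inf_path, "/install"], check=True)
```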
 

IanS95

Reputable
Nov 5, 2015


It actually shows up as LG IPSFULLHD (Analog). I don't think that's from the driver I tried installing, because I deleted it, rescanned for hardware changes, and it still comes back.
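For anyone following along, here's a quick Python sketch (standard-library winreg) that dumps every monitor instance Windows has enumerated; it's useful for checking whether the HDMI connection ever produced a digital entry next to the analog one:

```python
# Quick sketch: list every monitor instance Windows has enumerated
# under the DISPLAY enum key, with its description. LG monitors
# usually enumerate with a "GSM..." hardware ID (Goldstar's PnP ID).
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for i in range(winreg.QueryInfoKey(display)[0]):
        model = winreg.EnumKey(display, i)
        with winreg.OpenKey(display, model) as model_key:
            for j in range(winreg.QueryInfoKey(model_key)[0]):
                instance = winreg.EnumKey(model_key, j)
                with winreg.OpenKey(model_key, instance) as inst_key:
                    try:
                        desc, _ = winreg.QueryValueEx(inst_key, "DeviceDesc")
                    except OSError:
                        desc = "(no description)"
                    print(f"{model}\\{instance}: {desc}")
```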
 

IanS95

Reputable
Nov 5, 2015


Yes sir, I did: Update Driver > browse manually > Have Disk, then selecting the file I downloaded from LG. After doing that, would I need to shut down completely and then boot up? I know I restarted; I don't think I "cold booted."

 
Hello... You need all of the properly spec'd cables for your hardware. This is fast digital communication between devices; you can't cut corners or cheat here. If the spec says HDMI 2.0, then you need an HDMI 2.0 cable and the communication that comes with that cable spec. They make them all look the same on the outside, and they all plug right in, but there is more going on underneath the jacket.

See the chart on the page below for all the different communication types and modes across the different HDMI versions:
https://en.wikipedia.org/wiki/HDMI

The same goes for DVI and DisplayPort: the cables all look alike, but the internals differ between versions. The idea was to keep the physical connector common, so manufacturers didn't have to redesign boards or mounting for new connectors, while the wiring/signaling inside the cable could change.

 

IanS95

Reputable
Nov 5, 2015


Yep, I definitely have Nvidia drivers installed, GTX 970. I just installed them a couple days ago. I can post a screenshot if you want but it isn't the Intel HD Graphics :p
 

Natsukage

Estimable
Oct 28, 2016
Okay. First of all, it doesn't matter whether your cable is HDMI 1.0 or HDMI 2.0; Windows should detect the monitor's max resolution correctly either way. The wires have absolutely no say in this.

Have you tried fiddling with the Nvidia Control Panel instead of the Windows controls? You can make custom resolutions in the Nvidia software if Windows doesn't want to play nicely: Nvidia Control Panel -> Display -> Change Resolution.

There is always a possibility that either your monitor or your video card has issues with HDMI. Do you have a spare monitor to compare with?
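As a quick way to tell whether the GPU driver itself is refusing the mode, you can ask it to validate 1920 x 1080 @ 60 Hz without actually switching to it. A rough Python sketch using the Win32 ChangeDisplaySettings call with the CDS_TEST flag; note this only tests what the driver will accept, it is not the same as creating an Nvidia custom resolution:

```python
# Rough sketch: ask the graphics driver to validate a 1920x1080 @ 60 Hz
# mode without switching to it (CDS_TEST). Windows only, stdlib ctypes.
import ctypes
from ctypes import wintypes

# Truncated DEVMODEW: stops at dmDisplayFrequency. dmSize tells the API
# how much of the structure we provide, so the trailing ICM/panning
# fields can be omitted.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName",       wintypes.WCHAR * 32),
        ("dmSpecVersion",      wintypes.WORD),
        ("dmDriverVersion",    wintypes.WORD),
        ("dmSize",             wintypes.WORD),
        ("dmDriverExtra",      wintypes.WORD),
        ("dmFields",           wintypes.DWORD),
        ("dmUnion",            ctypes.c_byte * 16),  # printer/display union
        ("dmColor",            ctypes.c_short),
        ("dmDuplex",           ctypes.c_short),
        ("dmYResolution",      ctypes.c_short),
        ("dmTTOption",         ctypes.c_short),
        ("dmCollate",          ctypes.c_short),
        ("dmFormName",         wintypes.WCHAR * 32),
        ("dmLogPixels",        wintypes.WORD),
        ("dmBitsPerPel",       wintypes.DWORD),
        ("dmPelsWidth",        wintypes.DWORD),
        ("dmPelsHeight",       wintypes.DWORD),
        ("dmDisplayFlags",     wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

CDS_TEST = 0x00000002                   # validate only, don't change the mode
DISP_CHANGE_SUCCESSFUL = 0
DM_PELSWIDTH, DM_PELSHEIGHT = 0x00080000, 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000

mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)
mode.dmPelsWidth, mode.dmPelsHeight = 1920, 1080
mode.dmDisplayFrequency = 60
mode.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY

ret = ctypes.windll.user32.ChangeDisplaySettingsW(ctypes.byref(mode), CDS_TEST)
if ret == DISP_CHANGE_SUCCESSFUL:
    print("Driver says it would accept 1920 x 1080 @ 60 Hz")
else:
    print(f"Driver rejected the mode (return code {ret})")
```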
 

IanS95

Reputable
Nov 5, 2015


Yep, I checked in the Nvidia Control Panel as well, but it was the same story: a max of 1024 x 768. I appreciate all the hints, and I've tried nearly all of them. I'm in IT and can usually troubleshoot things, but this has me stumped.

I hooked up my TV, which has a max resolution of 1360 x 768, and Windows maxed that out no problem over HDMI from my card. So maybe it's my monitor?

 

Natsukage

Estimable
Oct 28, 2016

If it works correctly on another monitor/TV, then yes, your issue is the monitor. That rules out any HDMI or driver problem on the PC side.

The monitor's internal electronics (its scaler board) are probably defective and will only negotiate 1024 x 768. I have seen this issue before, but it's pretty rare. You can try upgrading the monitor's firmware if possible. If not, return it if possible.
 
Solution
Hello... What gets me is that the spec sheet shows HDMI 2.0 as the input type. If this were a non-issue, why would they list it that way? They could have just said "HDMI input." I would suggest calling tech support about this fine-print item on their spec sheet and your current situation. It will cost you money to ship this item one way for an RMA, and if you're wrong, that money comes out of your pocket.

It could very well be a defective monitor, but I've installed a lot of equipment, and these spec-sheet specifics matter to the engineer when ordering the other hardware needed to get a system running.

Testing another monitor with the same cable proves nothing about your new monitor and a possibly wrong-spec cable connection to it; it only validates the GPU, OS, cable, and the spare monitor hooked up at the time.

Out of curiosity, you could also try the monitor, software, and cable on another GPU/system/OS install, if you have a nearby friend's or family member's system.

Monitors, just like everything else (TVs included), are ever-changing: connectivity, wireless, data ports, networking, communications. ARM processors are being put into just about everything. It's a brave new world of new connectivity standards and specs; you can't upgrade one item any more without upgrading the things it's connected to.


 
Usually with a bad HDMI cable you'll either get no signal or a "sparkly" picture. With a Category 1 (Standard) vs Category 2 (High Speed) cable it sometimes works, or only works at a lower resolution, or at a higher resolution with a low refresh rate. I thought the detection over a Category 1 cable might possibly be causing this, but it doesn't look like it. For whatever reason, the monitor isn't being correctly identified over HDMI.
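One way to check the identification directly: Windows caches each monitor's raw EDID in the registry, and the first 18-byte detailed timing descriptor carries the native resolution. A rough Python sketch (stdlib winreg; it assumes the preferred timing is the first descriptor, which holds for almost every monitor):

```python
# Sketch: pull the raw EDID Windows cached for each monitor out of the
# registry and decode the native (preferred) resolution from the first
# detailed timing descriptor. If the HDMI entry's EDID already says
# 1920 x 1080, detection worked and the problem is further up the stack.
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def native_res(edid: bytes):
    dtd = edid[54:72]                    # first 18-byte timing descriptor
    w = dtd[2] | ((dtd[4] & 0xF0) << 4)  # horizontal active pixels
    h = dtd[5] | ((dtd[7] & 0xF0) << 4)  # vertical active lines
    return w, h

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for i in range(winreg.QueryInfoKey(display)[0]):
        model = winreg.EnumKey(display, i)
        with winreg.OpenKey(display, model) as mkey:
            for j in range(winreg.QueryInfoKey(mkey)[0]):
                inst = winreg.EnumKey(mkey, j)
                try:
                    with winreg.OpenKey(mkey, inst + r"\Device Parameters") as p:
                        edid, _ = winreg.QueryValueEx(p, "EDID")
                except OSError:
                    continue               # instance has no cached EDID
                if len(edid) < 72:
                    continue
                w, h = native_res(bytes(edid))
                print(f"{model}\\{inst}: native {w} x {h}")
```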
 

IanS95

Reputable
Nov 5, 2015
It is very strange. I hate to think that a brand-new monitor simply doesn't work over HDMI; that's plain stupid, especially since it ONLY has VGA otherwise. I've never compared them, but I suppose DVI converted to VGA is pretty poor quality compared to straight HDMI? I've tried gaming and such, and the quality looks OK, although the blacks/grays seem a bit off. Sounds like I'm stuck between RMA'ing this and dealing with mediocre quality... meh. Thank you very much for all the help, guys!
 

Natsukage

Estimable
Oct 28, 2016
I suggest trying another monitor or TV that uses 1080p to confirm that your card outputs 1080p correctly over HDMI before RMAing the monitor. If it does, then the monitor is without a doubt the culprit.

To be exact, the DVI-to-VGA output of the GTX 970 isn't converted at all; it is a direct VGA output on the card's DVI-I connector.
For a good VGA signal you need a quality adapter and a quality VGA cable (not a cheap thin one); 1080p needs both to look good. That said, black/gray levels should be unaffected by the input type, so it probably looks that way over HDMI too. Try adjusting them in the monitor's menu (contrast/brightness/backlight).

Most modern monitors don't have the quality parts and converters needed for a good VGA input, though; they're made with HDMI in mind first.
 

IanS95

Reputable
Nov 5, 2015


OK, so this is where the plot really thickens. I hooked my monitor up to my Xbox 360 via HDMI and changed the display to 1080p, no problem! So it's definitely a software problem... now I just have to figure out what. Unless, of course, it's the card, in which case I'm toast, since I bought it used.

 

Natsukage

Estimable
Oct 28, 2016

It could still be the monitor, though, because unlike a PC, the Xbox 360 doesn't care what it outputs. If the picture isn't there because the monitor doesn't support the resolution, it'll simply revert after 30 seconds.

Have you tried custom resolutions in the Nvidia software, as well as disabling the "hide modes" box, as madmatt suggested?

 