Question: Why is my secondary monitor detected as a TV rather than an analogue display?

dan2804

Hi guys,

So I'm a bit stumped here.

I have a Windows XP machine (I know it's outdated, but it's a special machine that runs hardware that won't run on anything but XP) that was using a passive 9400 GT.

It has two displays connected: one to the VGA port and one to the DVI port (although that one is also a VGA display, connected through an adapter).

I upgraded to a Zotac GT 710 1GB as it runs a lot cooler than the 9400 GT.

I plugged in my monitors, updated the drivers, and then I hit my problem.

The monitor connected to the VGA port is detected as a TV rather than an analogue display, and as a result it looks really blurred compared to the display connected via the DVI port, which looks very sharp.

I've tried swapping which port each display connects to, and it seems that whichever of the two monitors is on the VGA port is the one that identifies as a TV.

Any ideas?

I've tried everything I can think of.
I ran DDU and did a complete clean driver install, changed out the cables, and ran a forced detection in the NVIDIA Control Panel. I just can't get it to work properly.

As soon as I put the 9400 GT back in, the displays are detected fine.

Help!

Thanks

Dan
 

Deleted member 1272431

How about just accepting it as a TV and trying to change the settings? When you say one is sharp and the other isn't, I'm thinking the resolution isn't set right. The adapter could be confusing the GPU, but whether it's detected as a monitor or a TV, both should look fine. Try comparing the settings for both monitors. Also, with only the "blurred" one plugged in, is it any better? Does it still detect as a TV?
 

dan2804

Hi,

The resolution is set correctly at 1024x768 (the program needed for the hardware doesn't work at any other resolution, and these monitors shipped with the hardware).

The monitor that connects via DVI with a VGA-to-DVI adapter works totally fine (both work fine when connected via the DVI port).

The monitor connected directly to the VGA port is the one that detects as a TV (no adapter involved there, just straight VGA to VGA). Either monitor identifies as a TV when connected to the VGA port.

The monitor that identifies as a TV has no custom resolution settings and only gives me four resolutions to choose from, whereas the one connected to the DVI port gives me full access to them.

It appears that no matter which way round I plug the monitors in, or even with only one monitor connected, whichever display is on the VGA port always identifies as a TV.

I have no options for overscan, sharpness, etc. for the "TV" display, so I can't make it look any better than it is, and it's not easy on the eyes.

It's weird that putting the 9400 GT back in (which was itself a previous non-first-party upgrade) works fine, but the GT 710 doesn't.

I'm guessing that either there's something weird with Zotac's card, the latest NVIDIA XP drivers are the problem, or the card itself is faulty.
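
One thing I still want to rule out is whether the card is actually reading the monitor's EDID over the VGA cable at all, since a missing EDID would explain both the "TV" detection and the tiny list of resolutions. My assumption is that XP keeps whatever EDID the driver reads under the usual Enum\DISPLAY key in the registry, so a rough check from a command prompt would be something like:

reg query "HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY" /s | find "EDID"

If the entry for the VGA-connected monitor has no EDID value while the DVI one does, that would point at the card or driver not reading DDC over the analogue output rather than at the monitors themselves.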