So, I've always used analog outputs on my video cards. Every time I've tried connecting a computer through HDMI (with several different TVs and computers), I get black borders around the screen, horrible colors, and horrible aliasing on absolutely everything.
I just installed my new R9 290X, which has no analog outputs, so I'm using HDMI atm. The newer Catalyst Control Center has more settings for TVs than my old one did, so I was able to fill the screen and get rid of the borders... but everything is still aliased, and the colors are overly bright and saturated. It is basically excruciating trying to read any sort of text now, and the screen is just hard to look at, period.
I tried re-adjusting ClearType, but it didn't help with the aliasing on text at all. I just want to check here and see if there's some sort of setting I'm missing. Any solutions or previous experiences would be greatly appreciated.
If this is all HDMI has to offer, I will be buying a DisplayPort to DVI-I adapter and using the good old trusty VGA input on my TV.
The TV is a 46" Insignia LED 1080p 120Hz, and it has always worked perfectly through VGA.
Windows, Catalyst, and the TV all say 1080p @ 60Hz, so I'm not sure what I'm missing here, or why the desktop requires upscaling to fill the screen.