Using 32'' HDTV as Monitor, 1920x1080 is blurry.

SylvrrFoxx

Nov 29, 2014
Ok, so I recently upgraded my living room TV to a 60 inch and decided to use my old 32 inch for a monitor. My problem is: 1920x1080 is extremely blurry. When I check my display settings, it says 1360x768 is the recommended display for this monitor. I'm running a Radeon R9 270 with a LD-3237 TV as the monitor. Normally, I would assume the TV just can't use that resolution, but I know it can. I play my games at 1920x1080 and when I hook my laptop to it with an HDMI cord, it shows up fine at that resolution too. My drivers are fully updated and it's run using an HDMI cord directly into the video card. Any ideas on why it won't use that resolution? My smaller 26 inch did it just fine.
 
I changed the overscan to 0% and it made it a little less blurry, but didn't quite fix it. As for another input, I have VGA on the back of the TV and DVI on the card. I don't own either cord at this time however. So to try that solution I'd need a VGA cable and a DVI adapter for the one side, correct?
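A side note on why the overscan setting matters here: when the GPU compensates for TV overscan by "underscanning," it shrinks the desktop before the TV scales it back up, so pixels get resampled twice. A minimal sketch of that arithmetic in plain Python (the `underscan_size` helper is hypothetical, not part of any AMD tool):

```python
# Sketch of why GPU overscan/underscan compensation adds blur: the GPU
# shrinks the 1920x1080 desktop by the underscan percentage, then the TV
# scales that smaller image back up, so pixels are resampled twice.

def underscan_size(width, height, percent):
    """Frame size after the GPU applies a hypothetical underscan border."""
    factor = 1 - percent / 100
    return round(width * factor), round(height * factor)

print(underscan_size(1920, 1080, 5))   # with 5% underscan -> (1824, 1026)
print(underscan_size(1920, 1080, 0))   # 0% = pixel-for-pixel, no extra scaling
```

At 0% the frame maps pixel-for-pixel, which is why dropping overscan compensation to 0% made the picture a little less blurry.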
 
My roommate actually has the same setup and I borrowed his cables. With the DVI, it only allows a max of 1366x768, not even the option for 1920x1080. So I guess that means this TV just can't support it?
 
Your 32" HDTV will likely accept a 1920x1080 input from your graphics card, but downscale it to its native resolution. This is likely what is causing the blurriness. Try setting your graphics card's output resolution to your HDTV's native resolution and see if that resolves the issue.

-Wolf sends
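The downscaling Wolf describes can be sketched with some back-of-the-envelope math (plain Python, not tied to any AMD tooling):

```python
# Rough illustration of why feeding 1920x1080 into a 1360x768 native panel
# blurs text: each panel pixel has to blend a non-integer number of source
# pixels, so sharp one-pixel edges get averaged across neighbors.

source = (1920, 1080)   # resolution the PC sends
native = (1360, 768)    # what the panel can physically show

scale_x = source[0] / native[0]
scale_y = source[1] / native[1]

print(f"Each panel pixel covers ~{scale_x:.2f} x {scale_y:.2f} source pixels")
# A non-integer ratio smears single-pixel text strokes; an integer ratio
# (e.g. exactly 2.0) would keep edges aligned and crisp.
print("Integer ratio (stays sharp):", scale_x.is_integer() and scale_y.is_integer())
```

This is also why games look acceptable while the desktop looks bad: 3D scenes tolerate resampling far better than one-pixel-wide text does.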
 
How exactly would I do that? I'm not too familiar with that process.

Oh and small update, I can play my games at 1920x1080, it's only when I'm not playing a game that it is blurry.
 
If your TV cannot display a 1920x1080 resolution, then your games are not displaying at 1920x1080 (regardless of what they report). Again, the HDTV is downscaling the signal. You can change your graphics card's output resolution through Catalyst Control Center.

-Wolf sends
 
Sorry for the stupid questions, but where exactly is that option in CCC? All I see under Desktop Properties is the Desktop Area, which is set to the native resolution. I don't see any options pertaining to my graphics card.
 


Right-click on the desktop >> Screen Resolution >> set to the TV's optimal resolution using the slider.

 
Right-click on the desktop >> Screen Resolution will give your screen the best supported resolution, but you may have to drop the sync rate down to 30 Hz, as the TV may only be 1080i. (After checking the specs, it is only 1080i, which means you will need to drop the refresh rate to get 1080i to work.)

When you try 1080p, the screen goes to the highest refresh rate, 60 Hz, and then the max resolution it will support at that frequency, which is roughly 1360x768, or 720p with overscan. Drop it to 30 Hz and you should get 1920x1080 regardless of connection.
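The reasoning behind the 30 Hz suggestion can be sketched with raw pixel rates (active pixels only; real HDMI timings also include blanking intervals, which this ignores):

```python
# Back-of-the-envelope pixel rates showing why a 1080i-only TV needs the
# refresh rate halved: 1080i delivers 30 full frames per second, roughly
# the same data rate as 720p at 60 Hz, while 1080p60 needs double that.

def pixel_rate(width, height, frames_per_sec):
    """Active pixels per second for a given mode (blanking ignored)."""
    return width * height * frames_per_sec

modes = {
    "1080p @ 60 Hz": pixel_rate(1920, 1080, 60),
    "1080i (30 full frames/s)": pixel_rate(1920, 1080, 30),
    "720p @ 60 Hz": pixel_rate(1280, 720, 60),
}

for name, rate in modes.items():
    print(f"{name}: {rate / 1e6:.1f} Mpx/s")
```

A TV whose electronics top out around the 720p60 data rate can still show a full 1920x1080 frame if you halve the frame rate, which is exactly the 1080i-at-30 Hz trade-off described above.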
 
Still no luck finding a solution here... I feel like I'm missing something obvious. I know my monitor supports it. I launched WoW last night, and when I did, the screen went black for a resolution change and it said 1920x1080 in the corner. Once in game, I changed the resolution to 1360x768 and everything got much bigger.