Fitting resolution into TV frame...

toneekay

Okay, so this is the first time I've actually installed a DVI-HDMI cord from my GTX 550 Ti to my 32" HDTV. The problem is that every time I try a different resolution, the screen doesn't fit onto my TV frame. How do I fix this? Also, the higher the resolution I go, the more the edges on everything shake and vibrate... Please help me!
 
not sure what the cause of the shaking/vibrating might be,
but it sounds to me like the TV not "fitting" a standard resolution is due to the TV's overscan. on many HDTVs the screen doesn't actually display the full 720p/1080p resolution; the edges get cut off.
to fix this, set your PC's resolution to 1920x1080 (or 1280x720 if the TV is 720p), then use the "adjust desktop size and position" settings in the nvidia control panel to crop the desktop down to match the viewable area on the TV screen.
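if you want to sanity-check where the crop should roughly land before dragging the sliders, it's just scaling by the overscan percentage. quick sketch in Python (the 5% figure is a guess; sets vary, usually somewhere around 2-5%):

    # visible desktop area after the TV eats its overscan margin.
    # overscan_pct is an assumption here; check your own set.
    def visible_area(width, height, overscan_pct):
        scale = 1 - overscan_pct / 100
        return round(width * scale), round(height * scale)

    print(visible_area(1920, 1080, 5))   # -> (1824, 1026)
    print(visible_area(1280, 720, 5))    # -> (1216, 684)

so on a 1080p set with 5% overscan you'd expect the crop to end up somewhere near 1824x1026.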
 


Well, whenever I have the DVI-HDMI cable in, the native res is 1280x768 @ 29Hz. Whenever I put the DVI-VGA cable in, the native res is 1366x768 @ 60Hz.




I figured it was the overscan deal, but whenever I have the HDMI cord in, my TV doesn't let me auto-adjust the screen. Also, the refresh rate ALWAYS drops to 29Hz and won't let me change it to anything above 30Hz, but whenever the VGA cord is in, it's fixed at 60Hz and runs with NO PROBLEMS.

I'd really like to run the HDMI cord and get a higher resolution, but it's so darn complicated!
 
Is your TV 720p or 1080p? It sounds as if it's only 720p/1080i (based on the "native resolution" detected by your computer over VGA), meaning the highest resolution you could achieve WOULD be 1366x768 @ 60Hz without switching to an interlaced refresh rate, such as 29/30Hz, which in turn would degrade the picture quality considerably and add the flickering/vibrating you mentioned. So it sounds like, with your particular setup, the VGA cable is the best option, and there wouldn't be any real benefit to using HDMI.
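Rough math, ignoring blanking intervals, to show why those are the trade-offs (a back-of-envelope Python sketch, not exact HDMI timing):

    # raw pixel rates for the modes in question; real timings add
    # blanking, but the ratios are what matter here.
    modes = [
        ("1366x768 @ 60Hz progressive", 1366, 768, 60),
        ("1920x1080 @ 30Hz (1080i)",    1920, 1080, 30),
        ("1920x1080 @ 60Hz (1080p)",    1920, 1080, 60),
    ]
    for name, w, h, hz in modes:
        print(f"{name:30s} ~{w * h * hz / 1e6:6.1f} Mpx/s")

That prints roughly 62.9, 62.2 and 124.4 Mpx/s: 768p/60 and 1080i/30 cost the TV about the same bandwidth, while true 1080p/60 needs double, which is why an older panel offers the first two but not the third.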

Also, check the wide mode/picture size settings on your TV. Most default to a "Normal" setting designed for broadcast HD content, which is what causes part of the screen to get cut off. If you switch it to a different mode, such as "Full" or "Full Pixel", it will display the entire signal without cutting anything off, and without the need for overscan settings in your Nvidia control panel.
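And if you're curious why the PC only offers 1280x768 @ 29Hz over HDMI, you can dump the TV's EDID (on Windows a tool like Monitor Asset Manager can save it; on Linux it's under /sys/class/drm) and decode the preferred mode the TV advertises. A rough Python sketch; the dump filename is hypothetical, and note that for interlaced modes the vertical count in the descriptor is per field:

    # parse the first 18-byte detailed timing descriptor (offset 54),
    # which holds the display's preferred/native mode.
    def preferred_mode(edid):
        d = edid[54:72]
        pclk = int.from_bytes(d[0:2], "little") * 10_000   # pixel clock, Hz
        h_active = d[2] | ((d[4] & 0xF0) << 4)
        h_blank  = d[3] | ((d[4] & 0x0F) << 8)
        v_active = d[5] | ((d[7] & 0xF0) << 4)
        v_blank  = d[6] | ((d[7] & 0x0F) << 8)
        interlaced = bool(d[17] & 0x80)
        refresh = pclk / ((h_active + h_blank) * (v_active + v_blank))
        return h_active, v_active, refresh, interlaced

    with open("tv_hdmi.bin", "rb") as f:   # hypothetical EDID dump
        w, h, hz, il = preferred_mode(f.read())
    print(w, "x", h, "@", round(hz, 1), "Hz",
          "interlaced" if il else "progressive")

If that reports something like 1280x768 at ~29Hz, the TV itself is advertising that mode over HDMI and the graphics card is just obeying it.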
 
Yeah, the box says it's a 1080p HDTV; it's a cheap brand though (Vizio, one of the very first versions). I'm going to try the screen modes you mentioned to see if it helps. So far nothing has worked and I'm getting frustrated lol.

Thanks!
 


I had a similar problem. My computer is old and running XP. I was feeding a TV (see below) with a simple Y connector, which dropped the brightness whenever both the monitor and the TV were connected. When I bought a 16:9 monitor, I found it was limited by the on-board graphics to 4:3. I went out and bought a GV-N62-512L card (two ports, VGA & DVI, with a dongle to make both VGA) because I was tired of seeing YouTube on my Panasonic P42X20Z in 4:3 with grey side-bars as well.

I had a lot of trouble because the Panasonic was only showing about 60% of the available image, no matter how I set Windows > Desktop > Properties > Settings to any resolution (some of which caused a no-signal black screen on the TV, which is difficult if it was set as the default monitor!), or used the NVIDIA settings app bundled with the card. After some rebooting in Safe Mode and one System Restore, I downloaded the latest driver from NVIDIA, 6.14.12.8026. It didn't make any difference until I noticed a new option in the NVIDIA Control Panel called "Run multiple display wizard". I gave it a go, and lo and behold, even though I set a conservative 1024x768, suddenly the monitor and TV were pretty much the same and looking more like my monitor's 1920x1080. Changing the resolution in the wizard seemed to have no effect, and both Windows > Desktop > Properties > Settings and the NVIDIA settings app had no knowledge of what had been done and continued to display the futile settings from before! I don't know what the wizard software did, but I like it.

Actually, when I play an .avi file in VLC, the TV is better (shows more picture area and better brightness) than the monitor. Go figure! I will paste this around any forum I can find because I have seen a lot of related posts.
 

 
I had no problem going with a lower resolution, but anything above my native 1366x768 was horrible... I've sold the TV and picked up a really nice 27" Asus monitor with a native resolution of 1920x1080. The picture quality of everything is GREAT, and the movies look AWESOME!