IzzyCraft :
If you are using a composite setup, you want to put the TV on its "unscaled/just scan/overscan" setting.
If you are using RGB/DVI/HDMI and you don't get one of those settings on the TV, you want to match the output signal to the one the TV is trying to use, e.g. 1920x1080 or 1920x1088 as the screen size, so just mess around in your TV settings. Sometimes whether you output 59Hz or 60Hz can matter too, so experiment to get the best picture.
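A minimal sketch of that "match the signal to the panel" idea, assuming you know the panel's native resolution (from the spec sheet or EDID); the mode list, panel size, and helper name below are made up for illustration:

```python
# Minimal sketch: pick the GPU output mode that maps 1:1 onto the panel.
# The mode list and panel size are example values, not from any real TV.

PANEL = (1920, 1080)  # native panel resolution (assumption)

# Modes the graphics card offers for this output (example values).
MODES = [(1920, 1080, 60), (1920, 1088, 60), (1920, 1080, 59.94), (1280, 720, 60)]

def best_mode(modes, panel):
    """Prefer an exact width/height match so no scaling happens at all."""
    exact = [m for m in modes if (m[0], m[1]) == panel]
    # If several refresh rates match, 59.94 Hz vs 60 Hz is worth trying both;
    # some TVs only do a clean 1:1 mapping at one of them.
    return max(exact, key=lambda m: m[2]) if exact else None

print(best_mode(MODES, PANEL))  # -> (1920, 1080, 60)
```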
Right, I should be more specific.
OK, there's a problem with HDTVs: they're designed WRONG. ALL of them.
The problem was underscan/overscan on ANALOG signals for old CRT TVs. Some idiot decided to incorporate overscan into HDTVs rather than address the problem through scaling correction, even though digital signals are capable of doing 1:1 pixel-by-pixel links. That is, an HDTV could have been made like a computer screen, but it wasn't, because somebody decided to mix analog's problems with digital's solutions.
End result: a 720p TV is really something like 13xx x 768 (commonly 1366x768) rather than 1280x720. For some graphics cards, if you set the native resolution of the TV it doesn't work due to overscan correction which shouldn't even be there, and you must instead set 1280x720 and force the graphics card to compensate for the overscan.
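As a rough illustration of why that hurts (the numbers are my own, not from the post), a 1280x720 signal on a 1366x768 panel can't map 1:1:

```python
# Rough numbers showing why a 1280x720 signal on a ~1366x768 panel can't
# map 1:1: the scale factor is non-integer, so each source pixel is smeared
# across parts of two panel pixels.

signal = (1280, 720)
panel = (1366, 768)   # common "720p" panel resolution (assumption)

sx = panel[0] / signal[0]
sy = panel[1] / signal[1]
print(f"horizontal scale: {sx:.4f}, vertical scale: {sy:.4f}")
# horizontal scale: 1.0672, vertical scale: 1.0667
# Neither is 1.0 (or any integer), so single-pixel details get blurred.
```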
Now, if you have something that does work natively, such as 1920x1080 on a true 1920x1080 screen, there's STILL an overscan issue, so you still need to adjust that down to zero in the graphics driver. But because the TV is trying to use something other than "real" 1080p (say, 1900x1000 with overscan), it screws up things like text.
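To put rough numbers on it (my own example figures, not from the post), a typical ~5% overscan on a true 1080p panel looks like this:

```python
# Illustration with made-up but typical figures: ~5% overscan on a 1080p
# panel. The TV crops the outer edges of the 1920x1080 signal and rescales
# the rest to fill the panel, so the mapping is no longer 1:1.

signal = (1920, 1080)
overscan = 0.05  # ~5% total overscan (assumption; varies by set)

visible = (round(signal[0] * (1 - overscan)), round(signal[1] * (1 - overscan)))
scale = signal[0] / visible[0]
print(f"visible source area: {visible[0]}x{visible[1]}")   # ~1824x1026
print(f"stretch factor back to the panel: {scale:.4f}")    # ~1.0526
# A 1-pixel-wide text stroke ends up ~1.05 panel pixels wide, which is why
# text looks soft until scaling is disabled on both the TV and the GPU.
```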
And that's where setting the TV to NO SCALING comes into play.
And everything I just said applies only to HDMI or DVI. Things get even more screwed up when using an analog connection.