There are two ways to connect to an HDTV: VGA (analog) and HDMI or DVI (digital). These have slightly different setup options.
If you use a PC/VGA input, set Windows to the TV's native resolution (typically 1920x1080 or 1366x768).
If you use HDMI/DVI, set the output to either 1280x720 or 1920x1080.
With HDMI/DVI, you also need to adjust the overscan/underscan scaling (in your ATI Catalyst Control Center) so the picture fits the screen exactly and isn't cropped or shrunk.
You do NOT have a resolution limited display. Your setup is incorrect.
*Note that with VGA you are treating your HDTV like a monitor, so if you play a game at 1024x768 the black bars will appear correctly. With HDMI/DVI you pick one resolution, such as 1280x720, and that is your ONLY choice, so non-widescreen games will look stretched. (I believe there are ways to work around this, but I don't have this setup so I can't verify.) The exception is that some TVs have an HDMI-PC input, which makes the HDTV behave like a monitor over HDMI.
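To see what "black bars will be there correctly" means in numbers, here's a small sketch (my own illustration, not tied to any particular TV) of how wide the pillarbox bars come out when a 4:3 game like 1024x768 is scaled onto a 16:9 panel without stretching:

```python
# Hypothetical helper: scale a 4:3 source to fit the panel's height,
# then compute the black bar left on each side of a 16:9 panel.

def pillarbox_bars(src_w, src_h, dst_w, dst_h):
    """Return (scaled_image_width, bar_width_per_side) in pixels."""
    scaled_w = dst_h * src_w // src_h   # preserve the source aspect ratio
    bar = (dst_w - scaled_w) // 2       # leftover width, split left/right
    return scaled_w, bar

# A 1024x768 game on a 1080p panel: image is 1440 wide, 240px bars each side.
print(pillarbox_bars(1024, 768, 1920, 1080))  # -> (1440, 240)
```

Over VGA, Windows and the GPU do this for you; over plain HDMI/DVI the TV just stretches the one fixed resolution to fill the screen.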
**Audio can be a pain through HDMI/DVI; in some cases it's not possible. My 32" Sony is hooked up to my HD5870 with a VGA video cable plus a 3.5mm stereo audio cable, and it works great this way (I use it for video and Xbox 360 controller games).
FYI, all HDTVs take in the video signal at 60Hz. Yes, VIDEO is 30 frames per second (FILM is 24), but except for special cases such as 24p True Cinema (Google it), your DVD player or TV box sends the signal to the TV at 60Hz.
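The way 24 fps film gets fitted into that 60Hz signal is the classic "3:2 pulldown" trick: frames alternately contribute three fields and two fields, so 24 frames become exactly 60 fields per second. A minimal sketch of the cadence (frame numbers standing in for actual pictures):

```python
# Sketch of 3:2 pulldown: each film frame alternately fills 3 or 2
# fields of the 60 Hz interlaced signal, so 12*(3+2) = 60 fields/sec.

def pulldown_32(frames):
    """Map a list of film frames to the field sequence sent at 60 Hz."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2  # A-frame: 3 fields, B-frame: 2
        fields.extend([frame] * repeats)
    return fields

film = list(range(24))        # one second of 24 fps film
fields = pulldown_32(film)
print(len(fields))            # 60 fields -> one second of 60 Hz video
```

This uneven repetition is also why 24p True Cinema modes exist: a TV that can refresh at a multiple of 24 can skip the pulldown and avoid its slight judder.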
The difference between a "1080i" and a "1080p" HDTV is that a 1080i set can't accept a 1080p signal, so any device must send high-def to it as either 1080i or 720p. Broadcast HDTV itself is either 1080i or 720p; there is no 1080p broadcast.
BluRay players should be able to send out all the signals (1080p, 1080i, 720p, 480p, 480i). The BluRay disc stores the movie progressively (1080p), and all HDTVs update their screens progressively. So if your TV can NOT accept 1080p, it gets 1080i instead. What does that mean? The BluRay player will INTERLACE that 1080p signal and send it to your TV as 1080i; your HDTV then de-interlaces it back to 1080p and scales it to fit your screen (many 32" HDTVs are 1366x768).
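That 1080p → 1080i → 1080p round trip can be sketched in a few lines (a toy model with 8 rows standing in for 1080 lines): interlacing splits a frame into its odd and even lines, and a "weave" deinterlacer puts them back together. For a static image the round trip is lossless; with motion between the two fields, weaving shows combing artifacts, which is what the TV's deinterlacer has to clean up.

```python
# Toy model of interlacing: a frame is a list of rows.

def interlace(frame):
    """Split a progressive frame into (top_field, bottom_field)."""
    return frame[0::2], frame[1::2]   # odd lines, even lines

def weave(top, bottom):
    """Recombine two fields into one progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = [f"row{n}" for n in range(8)]   # stand-in for 1080 lines
top, bottom = interlace(frame)
assert weave(top, bottom) == frame      # static image: round trip is exact
```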
So having an HDTV that can accept 1080p is a good idea even if it only displays 1366x768, because otherwise you can get deinterlacing artifacts. Most HDTVs now are pretty good at deinterlacing, though.