I know this is old, but I figured I'd weigh in, as years ago I went through the same thing after upgrading from DVI-D to full HDMI and was stumped at first.
With the way you're running it, you guys are actually making a mistake and might as well have just stayed on the DVI-D connector. I'll explain.
First off, if you run your LCD TV as a "PC Monitor" input, like it's run through the DVI-D connector, the PC treats the TV like a monitor and sends a signal telling the TV to act accordingly. This is actually bad, as it leaves image quality on the floor and severely LIMITS your TV enhancement settings. I'm running a 47" Vizio, for example, that's a few years old and has a very high contrast ratio, but it goes into a different "state" if it's getting the "I'm to act like a PC monitor" signal instead of the "I'm to act like I'm displaying a Blu-Ray disc player" signal. By switching back to the "PC" input, you guys are forcing the "I'm to act like a PC Monitor" behavior.
For instance, it always bothered me that I couldn't adjust Sharpness, Color, Smooth Motion, Noise Reduction, MPEG NR, Advanced Adaptive Luma (changes black contrast to reveal better detail in dark scenes, with varying degrees of choices), etc. while the TV was acting like a "PC Monitor" via the DVI-D connection. In that mode it only had a couple of really basic adjustments via the remote: a really weak sharpness adjustment that didn't do much and, I think, a very basic color offset adjustment.
THIS is why you want to run HDMI instead of DVI-D (unless you have a really old video card without HDMI): it lets you make the same adjustments to the TV while running the PC as you could if you were watching a TV station or a Blu-Ray movie. You are basically fooling the TV into thinking it's displaying a Blu-Ray player signal or similar. A DVI-D signal vs. HDMI won't be any different quality-wise if you just use the TV as a "PC Monitor", i.e. the TV is getting the "I'm to act like a PC Monitor" signal. I hope I've drilled that in deep at this point: running an HDMI cable to replace a DVI-D cable only to run the TV as a PC monitor once more gains you nothing.
In my situation, I really wanted the sharpness adjustment from the "TV" settings, as well as the smooth motion option, darkness settings, and color adjustments, all of which are ABSENT if you run the TV set as a "PC Monitor". I'm guessing the same applies to your sets too, given that they behaved differently just like mine did.
1.
What you need to do first in the NVIDIA settings is manually set the TV to its native resolution, OR, if you have the DSR option, change it to (I'll use my native resolution as the example) the 1920 x 1080 "1080p" setting @ 59 Hz under the "UHD" heading that gets created with DSR-enabled drivers; it appears above the "Native" 1920 x 1080 @ 60 Hz "PC" option (just scroll up).
In the NVIDIA settings, the Native PC option @ 60 Hz is NOT considered "1080p", even though the TV's own indicator thinks it's 1080p. If you want it to truly be progressive, select the "UHD" setting instead of the "Native" setting. (A quick way to see both entries side by side is to dump every mode the driver exposes, as in the sketch below.)
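For anyone who wants to verify which modes the driver actually exposes without clicking through the control panel, here's a minimal C sketch using the standard Win32 EnumDisplaySettings call. It's my own illustration, not an NVIDIA tool, and the file name and compile command are just what I'd use:

```c
/* enum_modes.c -- list every resolution/refresh combo the driver exposes
 * for the primary display. Compile with: cl enum_modes.c user32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm = {0};
    DWORD i = 0;
    dm.dmSize = sizeof(dm);

    /* NULL means the primary display; keep iterating until the driver
     * reports no more modes. You should see the 59 Hz and 60 Hz
     * 1920 x 1080 entries listed separately. */
    while (EnumDisplaySettings(NULL, i++, &dm)) {
        printf("%4lu x %-4lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
    }
    return 0;
}
```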
Now, for the VERY important step all of you missed after upgrading to the HDMI cable. Go into the Windows Control Panel and select the "Appearance and Personalization" header (that's what it's called in Win7; I don't know what later Windows versions show, but find the heading with "Display" or similar). Look for "Display" or whatever has the "Resolution" setting and the option to "Make text and other items larger or smaller".
2.
Manually make SURE Windows' resolution slider is set to the native resolution and matches the NVIDIA setting, then click Apply! (If yours is like mine, it won't say 1080p, just the actual 1920 x 1080, which is the max resolution for this set.) If you'd rather set it from code than drag the slider, see the sketch below.
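Here's a hedged sketch of the same change done programmatically via the Win32 ChangeDisplaySettings call (a real API; the 1920 x 1080 @ 60 Hz numbers are just my panel's native mode, so substitute your own):

```c
/* set_native.c -- pin the desktop to the panel's native mode.
 * Compile with: cl set_native.c user32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm = {0};
    LONG result;

    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth = 1920;       /* native width -- change for your set */
    dm.dmPelsHeight = 1080;      /* native height                       */
    dm.dmDisplayFrequency = 60;  /* refresh rate                        */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_UPDATEREGISTRY makes the change persist across reboots,
     * same as clicking Apply on the Control Panel slider. */
    result = ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY);
    if (result == DISP_CHANGE_SUCCESSFUL)
        printf("Native resolution applied.\n");
    else
        printf("Driver rejected that mode (code %ld).\n", result);
    return 0;
}
```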
3.
Go to the "Make Text and other items larger or smaller" settings. If text and folders are huge and the "Larger-150%" circle is checked, set it to "Medium-125%. If it's still too big, set to "smaller-100%". Conversely, if text and folders are too small, set it to Medium and see if that's large enough, or else go large. The larger your monitor and higher the resolution, the larger you'll probably want to set this.
4.
Congrats!! You can finally use an actual HDMI input CORRECTLY (name it anything you like other than "PC"). You can now increase sharpness like you would for a TV program, and any built-in options that were exclusive to the TV/Blu-Ray/PS4/Xbox/etc. settings can now be used with the PC and will look PROPER!!
I've found that having the sharpness slider from the TV settings available for PC games is like having an extra post-processing sharpness adjustment, and a much stronger one than the TV's old "PC Mode" slider. I can also enhance dark shadow detail, smooth motion works, and I think the backlight is fully adjustable whereas it wasn't before (it's been a while, I can't remember exactly), and I have full color control plus noise reduction. Having the TV's sharpness slider unlocked is by itself more than worth the 5-minute hassle of changing the Windows settings. Game detail can now get super-crisp, and changing in-game AA or forcing it in the NVIDIA panel WILL work in conjunction with the TV's sharpness. Hell, even the video noise reduction in the TV's settings WORKS while playing a game: it smooths out overly sharp detail and removes jaggies, depending on which noise setting I adjust, though I prefer leaving it off for the crispest image.
So those of you still running your TV as a "PC Monitor" via HDMI are definitely missing out on better picture quality if you're running it under the PC option and the TV is restricting the better settings.
Word of caution: if you select DSR for your desktop, say forcing 4K to be down-sampled to 1920 x 1080 by picking the 4K option in the NVIDIA resolution list, it WILL make everything (text, folders) extremely tiny. Setting the "Larger - 150%" option, for example, helps slightly, but everything will be "off": the arrow pointer gets huge and sits a millimeter or two from where you're actually pointing. This is a bug between the DSR option and Windows.
Anyhow, hope this info helps you get a better image; don't forget to increase sharpness via the TV settings, along with any other TV menu settings now at your fingertips! When I read how you guys "fixed" it by switching back to a "PC"-labeled input, I cringed, because you effectively bypassed the entire benefit of running the HDMI cable in the first place, making it basically pointless. I personally couldn't tell a difference in image quality between DVI-D (TV input) to DVI-D (video card output), HDMI (TV input) to a DVI-D adapter (video card output), and HDMI (TV input) to mini-HDMI (video card output) with my older cards. However, with HDMI to mini-HDMI on the old GTX 550 Ti, and now HDMI to HDMI on the new GTX 970, there's a HUGE visual difference being able to run all the TV-based settings with PC games, movies, the 2D desktop, etc. vs. the old bare-bones PC options.