Quality drop with HDMI?

pyr0_m4n

So I went to play Far Cry 3 the other day and found out that my TV doesn't support certain display modes for the game. I'm running a DVI to VGA cable, and the game just flashes black as if the resolution is out of range while the audio plays in the background. I know it's not the resolution. So I switched to an HDMI cable and now the game works fine, but on my desktop it looks grainy, almost as if everything has an outline to it. It might just be me, but if this is a real issue, how would I fix it? When I go into the game, it looks perfect: no grain, no outlines or anything.
 
Hmm, well I know that on a TV the pixels are typically larger, so maybe that is why you are getting that effect. You don't lose quality by using an HDMI cable, and in fact a DVI cable can push out the same quality as an HDMI cable in most instances.
 
So if that's the case, since I need to buy a cable anyway, would it be better to use a DVI to HDMI cable or a regular HDMI cable? One other piece of info: my TV manual actually says to use VGA, not HDMI, for a PC. Not sure why, since they both work.
 
Hmm, that is very odd. I would presume they assume most people don't have computers with HDMI outputs. Either way, you are just fine using an HDMI cable, as it can deliver better quality. The issue you run into with VGA is that, being analog, it's only as good as the cable you are using.
 
DVI is a great connector because the standard covers both analog and digital signals. There are different revisions of DVI: DVI-D, for example, can be dual-link, which is what you need for screens above 1080p/1200p and for 120 Hz monitors. So to answer your question: neither, they are both equally good. An ordinary DVI cable next to an HDMI cable will perform just the same. The only thing I would say is stronger than both of those is DisplayPort.
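
To put some rough numbers on the dual-link point: a mode's pixel clock is just its total pixels per frame (including blanking) multiplied by its refresh rate, and single-link DVI tops out at a 165 MHz pixel clock. Here is a quick back-of-the-envelope sketch in Python; the 2200 x 1125 totals are the standard CEA-861 blanking figures for 1080p, so treat the exact numbers as illustrative.

# Rough pixel-clock check: single-link DVI tops out at a 165 MHz pixel clock,
# dual-link DVI at roughly double that. Totals below include blanking
# (CEA-861 timing for 1920x1080 is 2200 x 1125 total pixels per frame).
SINGLE_LINK_DVI_MHZ = 165.0

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock in MHz = total pixels per frame * frames per second.
    return h_total * v_total * refresh_hz / 1_000_000

modes = [
    ("1920x1080 @ 60 Hz", 2200, 1125, 60),
    ("1920x1080 @ 120 Hz", 2200, 1125, 120),
]

for name, h_total, v_total, hz in modes:
    clock = pixel_clock_mhz(h_total, v_total, hz)
    verdict = ("fits single-link DVI" if clock <= SINGLE_LINK_DVI_MHZ
               else "exceeds single-link DVI")
    print(f"{name}: ~{clock:.1f} MHz pixel clock -> {verdict}")

1080p at 60 Hz lands around 148.5 MHz, so a plain single-link DVI or HDMI cable handles it fine; 1080p at 120 Hz roughly doubles that, which is where dual-link DVI, newer HDMI revisions, or DisplayPort come in.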
 


Display order and configuration is a nagging problem that some manufacturers have refused to deal with.

It is highly inadvisable to mix digital and analog signals, so avoid configurations that combine VGA with HDMI/DisplayPort/DVI-D.

If your TV is connected directly to your PC via HDMI (rather than going through an external amplifier) then it should be picked up and configured automatically without issue. If you do have an external amplifier then you may need to configure the screen resolution and overscan/underscan manually in your video driver settings.

Video drivers also tend to default to "mirroring" desktops rather than extending them. If your monitor and TV have different maximum resolutions, the lower of the two will be used as the common denominator, and if there is no resolution they both support, this can cause problems. To alleviate this, make sure your monitor is set as the primary display and your TV is set as a secondary, extended display.
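
If you want to sanity-check what the driver actually did (which screen it treats as primary, what mode each one is running, and how the extended desktop is laid out), here is a rough read-only sketch using Python with ctypes against the Win32 EnumDisplayDevices/EnumDisplaySettings calls. It is purely illustrative; your video driver's control panel shows the same information.

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004
ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently in use

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Display-device layout of DEVMODEW, with the union flattened into
    # the position/orientation fields used for monitors.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),   # offset on the virtual desktop
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

i = 0
while True:
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break  # no more display outputs to enumerate
    if dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        mode = DEVMODEW()
        mode.dmSize = ctypes.sizeof(mode)
        user32.EnumDisplaySettingsW(dev.DeviceName, ENUM_CURRENT_SETTINGS,
                                    ctypes.byref(mode))
        primary = " (primary)" if dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE else ""
        print(f"{dev.DeviceName}{primary}: {mode.dmPelsWidth}x{mode.dmPelsHeight} "
              f"@ {mode.dmDisplayFrequency} Hz, desktop offset "
              f"({mode.dmPositionX}, {mode.dmPositionY})")
    i += 1

The primary display sits at offset (0, 0), and a properly extended secondary shows up at a non-zero offset (for example 1920, 0), so you can see at a glance which screen Windows treats as primary and whether the TV actually got its native 1920x1080 mode.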

As a side note, text will almost always look poor on a TV screen. You can disable the deblocking and filtering engines within the TV's settings, but most TVs simply don't have the contrast or pixel density to render fine text well.
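
On the pixel density point, a quick comparison shows why: a 1080p TV spreads the same 1920x1080 grid across a much larger panel than a 1080p monitor, so every pixel (and every letter stroke) is physically bigger. A rough Python comparison, with a 24" monitor and a 40" TV picked purely as example sizes:

import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixels per inch = pixel count along the diagonal / diagonal size.
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'24" 1080p monitor: ~{ppi(1920, 1080, 24):.0f} PPI')
print(f'40" 1080p TV:      ~{ppi(1920, 1080, 40):.0f} PPI')

That works out to roughly 92 PPI versus 55 PPI, which is why desktop text that looks crisp on the monitor looks chunky and outlined on the TV, especially once the TV's sharpening filters get hold of it.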