HDMI Versus DVI Quality Issue

Status
Not open for further replies.

roxranger

Distinguished
Oct 6, 2009
10
0
18,510
I'm stumped and hope someone here can help me out. I've been trying to use an HDMI connection from my PC's ATI HD 4670 card to two different monitors: a Samsung SyncMaster P2770 (27") and a ViewSonic VX2433wm (23.6"). Both monitors have a native resolution of 1920 x 1080 and both display very well using a DVI to DVI connection.

When I try to use an HDMI to HDMI connection on either monitor, the results are awful. Both underscan, until I adjust them using the Catalyst Control Center, and both display everything fuzzily.

I'd like to use HDMI if I can and, from what I've read, the quality should be equal. But it simply isn't. In fact, HDMI is unusable on both monitors.

Can someone give me some pointers on how to get good HDMI resolution on a computer monitor?

Thanks in advance!
 

roxranger

Distinguished
Oct 6, 2009
10
0
18,510
There is if you're trying to connect two PCs to one monitor and the monitor only accepts DVI and HDMI (there's no VGA on the Samsung, which will replace the ViewSonic). So, unless I can get the HDMI-to-HDMI connection working properly, I will have to get an HDMI-to-DVI cable or get rid of the monitor. And my eyes don't lie, BTW; there may be no theoretical difference, but there is one in real life.
 


How are you going to connect two PCs to one monitor? I'm lost...

Sir, it is virtually impossible to notice a visual difference between HDMI and DVI. Image-quality-wise, DVI and HDMI are exactly the same. The principal differences are that HDMI carries audio as well as video and uses a different type of connector, but both use the same encoding scheme. That's why a DVI source can be connected to an HDMI monitor, or vice versa, with a DVI/HDMI cable and no intervening converter box.

So unless you are carrying audio from your PC to the screen, you are wasting your time with the HDMI cable. Don't complicate things; just use both DVI cables and be done with it. If your card came with the adapter, use that; if not, order one like Timop suggested.
 

roxranger

Distinguished
Oct 6, 2009
10
0
18,510


I'll try that when I reconnect the HDMI cable as a test. At the moment, I'm using DVI on the Samsung and the "Auto" button is disabled. I've gone ahead and ordered a DVI-to-HDMI cable on the chance that the monitor can be "fooled" into thinking that the HDMI signal coming from the video card is really DVI (or vice versa).

Thanks for the input.
 

leon2006

Distinguished
There is no difference in quality between DVI and HDMI.

Personally, I prefer HDMI, as it carries video and audio in one cable.

To change the settings of your video card (when connected using HDMI or DVI):

Start CCC
Open Desktop Properties
The top box displays the current resolution and settings of your display
The lower box lists the display settings you can use (i.e. 1080p); select 1080p and click OK

Another option, using Desktop Display Properties:
You will see two TV icons (or one if you have one display)
Right-click the icon and select Configure
Select HDTV Support
Select 1080P60HZPAL

~

If you have a 4890 with two displays, all connected using HDMI, it should work for you.

One other thing: use the ATI-provided DVI-to-HDMI adapter.



 

suat

Distinguished
Dec 17, 2009
851
0
19,060
What version of HDMI cable are you using? If your cable is version 1.0 and you are trying to transmit rich video, you might get blurred/fuzzy output on the screen. Try an HDMI cable of version 1.3a or greater.

Another thing: DVI comes in two flavors, analog and digital. DVI-Digital is the same as HDMI; DVI-Analog is different.
 

roxranger

Distinguished
Oct 6, 2009
10
0
18,510
The cable, which is brand new, is 1.3b, so that shouldn't be an issue. I'm going to try a DVI-to-HDMI connection (when I get that cable) and see what happens. I'll get back to the forum with a follow-up.

I'd rather not use HDMI at all, BTW, but because this monitor doesn't support VGA, I have to (if I want to keep it). I plan to connect the monitor to two PCs (as I have done with other monitors, using VGA and DVI, for a number of years). One of the computers runs Windows 7 and the other runs Linux.
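On the Linux machine, the same HDMI underscan/fuzziness can often be handled from the command line. A minimal sketch, assuming an X session with the open-source radeon driver and an output named HDMI-0 (your driver may use a different output name or property names, so check what `xrandr --prop` actually reports):

```shell
# List connected outputs, their modes, and driver properties.
# Note the exact name of the HDMI output (HDMI-0, HDMI-1, etc.).
xrandr --prop

# Force the panel's native mode on the HDMI output so no scaling
# happens on the GPU side (output name is an assumption):
xrandr --output HDMI-0 --mode 1920x1080 --rate 60

# If the driver exposes an "underscan" property on HDMI outputs
# (the open-source radeon driver does), turn it off so the image
# fills the panel edge to edge:
xrandr --output HDMI-0 --set underscan off
```

This is the rough equivalent of dragging the Catalyst Control Center scaling slider to 0% on the Windows side.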
 

leon2006

Distinguished
I run Ubuntu, Windows Vista, and XP on my computer with two displays (a 70-inch HDTV and a 24-inch LCD). Both are connected with HDMI cables. I use the ATI-made DVI-to-HDMI adapter. My card is a CF 4890. You should run at 1080p.

Make sure your video card identifies the model of the LCD on the specific DVI port it's connected to.
 

suat

Distinguished
Dec 17, 2009
851
0
19,060
I connected a Sony Bravia HDTV (32-inch) to my Radeon HD5770 via HDMI, alongside my DVI-D-connected LG E2250V monitor, and I get better video/graphics on the Sony TV via HDMI. The colors are more vibrant. The TV and monitor have the same resolution, 1920 by 1080. This may be due to my color adjustments, but the fact is that the two connections work very well together. I have ATI CCC 10.6 on Win 7 x64.
 
Solution

roxranger

Distinguished
Oct 6, 2009
10
0
18,510
Well, I solved my own issue, but it took a bit of time to figure out. In the end, it was very easy. I've seen a lot of posts on the web about this kind of issue, but there's never a clear answer, so, through the process of elimination, I came up with the following.

The problem had nothing to do with DVI-HDMI or HDMI-HDMI. It was all about the screen preferences on the Samsung P2770H monitor. To get HDMI to work, I simply needed to go into Menu - Size & Position - Image Size and select "Screen Fit". For some reason, the monitor's HDMI setup requires a specific "screen fit" setting (unlike DVI). Now HDMI and DVI are visually equal.

I'm sure other monitors have a similar setting, and you can use it to fix issues you may be having with HDMI on a PC.
 

roxranger

Distinguished
Oct 6, 2009
10
0
18,510
Fascinating. You can't select your own answer as the best one, so I randomly selected the one above mine so that I wouldn't be haunted by e-mails asking me to vote. I fail to see the value-added aspect of this.
 

TomH123

Distinguished
Dec 11, 2011
1
0
18,510
I'm just +1'ing roxranger and adding a caveat: the complete solution also involves his post about adjusting the overscan. In CCC / My digital flat panels / Scaling options, move the slider to 0%.
 