1080p TV HDMI to HDMI connected to PC = bad quality


aaXenoStrange

Distinguished
Sep 12, 2009
This is really looking like a manufacturer (Samsung) issue to me... Interestingly enough, Windows XP does look a little better, but it still suffers from the same issue; the menus just don't look quite as bad because of the very plain interface and how text is handled in general. It really looks as if the screen was shrunk, compressed, and then expanded again. This computer works fine on other, smaller monitors via HDMI, but my Samsung LN40B550 simply won't play nicely with my ATI Sapphire Ultimate HD 4670.

Windows 7 x64 works fine at 1080p over VGA, with the usual VGA fuzz, but the picture quality over HDMI is horrible. Turning sharpness down below 5 or so makes the picture less offensive to the eyes, but that just masks the real issue. Ultimately, VGA still looks significantly better on this television. I almost wonder if this is a bug related to the HDCP protocol? Hey Samsung, are you paying any attention to this thread? You are going to lose my brand loyalty here. I'll be grabbing an LG and moving this TV to another room that doesn't involve a computer...

The Samsung is great with any of my other sources... It's just a bit irritating after paying almost $1000 for something I honestly use as a computer monitor 80% of the time.
 

studeow

Distinguished
Apr 2, 2010
***PROBLEM SOLVED, PLEASE READ*** - this will work for anyone with the fuzzy PC text problem:
These are all awesome TVs that are, at heart, LCD monitors, so there is nothing wrong with their ability to behave as such and display crisp text or graphics at 1080p from a PC (DVI or VGA). You just have to remind your PC that the big, bad behemoth LCD TV is really just an overblown pussycat of a monitor.
Here's how:

Open your nVidia or ATI (Catalyst) control panel and bring up the "Change Resolution" setting.
These new PC video cards aren't set up to recognize every resolution a high-end display is capable of, so you may need to create a custom resolution of:
1920 x 1080 @ 60 Hz with 32-bit color depth (not 24 or 30 Hz, but 60!)
Apply that new resolution.

If you have difficulty selecting the color depth, go to the basic Windows display settings (in Control Panel) and enable that resolution and 32-bit color depth first (the Windows display settings are the bottleneck for the nVidia and ATI control panels), then retry creating the custom resolution.
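
If your driver's custom-resolution dialog also asks for detailed timing numbers, the standard CEA-861 values for 1080p at 60 Hz are worth knowing. Here is a quick sanity-check sketch (plain Python arithmetic, added purely for illustration; the porch and sync values are the standard CEA-861 ones, not something from this thread):

Code:
# Standard CEA-861 timing for 1920x1080 @ 60 Hz - the numbers a
# "custom resolution" dialog typically asks for.
h_active, h_front, h_sync, h_back = 1920, 88, 44, 148
v_active, v_front, v_sync, v_back = 1080, 4, 5, 36
refresh_hz = 60

h_total = h_active + h_front + h_sync + h_back   # 2200
v_total = v_active + v_front + v_sync + v_back   # 1125
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6

print(f"{h_active}x{v_active} @ {refresh_hz} Hz: total {h_total}x{v_total}, "
      f"pixel clock {pixel_clock_mhz:.2f} MHz")   # ~148.50 MHz

If the pixel clock your dialog reports is far from ~148.5 MHz, the mode you created is probably not the 1080p60 the TV expects.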

Finally, make sure the Input 'Edit Name' feature on Samsung TVs (or the equivalent feature on Sonys) is set to PC or PC-DVI for the HDMI/DVI or VGA input your PC is connected to.

Yippee!! The signal sent to the TV now matches what the TV can display as direct pixel information, and you will have a clear, gorgeous 40-55 inch, Batcave-quality PC monitor.


Oh yeah - then you can monkey around with all of the Color Correction and Brightness/Sharpness controls if you want to, but the real problem is solved by the method above.

Enjoy guys and gals,
-Stu
 

Michael_F

Distinguished
Apr 3, 2010
Hello everyone. I had this problem and found a very easy "solution" for it: Don't use HDMI.

I bought a 37" 1080p 60 Hz TV. I used a DVI-D output on my video card with a DVI-to-VGA adapter, then ran the VGA cable to the VGA input on the TV.

(I was originally doing the same thing, but with a DVI-D-to-HDMI adapter on the card.)

Problem solved.

It looked like poo-poo with HDMI and looks great with VGA. I'm not sure why, and I don't really care. I had all the settings right: 1920x1080, 60 Hz, etc. Why everyone is so hung up on using HDMI for a PC, I don't know. VGA can produce 1920x1080, so who cares?

Also, when using HDMI, I had black borders almost an inch thick on all sides of the screen. They completely disappeared when I switched to VGA.

If you're having this problem, just try using something other than HDMI and see if it works for you. Chances are, it will!
 

studeow

Distinguished
Apr 2, 2010
Please, folks - you don't need to go buying anything extra. And technically DVI is higher quality than VGA, since no digital-to-analog conversion (DAC) is required and thus there is no signal loss! Most people will not notice any difference except in a meticulous side-by-side comparison (or if you're a video or graphics professional in Hollywood). However, consumers want to know they are getting the best bang for their buck on any product, so there is no need to spend extra cash to potentially lower quality with VGA output if you already have an HDMI/DVI connection between your PC and LCD TV.
As with the method I listed above - and in fact every method that has worked for folks in this thread, sometimes unbeknownst to the writers (VGA cables included) - you simply need to match three things (once again):

1) The Windows Control Panel resolution, refresh rate (Hz), and color depth: Control Panel -> Display -> Display Properties (adjust the resolution and color depth) -> Advanced button -> Monitor tab (set the PnP monitor to 60 Hz).
2) The ATI Catalyst or nVidia control panel: create a custom resolution that matches the above, if it doesn't already exist, and enable it.
3) The settings on the TV, primarily the Input -> Edit Name function, which tells the TV that the signal coming in on the HDMI/DVI (or whichever) input is in fact a PC signal. Trust me, the TV's processor will then output the pixel information from the source directly, without overscanning or otherwise processing it.

For 1080p, the matching settings for all three are 1920 x 1080, 60 Hz, 32-bit (if you want the full color range).
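
To double-check the Windows side of that match, here is a minimal sketch (assuming Python 3 on Windows - not something anyone in this thread posted) that asks the user32 API what mode the desktop is actually running in; it should report 1920x1080 @ 60 Hz, 32-bit if everything above is set correctly:

Code:
# Query the current Windows display mode via EnumDisplaySettingsW.
# The structure mirrors the Win32 DEVMODEW layout, with members we don't
# need collapsed into opaque filler fields so the offsets stay correct.
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1

class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("_position_or_printer", ctypes.c_byte * 16),  # union we don't use
        ("_printer_only", ctypes.c_short * 5),         # dmColor .. dmCollate
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("_icm_and_panning", wintypes.DWORD * 8),      # remaining members
    ]

mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(mode)):
    print(f"{mode.dmPelsWidth}x{mode.dmPelsHeight} "
          f"@ {mode.dmDisplayFrequency} Hz, {mode.dmBitsPerPel}-bit")
else:
    print("EnumDisplaySettingsW failed")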

I have been a PC-to-big-screen aficionado for years, so I know what works - I've had to muck around with these things for countless hours so you don't have to!
 

bilden

Distinguished
Apr 3, 2010
Happy Easter Eve,
This forum helped me quite a bit, in that it described a similar problem, with solutions for a number of different products (video cards, monitor brands, etc.). I have an ATI HD5700 Series video card and an LG M237WD TV/monitor. I experienced the same fuzzy text and black border around my screen despite trying different HDMI cables, changing TV input names, etc. Worse, my ATI Catalyst Control Center (version 2010.0302.2233.40412) included no settings for scaling, which is what seems to have worked for some of you. I finally did away with the HDMI cable and connected the regular D-sub connector to my PC instead (I had to use a D-sub-to-DVI adapter on the PC side). SUCCESS! Black border gone, fonts sharply rendered. I spent most of the day researching this mess. Thanks to all for your posts! Bill
 

studeow

Distinguished
Apr 2, 2010
This is because the DVI output on the video card can send either a digital or an analog signal, depending on the display connected to it. So when you connect the PC to the TV via VGA, Windows automatically assumes you are using a standard VGA monitor, not a DVI LCD or a TV. Again, the issue is with Windows Plug and Play (PnP) and how it responds to the monitor connected to it. And with LG TVs, I believe you don't even have to tell the TV you are feeding its VGA input - the TV will sense the analog signal! So it's a clean connection: the TV knows it's getting an analog signal on the D-sub connection and displays it without extra processing (because it's an analog signal). NOTE: LCD TVs digitize that signal with a dedicated analog-to-digital converter on the VGA input, so it's an easy task for the TV.
When you connect via HDMI (from DVI), Windows detects it and the video card sends a digital signal to the TV. The TV then sees a digital signal that it will process, overscan, alter, resize, etc., etc., unless you tell the TV it is looking at a plain PC digital signal (as opposed to DirecTV, Blu-ray, or cable). Once you set the input of the TV correctly, and set the output of the PC correctly (to 1920x1080 progressive, 60 Hz) via Windows and the ATI or nVidia control panel (you should check both settings), this problem disappears and you have an unaltered digital connection between your PC and LCD TV, which has theoretical advantages (relatively minor for most people) over an analog connection. For those with Catalyst version issues, you should still be able to select resolutions and refresh rates without going into Catalyst, by right-clicking the icon in the lower-right system tray on the desktop.

Notice that the Windows display settings (as well as the ATI/nVidia control panel) will attempt to detect your TV specifically, and will get the brand correct but probably not the model number for newer LCDs. You can try to download a driver for the LCD TV if one is available, or just use the PnP monitor and set the parameters correctly (which is all the specific driver would attempt to do anyway - no real magic involved)! You should, however, update your graphics card driver to the most current version, which is good practice in any case.
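
If you're curious what the TV actually reported to Plug and Play (the brand code and the native timing it advertises), the sketch below reads the EDID blocks that Windows caches in the registry. It's my own illustration using only the Python 3 standard library on Windows, not something from the posts above, and it only decodes the first detailed timing descriptor:

Code:
# List each display's EDID vendor code and first detailed timing descriptor.
import winreg

def parse_edid(edid):
    # Manufacturer ID: bytes 8-9, three 5-bit letters ('A' = 1), big-endian.
    m = (edid[8] << 8) | edid[9]
    vendor = "".join(chr(((m >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # First detailed timing descriptor starts at byte 54.
    d = edid[54:72]
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return vendor, h_active, v_active

root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                      r"SYSTEM\CurrentControlSet\Enum\DISPLAY")
for i in range(winreg.QueryInfoKey(root)[0]):
    model = winreg.EnumKey(root, i)
    model_key = winreg.OpenKey(root, model)
    for j in range(winreg.QueryInfoKey(model_key)[0]):
        instance = winreg.EnumKey(model_key, j)
        try:
            params = winreg.OpenKey(model_key, instance + r"\Device Parameters")
            edid = winreg.QueryValueEx(params, "EDID")[0]
            vendor, w, h = parse_edid(edid)
            print(f"{model}: vendor {vendor}, preferred timing {w}x{h}")
        except OSError:
            pass  # instance without a cached EDID

A Samsung set should show up with the vendor code SAM; if the preferred timing it advertises isn't 1920x1080, the driver's idea of "native" will follow suit.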

There is no reason you should be forced to use the analog (VGA) input of the TV for a PC connection unless you really need to, e.g. all the HDMI ports are used up by more important equipment, or you don't want to change a few settings and happen to have a spare DVI-A-to-VGA cable lying around.

Happy Easter!
 
Guest
Thanks for the hints in this thread. I can confirm they worked for me with a Samsung LE40C650 and a Radeon 4670. I just want to point out that BOTH tricks have to be used: setting the name of the HDMI1/DVI input to "PC", and then setting the overscan value in ATI Catalyst Control Center to zero.

Someone above said the colors turn bad when you set the name of the HDMI input to PC or DVI PC. That's right - the colors shift to the same grayish tones you get with a VGA cable. But this can be fixed in the TV settings by setting the HDMI black level back to Low, where it was before the name change (it resets itself when you change the name of the HDMI input).
 

MST

Distinguished
Apr 5, 2010
Thanks for the hints in this thread. I can confirm they worked for me with a Samsung LE40C650 and a Radeon 4670. I just want to point out that BOTH tricks have to be used: setting the name of the HDMI1/DVI input to "PC", and then setting the overscan value in ATI Catalyst Control Center to zero.

Someone above said the colors turn bad when you set the name of the HDMI input to PC or DVI PC. That's right - the colors shift to the same grayish tones you get with a VGA cable. But this can be fixed in the TV settings by setting the HDMI black level back to Low, where it was before the name change (it resets itself when you change the name of the HDMI input).


I have the same problem with a 2333HD when I plug HDMI into the PC using a DVI adapter attached to an MSI 250 NVIDIA card.

Is there any solution for that particular screen? Overall it looks nice apart from that one problem.

Thank you
 

jahncm

Distinguished
Apr 1, 2010
This is crazy. I have the same issue... different monitor type, exact same issue.

This seems to be the behavior of any PC connected to a TV over HDMI (and/or a DVI-to-HDMI converter), no matter the specific brand names. The only ones that don't seem to be consistently affected are some of the Sonys, probably because Windows recognizes them like their monitor brethren.

I am using a 32" native 1080p Vizio TV as the monitor (SV320XVT).
I am using an ATI Radeon HD5850, and the resolution is set to 1080p @ 60 Hz on my video card.
I am using Windows 7 64-bit.

Over HDMI... the picture is horrible, with a black border.

I also attempted it with an NVidia card... same result. Different HDMI cables, different cards... same result.

Switching to VGA... beautiful. On both.

There is no option to perform the PC-renaming trick for the HDMI port on this set, but I will try changing the overscan options on the video card tonight.

Personally... this just sounds like the TV/panel manufacturers want to sell more expensive DisplayPort technology, lock us into VGA, or have us buy more expensive "monitors" even though they are the same underlying technology... and the video card manufacturers are in bed with these guys, especially if all of this comes down to a simple setting that could easily have been detected by the video card driver.

What gives? Any more concrete resolutions? Is this something that should get on a watchdog list?

 

jahncm

Distinguished
Apr 1, 2010
@JMadge-
I agree. This is in no way resolved, and I think we've stepped into something bigger. Either this is a "buy a monitor" scheme or some terrible coding and/or hardware detection on the part of the TV and video card manufacturers, as well as MS.

Imagine... we are all here, and this is not the only forum where people are reporting this issue. There are people returning PCs, video cards, and TVs that are perfectly fine... all because of this very issue. Why are we given an HDMI output on our new high-end video cards if they don't want us to make proper use of it? Absurd.

@studeow-
Nope. Your solution does not work for all Bravias and Samsungs, or for third-party display panels with the exact same issue. It definitely doesn't work on this Vizio, although I do not believe this to be a Vizio-specific issue.

The ATI Catalyst options don't exist as you describe. No slider. I was unable to select anything for overscan. I still have the black border when I use HDMI, but VGA is perfect.

The resolution is set to 1080p - 1920x1080 @ 60 Hz in True Color (32-bit) - in both the Windows Control Panel and the Catalyst Control Center, so I'm not sure why we would need to do anything else here. The resolution readout on the TV's input indicates the signal is coming in at full 1080p, and I have turned off all the smooth-motion effects and processing the TV applies... nada.

If I'm missing something, can you provide a step-by-step process?

 

ndeezler

Distinguished
Mar 31, 2010
I believe I've found the HDMI solution. If you are using HDMI to HDMI, go to the Pixel Format tab in Catalyst Control Center and choose "RGB 4:4:4 Pixel Format PC Standard (Full RGB)". Then go to the Scaling Options tab and set it to 0%. Voila! Crystal clear with no black borders. Worked for me, anyway.


These options can be found in CCC by clicking Graphics, choosing Desktops and Displays, right-clicking the little monitor at the bottom, choosing Configure, and picking the Pixel Format / Scaling Options tabs.
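
For anyone wondering what "Full RGB" actually changes: HDMI can carry either full-range RGB (0-255, what a PC renders) or limited/video-range RGB (16-235, what TV sources traditionally use). A small illustrative sketch (plain Python, my own addition, not part of the post above) of what happens when the two ends disagree:

Code:
# Full-range PC levels (0-255) vs limited/video levels (16-235).
def full_to_limited(v):
    # What a TV expecting video levels effectively assumes about a PC signal.
    return round(16 + v * (235 - 16) / 255)

for level in (0, 128, 255):
    print(f"PC level {level:3d} -> video level {full_to_limited(level):3d}")

# PC black (0) lands at 16 and PC white (255) at 235: a washed-out, low-contrast
# picture unless the graphics card and the TV agree on the same range.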


By the way, I'm using a Vizio VM230XVT television and it's now working great.
 

jahncm

Distinguished
Apr 1, 2010
@ndeezler...

Thanks... I will try that when I get home. I looked all over for "Scaling options" in CCC and saw none, so maybe this navigation will help.
 

Nakoma

Distinguished
Apr 11, 2010
Ok,

I have a Samsung LE40B550 LCD and a Radeon 5770, which I have always connected with a standard VGA cable.

I disagree with the person who says to just deal with it and use a VGA cable instead of HDMI, mainly because I want to get a new surround-sound amp and have the audio go through the HDMI cable, rather than having a separate optical cable and having to select a different input every time I want to switch between Sky and the PC.

I had all the issues described in this thread when connected by HDMI: pixellated picture, harsh-looking text, too bright, didn't fit the screen properly, etc.

Basically, I connected the HDMI cable to input 3 - the one labelled HDMI/DVI - instead of input 2, where it was before. I set the resolution to 1920 x 1080 / 32-bit / 60 Hz, then edited the input name to PC-DVI, and now it looks just as good as, if not better than, when connected by VGA cable.
 
Guest
Hi,

I've just found this thread today.

I'm having similar issues connecting an Asus 1201n to a Samsung P2270hd or Samsung LE40B651.

With a VGA-to-VGA connection I get good quality on both. If I change to HDMI-to-HDMI, the image is fine up to 1680x??? and ruined at full HD.

I'll be testing some of the solutions described here and will post my results later on (it might take a while, since I'll only be near both screens next week).

If the input-name-renaming trick works out... I think I'll be complaining to Samsung, because it makes absolutely no sense.

Best regards,

Nuno
 
Guest
I have a Samsung SyncMaster 2494HS and was having the 'black border' issue as well as very poor quality (fuzzy text and whatnot). After trying many things, I was going through my screen settings and saw something called "AV Mode".
I set that to "Off" and bam! 100% quality :)
 
Guest
If you have a Samsung HD TV, then you'll probably have to connect your HDMI cable to the "HDMI/DVI" port... the normal "HDMI" port didn't solve the problem for me.
And if you need sound to pass through the cable, you'll need to rename the connection on your TV to "PC" and not "DVI PC".

Hope this helps a bit more for those who still had problems after reading this thread, like I did.
 

SteveO884

Distinguished
May 8, 2010
I would just like to say a huge thanks to everyone who has offered their suggestions and advice on this thread; it has been incredibly helpful. So I thought I would also describe how I used the advice here to solve the problem for me.

This morning I took delivery of my new Samsung LE32B541 LCD TV. First I set the TV up and got everything running, then hooked it up to my PC (HDMI on the GPU to HDMI port 1 on the TV). Plug and play worked fine, but as a lot of people are experiencing, the text was really hard to read and not displayed correctly, and I also had 1/2" black borders all the way around the display. I first checked that everything was displaying at full HD, which it was, then went to the ATI website to update to the latest drivers for my ATI Radeon HD4600. I installed the latest drivers and then rebooted as requested.
Once the PC booted back up, the text and picture quality were perfect - unbelievably sharp - but I still had black borders.
I opened up the Catalyst Control Center and followed another person's advice from this thread (forgive me for not crediting you, but I can't remember who posted it):
1: In the top-left corner, choose "Desktop & Displays" from the drop-down menu.
2: In the pane under "Please select a display" there should be a thumbnail of a monitor with a 1 in a blue circle; right-click it and choose "Configure".
3: The new heading in red will be "DTV (HDMI) 2", with a selection of tabs under it - select "Scaling Options".
4: There will be a slider that runs between "Underscan" at 15% and "Overscan" at 0% (mine was right in the middle by default). I slid this to 0 and voila - my display fills the screen with perfect HD clarity (see the quick calculation below for why the middle position produces borders).
5: Tick the box at the bottom of that page to stop the display reverting to borders every time you turn your system on.
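
Here is a quick back-of-the-envelope calculation (plain Python, my own addition rather than part of the post above; the assumption that CCC applies the percentage to each axis is mine) of why that default slider position produces the borders:

Code:
# Why an underscan percentage produces black borders on a 1920x1080 screen.
# Assumption: the percentage is applied per axis, which appears to match the
# CCC slider; the exact resampling method is the driver's business.
width, height = 1920, 1080

for underscan_pct in (15.0, 7.5, 0.0):
    scale = 1 - underscan_pct / 100
    w, h = round(width * scale), round(height * scale)
    print(f"underscan {underscan_pct:4.1f}% -> image {w}x{h}, "
          f"borders ~{(width - w) // 2}px at the sides, "
          f"~{(height - h) // 2}px top and bottom")

# At the default mid position (~7.5%) the image is about 1776x999 - roughly the
# half-inch border people describe. At 0% the image fills the panel 1:1.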

These are the steps that worked for me, thanks to some sound advice on this thread. I hope it helps others, and I'm sorry I cannot offer support to those using Nvidia.
Again, my specs are a Samsung LE32B541 and an ATI Radeon HD4600 with HDMI to HDMI (it also worked with DVI to HDMI).
Thanks
 
Guest
Thank you very much for the suggestions!

This is completely Samsung's fault, unfortunately. I had the same problem with my LE40B530P7WXRU LCD TV. I tried all the settings, all the ports, updated all my drivers, even tried updating the TV firmware...

Finally I found this topic! The solution was mentioned several times here in previous posts:

On the TV remote control, press the Source button, choose your PC's HDMI input, press the Tools button, and "rename" the input to "PC" or "PC DVI".

Problem SOLVED!

:fou: :fou: :fou: SAMSUNG :fou: :fou: :fou: You used the word "rename" instead of "change signal type" or something like that. And you buried it as deep in the menus as possible! You sadists!

Pfff I am happy now :hello:
 

siraf

Distinguished
May 17, 2010
I have to agree with everyone else using Samsung LCD TVs who has solved their problem with the 'renaming' trick.

I had a Core i5 system (Win 7) with a Radeon 5770 and a Samsung LA32R81B LCD TV (HDMI). I must say this was bugging me for months! The image quality was not as clear/crisp as it is on:
- PC to Samsung LCD Monitor (DVI to DVI)
- PC to Samsung LCD TV (VGA to VGA)

My tips for any other user who has the same problem:
1. First, check which of your Samsung LCD TV's HDMI/DVI input ports is meant for a PC. In my case, the manual says it is HDMI 2 if I want to use HDMI/DVI from my PC.
2. ONLY use that port for the HDMI/DVI connection from the PC to your LCD TV.
3. Next, you have to do that 'trick' of renaming the input in the Source List. Here's how:

For my TV, it was "Menu > Input > Edit Name > HDMI 2". Then choose 'PC' (mine was not labelled PC-DVI or PC/DVI like other Samsung owners'; it's just PC).

4. Once that is set, go to "Menu > Input > Source List" and choose HDMI 2. It should look like this: "HDMI2 : PC".
5. Your PC will auto-detect this and choose a 1360x768 resolution @ 60 Hz instantly!! If it doesn't, unplug and re-plug the HDMI cable at your graphics card.

* Due to my ignorance, my previous trials brought no luck because I thought any HDMI port would work on my TV. It turned out that my TV only accepts a PC connection over HDMI on the HDMI 2 port. Reading the service manual helps!

Thanks a bunch to those that guided me in the posts above. Yay!

Good Luck! :D
siraf
Malaysia
 
Guest
Hi Everybody!

I also have the same problem with a Philips 42PFL5604 TV. I have an ATI 4850 + Win 7 + i7 + 6 GB RAM.

I've tried all the posted solutions, and the best image quality is at 1360x768 @ 60 Hz, whether using VGA or a DVI-to-HDMI cable.

I could get the 1080p resolution with all the posted modifications, but the image is not as good as at 1360x768 @ 60 Hz.

Now my contribution to the cause:

1.- The TV user manual indicates the following highest resolutions:

1.1. - "Computer formats HDMI: 1360x768 60"
1.2. - "VGA: 1360x768 60"
1.3. - "Video formats: 1920x1080p 24, 25, 30, 50, 60"

Important note: notice the small "p" after 1080... it does not stand for pixel, it stands for progressive scan ("i" would be interlaced).

2. - There is an option to display the resolution the TV is currently running at:

2.1. - If I'm using the computer, it displays 1360x768 @ 60 Hz
2.2. - If I'm using a PS3 (which really looks great), it displays 1080p

Important note 2: 1360x768 != 720p, just as 1920x1080 != 1080p - a pixel count and a video format label are not the same thing.

3. - If you try using the infamous "Windows 7 ClearType Text Tuner" with the resolution set to 1920x1080 @ 60 Hz, you'll get a message telling you that you are not working at the native resolution.

4. - Final notes:

4.1. - It's not an HDCP problem, as somebody suggested, because the ATI 4850 has HDCP support in the GPU.
4.2. - Changing the input name does not solve the problem; it only changes the default TV settings.
4.3. - If you use 1360x768 you don't need to touch the overscan setting, as the TV maps the pixels at a 1:n ratio anyway. Changing overscan to 0% only changes how you perceive the image; the pixel mapping still won't be 1:1 (see the short arithmetic sketch after these notes).
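
To illustrate note 4.3, here is a short arithmetic sketch (plain Python, my own illustration; the assumption that the 42PFL5604's panel is actually Full HD comes from the 1080p video formats listed in its manual, not from anything Philips has confirmed):

Code:
# Why the 1360x768 "computer format" can never map 1:1 onto a Full HD panel.
signal_w, signal_h = 1360, 768     # best PC format the TV accepts over HDMI
panel_w, panel_h = 1920, 1080      # assumed native panel resolution

print(f"horizontal scale factor: {panel_w / signal_w:.3f}")   # ~1.412
print(f"vertical scale factor:   {panel_h / signal_h:.3f}")   # ~1.406

# Each desktop pixel gets stretched across roughly 1.4 panel pixels in each
# direction, so the mapping is 1:n rather than 1:1 and text is resampled.
# Only a source the TV accepts as 1920x1080 progressive would map 1:1.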

So my questions are:

a. - What is different between a computer signal and, for example, a PS3 video signal? Or, in Philips' own words: what is the difference between a "Computer format" and a "Video format"?
b. - Is there a way I could force my computer to output 1080p @ 60 Hz? (I'm not just saying 1920x1080 @ 60 Hz.)
c. - Should I call a lawyer?

Thanks to all of you!
Good luck!

Andres
 

eyagos

Distinguished
May 25, 2010
Hi there,

I have the same problem when plugging my BenQ monitor into my ATI card through HDMI. It is not very severe in my case, but it is enough to make me avoid HDMI, because text is not very readable (not in all regions of the screen, by the way).

I have used a program that tests the image (CheckeMON), and these are the results:

- When plugged in through D-Sub, everything is OK.

- When plugged in through HDMI, the test that fails is the "out of phase" test.


This means something like: the card and the monitor are out of phase (nothing to do with resolution or frequency), and the communication is not right.

Could this issue be related to the card's driver?


Santiago.
 

fx2236

Distinguished
Nov 18, 2008
Hi, I used to connect my Samsung 40B550 LCD TV to my Nvidia GTX 260 SP216 SC over HDMI and the picture quality was incredible. I recently bought an HP DV-6 laptop with an Nvidia 230GT GPU, and I'm using the same codecs/settings as on my PC, but the picture quality is terrible - all pixelated, like a low-bit-rate movie. Anyone got any idea why that is? By the way, I did a test with a Blu-ray movie and the PC's picture quality was great while the laptop's was awful. Thanks.
 

jithint

Distinguished
Aug 16, 2009
It seems that for PC DVI -> DVI/HDMI adapter -> HDMI cable to the TV, you must use the HDMI 1 port. I got this from the Samsung user manual:

"Use the HDMI IN 1(DVI) jack for DVI connection to an external device. Use a DVI to HDMI cable or DVI-HDMI adapter (DVI to HDMI) for video connection and the PC/DVI AUDIO IN jacks for audio. When using an HDMI / DVI cable connection, you must use the HDMI IN 1(DVI) jack."


Hope this helps
 

fx2236

Distinguished
Nov 18, 2008
Actually, I have read this and tried it, but the problem still exists. Also, my laptop has an HDMI port, so I'm connecting it to the TV using an HDMI cable. Whenever I use the VGA port the picture quality is excellent, but I want to connect my laptop using HDMI, not VGA.
 