Overscan issue for 1080p monitor

Sudheer singh

Reputable
Oct 20, 2014
Hello
I have an overscan issue on my S2240L monitor when it is connected over HDMI, but it works fine when connected via VGA. This is all with my new graphics card, an R7 250. I finally solved it by setting overscan to zero in CCC, but I wonder whether it affected the screen quality. My question is: since my monitor has only VGA and HDMI ports, will this overscan issue still exist if I connect via a DVI-to-HDMI cable? And if I buy a good-quality short VGA cable, will it match the quality of an HDMI/DVI cable?
 

hjj174

Honorable
Feb 8, 2014
I'm pretty sure AMD cards always do the overscan thing when you first use them. VGA can support up to 1080p, while HDMI can support even higher resolutions. Using HDMI is preferred since it uses a digital signal, unlike VGA, which is analog.
 
nukemaster

With the overscan/underscan corrected in your video card drivers, most screens will look about as good as they are going to.

In the rare cases of quality loss, it would be very noticeable and you would want to rip your eyes out.

My best guess is that you are fine with adjusted HDMI.

Now onto the DVI -> HDMI question. It depends on the monitor and the video card's setup. I actually use a DVI -> HDMI cable because my video card has a mini HDMI connector that does not seem like it would be very strong. My video card still knows it is an HDMI device, and this offers overscan options. A native DVI screen does not have, nor need, these options.

The option only exists because most TVs overscan to remove any garbage from the edges of the video (this in itself should not be needed anymore, since digital streams do not seem to have this issue).
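To put a rough number on what that crop-and-zoom costs, here is a quick sketch; the 5% overscan figure is just an assumed, typical value, not anything measured in this thread:

```python
# Rough sketch of what TV-style overscan does to a 1080p frame.
# The 5% figure is an assumed, typical value; real TVs vary.

def overscan_crop(width, height, percent):
    """Return the visible (cropped) size and the zoom factor needed
    to stretch that crop back out to the full panel."""
    crop_w = round(width * (1 - percent / 100))
    crop_h = round(height * (1 - percent / 100))
    zoom = width / crop_w  # how much the cropped image gets magnified
    return crop_w, crop_h, zoom

w, h, zoom = overscan_crop(1920, 1080, 5)
print(f"Visible area: {w} x {h}, blown back up by ~{zoom:.2f}x")
# Visible area: 1824 x 1026, blown back up by ~1.05x
```

The quality concern comes from that rescale: the panel draws 1,920 columns from only 1,824 source columns, so pixels no longer map one-to-one.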

With a quality cable and a good ADC in the monitor, VGA can look very close to DVI/HDMI/DP, but text may still be slightly less sharp. Since part of the quality lies within the monitor, all screens may be a bit different.

My SyncMaster 245T is one of those screens that HDMI looks awful on (because it expects HDMI to only ever be used for TV, and thus has awful overscan [fine for TV, I guess] not found on 99% of computer screens). On that screen VGA looks MUCH better than HDMI; compared to DVI, VGA is only a tiny bit less sharp on text, and in games/videos it would be hard to see any difference because it is that close.

In the end, one of the biggest downsides to VGA is a total lack of copy protection (HDCP). This means Blu-ray and some other high-definition content may not work at full resolution, or at all in some cases. Any video without this protection requirement will work fine.

Add that to relying on the screen's analog-to-digital converter quality, and recommending VGA is hard these days (not that it is actually that bad on the right hardware).

EDIT: Saw that another post has been made.

hjj174, VGA can support over 1920 x 1080 (I think it maxes out around 2048 x 1536 [not bad for such an old standard], but you need one hell of a cable, card, and screen). HDMI requires later versions with a higher clock rate to pass 1920 x 1200 without reducing the refresh rate. With 1080p (thanks for the naming convention, Hollywood; computer screens have been progressive since the 80s [or earlier], so the "p" is kind of redundant) being the "standard", I do not think it will matter too much in this case.
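To put back-of-envelope numbers on those clock rates (a rough sketch; the ~25% blanking overhead and the 165 MHz single-link figure are standard ballpark values, not anything tied to this thread):

```python
# Very rough pixel-clock estimate for a few modes.
# Assumes ~25% blanking overhead (classic CRT-style timings);
# reduced-blanking modes need noticeably less.

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Active pixels per second plus blanking, in MHz (approximate)."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for mode in [(1920, 1080, 60), (1920, 1200, 60), (2048, 1536, 60)]:
    print(mode, f"~{approx_pixel_clock_mhz(*mode):.0f} MHz")

# Early single-link DVI/HDMI tops out around a 165 MHz pixel clock,
# which is why 1920 x 1200 @ 60 Hz needs reduced blanking or a newer
# HDMI revision, and why 2048 x 1536 was realistically a VGA or
# dual-link DVI affair.
```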

Still agree that digital is the way to go.
 

Sudheer singh

Reputable
Oct 20, 2014
Thank you, nukemaster, for your valuable input. In my setup, overscan is not done by cropping the edges of the window and zooming; it is just zooming in without any crop. The screen quality is good as it is. But somewhere I had read that overscan is done by losing some pixels and thus affects the visual quality.
As you said, if I connect the monitor to the PC via a DVI-HDMI cable, does the video card recognize it as an HDMI device? Is there any advantage to using this DVI-HDMI cable to avoid the overscan issue?
 

hjj174

Honorable
Feb 8, 2014

I'm thinking the reasoning behind the "p" is that there are also 1080i TVs/monitors that use fields instead of pixels. Fields being the equivalent of half pixels means that the FPS is actually half that of a 1080p monitor. A 1080i screen at 30 FPS, which is typical, has approximately the same pixel rate as a 720p screen at 60 FPS, remembering of course that fields are half of a pixel.
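As a quick sanity check on that pixel-rate comparison (counting each 1080i field as 1920 x 540, i.e. every second line of the frame, as a later reply clarifies, rather than as "half pixels"):

```python
# Active pixels pushed per second by each mode, counting a 1080i
# field as 1920 x 540 (every second line of the frame).

modes = {
    "1080i @ 30 frames/s (60 fields/s)": 1920 * 540 * 60,
    "720p  @ 60 frames/s":               1280 * 720 * 60,
    "1080p @ 60 frames/s":               1920 * 1080 * 60,
}

for name, px_per_sec in modes.items():
    print(f"{name}: ~{px_per_sec / 1e6:.1f} million pixels/s")

# 1080i30 (~62.2 M) and 720p60 (~55.3 M) really are in the same
# ballpark; 1080p60 (~124.4 M) moves twice as many pixels as 1080i30.
```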
 

hjj174

Honorable
Feb 8, 2014


Not that I have noticed. It just appears to make the viewable area on screen smaller at first, but at the same resolution; then when you turn overscan up, it corrects it to the right size. I could be wrong, but in most applications of it there won't really be any loss of quality.
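A way to picture that driver-side adjustment (just a sketch; the 10% step is an arbitrary example value, and real drivers choose their own ranges):

```python
# Sketch of GPU-side underscan/overscan compensation: the desktop
# stays 1920 x 1080, but the driver scales the whole image into a
# smaller (or full-size) rectangle of the output signal. Nothing is
# cropped; the percentages are arbitrary example values.

def underscan_target(width, height, percent):
    """Size of the rectangle the full desktop is scaled into."""
    return round(width * (1 - percent / 100)), round(height * (1 - percent / 100))

desktop = (1920, 1080)
print("10% underscan ->", underscan_target(*desktop, 10))  # (1728, 972)
print(" 0% underscan ->", underscan_target(*desktop, 0))   # (1920, 1080), pixel-for-pixel
```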
 

Sudheer singh

Reputable
Oct 20, 2014
I too didn't find any loss in quality. But somewhere I read that overscan is done by cropping the edges of the actual screen and zooming so that the picture gets bigger. Here, CCC is actually not cropping, just zooming in to fit the monitor screen. The resolution is the same before and after overscan.
 
I have been out for a while, but while fields do refresh twice per frame (so 60 fields per second for 30 fps), I would not call them half a pixel; they are actually every second line (even and odd). I have done lots of work with video.
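For anyone who wants to see what "every second line" means, here is a tiny illustration (numpy is used only for convenient slicing; the frame contents are made up):

```python
import numpy as np

# A made-up 1080p frame: 1080 rows x 1920 columns of pixel values.
frame = np.arange(1080 * 1920).reshape(1080, 1920)

# Interlacing sends the frame as two fields, each holding every
# second line of the full frame -- not "half pixels".
top_field    = frame[0::2]  # even lines: 0, 2, 4, ... (540 rows)
bottom_field = frame[1::2]  # odd lines:  1, 3, 5, ... (540 rows)

print(top_field.shape, bottom_field.shape)  # (540, 1920) (540, 1920)
```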

I have never noticed any quality loss with over/underscan on normal screens. If the screen has bad (too much) built-in scaling (SyncMaster 245T, I am looking at you; it makes HDMI unusable for a computer), it may happen, but it should not be a problem for any monitor or modern TV.
 
Solution