Just got a new monitor: 1080p (runs at 1920x1080) and it's 27". Why is everything grainy?

uber1984

Jan 26, 2011
My video is grainy; am I doing something wrong? I'm watching a Blu-ray (Serenity, the Joss Whedon film) and it's grainy, and ever since I got this new monitor, Netflix has been very grainy as well. Is it just because it's 27 inches instead of my old 19 inches, and I'm now seeing how bad the quality really is? If so, why is it still grainy when I shrink the video to half the size of the screen? It looks terrible. Is it ATI Avivo, or is it because I'm using PowerDVD 10? It can't be, since it happens in Netflix too. I think it might have something to do with my ATI Catalyst settings. I have 2x ATI Radeon HD 5770s in CrossFire on a 27" 1080p ViewSonic monitor, and I can't watch my shows anymore because the grain is driving me nuts. Please HELP! Even a link to something I could read to teach myself would be wondrous.
 
Oh, never mind about the border; I was thinking it might be an HDMI overscan problem, which creates a ~1" black border and shrinks the rest of the screen.
So no matter the video, the playback is bad? I can understand YouTube, because of its poor quality to begin with, but Blu-rays played locally as well?
 


One thing to consider here is that a 480p video, for example, didn't need to be enlarged much to fill your previous monitor, but on your new monitor it gets stretched further, and to make matters worse, each pixel is physically larger, which makes it look worse still.

With video games, the image is drawn sharply at the new resolution; the only drawback is that it now takes more processing power. Video, on the other hand, keeps its original image quality and is simply stretched to fit the screen.
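To put rough numbers on that, here's a quick Python sketch. The 19" 1440x900 and 27" 1920x1080 figures come from this thread; the rest is just arithmetic.

import math

def pixel_pitch_mm(diag_in, w_px, h_px):
    # Physical size of one pixel in millimetres.
    diag_px = math.hypot(w_px, h_px)
    return diag_in * 25.4 / diag_px

old_pitch = pixel_pitch_mm(19.0, 1440, 900)   # ~0.284 mm
new_pitch = pixel_pitch_mm(27.0, 1920, 1080)  # ~0.311 mm

# How much a 480-line video must be stretched to fill the screen height:
old_stretch = 900 / 480    # ~1.88x
new_stretch = 1080 / 480   # 2.25x

print(old_pitch, new_pitch, old_stretch, new_stretch)

So the 27" both stretches the video further and renders each stretched pixel physically larger.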
 
At 27", 1920x1080 starts to look a bit blocky due to the low DPI of the screen -- 1920x1080 looks better on a 23" or 24" screen; at 27" you'd need closer to 2560x1440 for a crisp-looking image. What is the native resolution of the monitor?

EDIT: Yeah, that monitor isn't the best quality for the size. Here is an excerpt from a review:

It certainly seems squarely aimed at gamers and media playback, which is backed up by the 1920 x 1080 (1080p) native resolution at 16:9 aspect ratio. While this looks impressive on paper, it's closer to the minimum acceptable resolution than you might think due to the screen's sheer size. Lines aren't nearly as clean and crisp up close, and it's worth bearing in mind that you really need to be at least a couple of feet away from the monitor when sat at a desktop - ideally 3 or 4, which may be a stretch for those with smaller spaces.

Read more: Viewsonic VX2739wm monitor review - Pocket-lint http://www.pocket-lint.com/review/4824/viewsonic-vx2739wm-fullhd-monitor-review#ixzz1CC9rJFw6
 


I'm no expert, but doesn't Blu-ray use 1920x1080?
 
I'd be inclined to agree that the image looks grainy because the screen is so large... but when you reduce the window from fullscreen to about a quarter of the monitor, it's still grainy. You'd think that reducing the stretching would eliminate the grain. Am I wrong?
 
I've been playing with the Advanced Quality section in the video settings of Catalyst Control Center, and it seems the De-Noise and Mosquito Noise Reduction settings make the grainy look go away. Edge Enhancement makes it look terrible, so I turned that off; Dynamic Contrast was already off, but De-Blocking, Noise Reduction, and De-Noise seem to make the video much more bearable. Could the quality of my DVI cable be letting noise in? Maybe it's not grain but noise?
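For anyone curious what this kind of filtering does, here's a rough Python/OpenCV sketch of the same idea. OpenCV's non-local-means filter is not the algorithm Catalyst uses, just a similar concept, and "frame.png" is a placeholder for a grabbed noisy frame.

import cv2

frame = cv2.imread("frame.png")  # placeholder: a screenshot of a noisy frame

# Arguments: src, dst, h, hColor, templateWindowSize, searchWindowSize.
# Higher h/hColor removes more noise but also smears fine detail --
# the same trade-off as the driver's De-Noise slider.
clean = cv2.fastNlMeansDenoisingColored(frame, None, 7, 7, 7, 21)

cv2.imwrite("frame_denoised.png", clean)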
 
I think you are seeing a low-resolution video being stretched across a much larger screen than you're used to. Imagine taking your small screen and blowing it up in size: it would look a lot blockier than before. It's not an issue for games or even the desktop, since they simply render at the higher resolution, but videos don't gain detail when blown up.
 
And you also have to remember that a 27" diagonal screen at 1920x1080 has much larger pixels than a 23" diagonal screen with the same number of pixels -- so if you sit fairly close to the screen, you are going to see a lot more grain. Most 27" monitors are 2560x1440, so that their DPI matches a 23" or 24" 1920x1080 screen.

 
I think what I've learned from this experience is that resolution matters ^^ 1440x900 on a 19" just looks clearer than 1920x1080 on a 27"... but using these noise reductions has helped, which doesn't make sense to me, because as I understand it, noise should only matter on an analog signal, and this is a digital signal... so I'm having trouble understanding why turning up the noise reduction matters... 101010100110 is just on and off... heh. Unless the "noise" is compression artifacts already baked into the video file, rather than anything the cable adds?
 


I figure that, ideally, a PC monitor uses 96 DPI (at least that was the old standard years back), so a 1920x1080 screen would ideally be 20" x 11.25" (which is exactly what my 23" diagonal monitor is!), whereas your monitor is 23.5" x 13.2", or about 81-82 DPI. So you are getting about 14 fewer pixels per inch of screen space; in effect, what "should" be a 20" x 11.25" screen is stretched to 23.5" x 13.2", which is going to look a bit blocky.
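Checking that arithmetic in Python (96 DPI is the classic Windows assumption; the 23.5" x 13.2" panel size is the figure quoted above):

IDEAL_DPI = 96
w_px, h_px = 1920, 1080

ideal_w_in = w_px / IDEAL_DPI   # 20.0 in
ideal_h_in = h_px / IDEAL_DPI   # 11.25 in

actual_dpi = w_px / 23.5        # ~81.7 DPI on the 27" ViewSonic

print(ideal_w_in, ideal_h_in, actual_dpi)

81.7 DPI is right about 14 pixels per inch short of the 96-DPI ideal.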

 
I think I am having a similar problem. I just bought a new monitor (BenQ G2220HD), which is 1080p, 16:9 aspect ratio, 21.5" screen size. I upgraded from a 19" Samsung SyncMaster 920NW, and now anything with motion -- videos, games, or just dragging windows around -- looks very grainy, with a whole lot of jittering. I found a YouTube video that shows what I am experiencing, as it is impossible to capture in screenshots.
Maybe you are having the same problem?

http://www.youtube.com/watch?v=5FcjKY2vOtc
 
Your G2220HD has a DPI of about 102, which is pretty good. Full-HD resolution is usually used on 23-24" screens, and it looks great there, so the same resolution on a 21.5" screen will look even better.
You have 2,073,600 pixels on roughly 198 square inches of on-screen real estate; that's over 10,000 pixels per square inch. That won't compare to the pixel density of a phone (my HTC Wildfire has a relatively low-res screen and still manages 125 DPI), but you won't have your eyes one foot from the screen while watching a movie or playing a game, now will you?
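Here's the arithmetic for anyone who wants to check it, working back from the diagonal and the pixel counts:

import math

diag_in, w_px, h_px = 21.5, 1920, 1080

diag_px = math.hypot(w_px, h_px)   # ~2203 px along the diagonal
dpi = diag_px / diag_in            # ~102 DPI

w_in = w_px / dpi                  # ~18.7 in wide
h_in = h_px / dpi                  # ~10.5 in tall
area = w_in * h_in                 # ~198 sq in
density = (w_px * h_px) / area     # ~10,500 px per sq in

print(round(dpi), round(area), round(density))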
 
I do not want to take over uber1984's thread, so here is a link to the thread I made over 5 hours ago, which has not had any responses yet: http://www.tomshardware.com/forum/308186-33-display-card-interlacing-problem

But "your G2220HD has a DPI of about 102... so the same resolution on a 21.5" screen will look even better" -- those were my thoughts exactly, lol.
And I do not believe it is a viewing-angle problem; I sit about 30-36" away from my screen, so I know I am not too close.
 
GPU/DXVA decoding issues (HD 5870, Catalyst 10.8 or 10.9 and up):

Ever since Catalyst 10.8 or 10.9, I can no longer enable hardware acceleration without my video looking HORRIBLE. If I reinstall 10.7 or disable hardware acceleration (decoding purely on the CPU), the video looks normal. It is the same in VLC, Boxee for Windows, and MPC-HC (K-Lite). Some change they made is breaking things, but I can't find any information on it, so I am just avoiding hardware decoding.

Normally, hardware-accelerated playback looks EXACTLY the same as pure software decoding.