1920x1200 60 Hz vs 1920x1080 144 Hz

andy_93

Distinguished
Dec 11, 2009
593
0
19,010
Hi, I've had an Acer G24 monitor for a good few years now and have mostly been happy with it. It has a 1920x1200 resolution, but the one big problem for me is that it has always suffered from bad screen tearing. I was looking at a 4K monitor now that they have come down in price, but I only have a Sapphire 7950 OC edition, so we are talking an upgrade to another card as well. I am happy with the card and was going to wait until the next wave of GPUs comes out, as I think they will be better suited to 4K. Since I feel I am not getting the best out of the card I have now, as I have to run Vsync in every game I play, I was looking at a cheap sidestep, if you will, to a 1920x1080 144 Hz monitor. Do you guys think there will be much difference between the two in terms of picture clarity, as I would be losing pixels but gaining refresh rate? Or do you have any other suggestions for something my card could run without compromising too much on graphical settings in games?
Many thanks
Andy
 

gridironcj

Honorable
Dec 23, 2012
215
0
10,710
Yes, definitely wait to upgrade, as AMD has their next wave coming in the first half of 2015. For a new monitor, I would highly, highly recommend a G-Sync display. You'll never have to worry about screen tearing vs. V-sync stuttering ever again. However, G-Sync doesn't work with AMD cards, so you'd have to wait for FreeSync, and we don't yet know how that will match up, since it's obviously still in development. On the Nvidia side, a pair of GTX 970s would be decent for 4K (not amazing, though). Nvidia is rumored to be releasing GM200-based cards early next year, with a 50% increase over the Titan Black.
 
G

Guest

Guest
I would pick the 1080p 144 Hz. It depends on what games you play, though. I personally only use it for FPS games (Counter-Strike: Global Offensive), and sometimes you might feel like you have hacks on: your image is more fluid, so you will see your opponents SLIGHTLY before they see you. There isn't much of an advantage, but I wouldn't go back. For other games such as LoL, Skyrim or even Dota 2 you would see the difference, but you wouldn't feel the need to have it. It also depends on whether your card can handle it: if your card runs a game at no more than 60 FPS, it won't look any better or more fluid.
 

NiCoM

Honorable
I actually just ordered a "cheap" 144 Hz screen on Black Friday, the BenQ XL2411Z. I'm replacing an Acer 1080p/60 Hz screen.

Part of the reason was actually the screen tearing as well: since the screen refreshes with a smaller delay in between (over 2x smaller), the tearing should be smaller. Even if your card can't push 144 fps, the screen will be able to output the full frame faster, which should give a more "G-Sync"-like feel (not anywhere close to real G-Sync), though this is just a thought.

For gaming, the 16:10 aspect ratio of a 1920x1200 screen will in many cases actually give you a smaller FOV in games, kind of like taking a 1080p image and cutting a chunk off the left and right sides. My cousin upgraded from a 1200p screen to a 1080p/144 Hz BenQ XL2720Z, and he absolutely loves his new screen: fewer pixels, but a better view in games (FOV) because 1080p is a 16:9 aspect ratio.

EDIT: FreeSync was mentioned in a post above. It was made for laptops to alter how many Hz the screen runs at, not to refresh every time the screen receives a frame. In reality it's just a power-saving feature on AMD laptops that lowers the Hz; only because of G-Sync is AMD trying to use it as a kind of counter to that. I've looked at the VESA specs for FreeSync, and they only mention altering the amount of Hz the screen runs at; they don't mention refreshing when a frame is received or anything like that. I'm pretty disappointed about it myself, but that's the truth. It will probably not work like G-Sync, though it will probably give you a G-Sync-like feel.
 

andy_93

Distinguished
Dec 11, 2009
593
0
19,010
Thank you so much for your input, guys. I have just started playing Far Cry 4, and in the settings the resolution will only go up to 1080p; in all honesty I could notice the difference from 1200p (still get screen tearing, though). So I think I will take on board all the advice you have given me and combine it: I'm thinking of going for a cheapish 144 Hz monitor while waiting for the next wave of GPUs. Thanks again for all your advice, it's very much appreciated.
Cheers
Andy.
 

The amount of screen tearing is completely dependent on the video card's framerate, not your monitor's refresh rate. The picture goes video card -> framebuffer -> monitor. If the video card is drawing at 90 fps, then a new picture gets drawn to the framebuffer every 1/90 sec.

• If the monitor is 60 Hz, it reads the framebuffer every 1/60 sec. This means some of the frames the video card draws are never displayed.
• If the monitor is 144 Hz, it reads the framebuffer every 1/144 sec. This means some parts of the picture the video card draws get displayed for 2 frames.
Tearing happens when the video card is in the middle of drawing to the framebuffer when the monitor reads from it. Regardless of whether the monitor is reading at 1/60 sec or 1/144 sec intervals, the amount of tearing will be how much the image has changed in a single frame interval of the video card - 1/90 sec in this example. It does not depend on refresh rate.
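To put rough numbers on that (just an illustration using the 90 fps figure from above, nothing measured):

```python
# Rough illustration of the point above (made-up timings): the content above
# and below a tear line comes from two consecutive GPU frames, so the size of
# the "jump" across the tear is set by the GPU's frame interval, not by how
# often the monitor samples the framebuffer.

gpu_fps = 90
tear_jump_ms = 1000 / gpu_fps
print(f"Mismatch across a tear: ~{tear_jump_ms:.1f} ms of motion")

for refresh_hz in (60, 144):
    sample_interval_ms = 1000 / refresh_hz
    print(f"At {refresh_hz} Hz the framebuffer is sampled every "
          f"{sample_interval_ms:.1f} ms, but the jump is still ~{tear_jump_ms:.1f} ms")
```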

Vsync avoids tearing by using 2 framebuffers. The video card draws into framebuffer 1, 2, 1, 2, etc. When the monitor refreshes, it reads from the framebuffer that's not currently being drawn into. This ensures the monitor is always displaying a complete frame, without tearing.
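In very simplified terms, that double-buffering idea looks like this (a toy sketch with made-up names, not real graphics API code):

```python
# Toy double-buffering sketch (made-up names, not a real graphics API): the
# video card always draws into the buffer the monitor is NOT reading, and the
# buffers only swap once a frame is complete, so a refresh never scans out a
# half-drawn image.

front = ["frame 0"]   # buffer the monitor reads from
back = ["(empty)"]    # buffer the video card draws into

def render(buffer, n):
    buffer[0] = f"frame {n}"   # pretend this takes 1/90 sec

for n in range(1, 5):
    render(back, n)            # draw the next frame off-screen
    front, back = back, front  # swap only when the frame is finished
    print("monitor scans out:", front[0])
```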

Do the old style tube fluorescent lights bug you? Can you see them flickering? They flicker at 120 Hz. Most people can't see the flicker and 144 Hz is wasted on them. Others (like me) can see the flicker and might get some benefit from refresh rates higher than 60 Hz. The advantage is very tiny however - average human reaction time is about 1/6 sec, so your own body is a bigger factor. (Your peripheral vision is better at seeing the flicker than your center of vision. So you may find it easier to see the flicker if you don't look at the light.)

The reason 120 Hz displays were made was to deal with a problem unique to displaying movies on TVs. Movies are (were) shot at 24 fps. If you try to display them on a 60 Hz TV, you have to show each movie frame for 2, 3, 2, 3, etc. TV frames. This creates a subtle herky-jerky motion called judder on smooth panning shots. But if your TV is 120 Hz, you can just display each movie frame for 5 TV frames, and the smooth panning shot remains smooth.
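The arithmetic behind that is just the division:

```python
# How many display refreshes each 24 fps film frame has to occupy.

film_fps = 24
for panel_hz in (60, 120):
    print(f"{panel_hz} Hz / {film_fps} fps = {panel_hz / film_fps}")
# 60 Hz  -> 2.5, so frames alternate 2, 3, 2, 3 refreshes (judder)
# 120 Hz -> 5.0, so every film frame gets exactly 5 refreshes (even pacing)
```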

144 Hz will have smoother panning for a similar reason (it's doing more frequent samples of drawn frames, so the delay between when the frame was drawn and when it's displayed is smaller). But the difference is very slight.
 

andy_93

Distinguished
Dec 11, 2009
593
0
19,010
Well, when I started playing Far Cry 4, there is a panning shot at the beginning and I do get the "herky-jerky" motion you described, but when I enable Vsync it's not there. Still, I feel that if I constantly have to have Vsync enabled in games, then I am not getting the best out of the GPU I have; that's why I was looking towards a 144 Hz, or even 120 Hz, monitor. I may as well still be running my old 4780.
 

andy_93

Distinguished
Dec 11, 2009
593
0
19,010
I have something bugging me about displays. The general consensus for gaming size vs resolution is that with a 1080p display you don't want to go above a 24" monitor. I have been trying to decide between a 27" 2560x1440 and a 27" 1920x1080 144 Hz. At the moment I am running an Acer G24 at 1920x1200, and I have had it for around 5 or 6 years now; it still looks great, apart from the fact that I have to have Vsync on in every game I play due to screen tearing, but I would like to go to a larger monitor.

The thing that has been bugging me is that, before UHD, you could get a television anywhere from 28" up to 65" running at 1080p and the picture would still look OK, so why is it so different for monitors? The only thing I can think of is distance from the TV/monitor.

The other question, which I asked a bit back but got no feedback on: if I were to go from 1920x1200 60 Hz on a 24" to 1920x1080 144 Hz on a 27" monitor, would I notice much difference in picture quality, given that what I am losing in pixels I am gaining in refresh rate?
 

Yup, the distance matters. We specify monitors in terms of PPI (pixels per inch), but the number that really matters is pixels per degree. That depends on both display size and viewing distance. In terms of pixels per degree, a 24" 1920x1080 monitor viewed from 3 feet away is identical to a 48" 1920x1080 TV viewed from 6 feet away. If you ever put the two monitors at those distances, and take a picture, they'll be the exact same size in the picture (i.e. without any identifying information, you wouldn't be able to tell which is the bigger screen).
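If you want to check the geometry yourself, here's a quick sketch (the helper functions and the 16:9 width calculation are my own):

```python
# Doubling both the screen size and the viewing distance leaves the angular
# size, and therefore the pixels per degree, unchanged.

import math

def width_from_diagonal(diag_in, aspect_w=16, aspect_h=9):
    return diag_in * aspect_w / math.hypot(aspect_w, aspect_h)

def pixels_per_degree(width_px, width_in, distance_in):
    angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return width_px / angle_deg

print(pixels_per_degree(1920, width_from_diagonal(24), 36))  # 24" monitor at 3 ft
print(pixels_per_degree(1920, width_from_diagonal(48), 72))  # 48" TV at 6 ft -> identical
```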

20/20 vision is defined as the ability to distinguish a line pair with a separation of 1 arc-minute. If you do the math, this works out to about 286 PPI when viewing something 2 feet away. ( 2 pixels / (24 inches * tan ( 1/60 degree)) ). That's where the common target of 300 PPI for "retina" displays (300 DPI for printers) comes from. When viewed from 2 feet, a 300 PPI image is essentially "perfect" (your eye can't distinguish it from reality). Note that higher PPI is useless at 2 feet, but can be useful if you hold your phone closer to your eye. The Oculus Rift for example is basically a Galaxy Note 3 screen (5.7" 1080p) mounted about 4 inches from your eyes, and people complain the PPI is not high enough and they can see the pixels.
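Worked out in a couple of lines (same numbers as above):

```python
# A line pair (2 pixels) spanning 1 arc-minute at a 24 inch (2 ft) viewing distance.

import math

viewing_distance_in = 24
inches_per_arc_minute = viewing_distance_in * math.tan(math.radians(1 / 60))
print(round(2 / inches_per_arc_minute))  # ~286 PPI
```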

Decent image quality can be achieved at roughly half this - 150 PPI when viewed from 2 feet. Since most monitor viewing distances are about 3-4 feet, most monitors aim for 75-100 PPI. A 24" 1080p screen, for example, is 92 PPI. For general computing tasks, movies, and gaming, this is a pretty good figure, but it will be insufficient for fine graphics work like photo editing.
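For reference, the PPI figures for the sizes being discussed in this thread (diagonal pixel count divided by diagonal inches):

```python
# PPI = diagonal resolution in pixels / diagonal size in inches.

import math

for w, h, diag in [(1920, 1080, 24), (1920, 1080, 27), (2560, 1440, 27), (1920, 1200, 24)]:
    print(f'{w}x{h} @ {diag}": {math.hypot(w, h) / diag:.0f} PPI')
# -> 92, 82, 109, 94
```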

The other question, which I asked a bit back but got no feedback on: if I were to go from 1920x1200 60 Hz on a 24" to 1920x1080 144 Hz on a 27" monitor, would I notice much difference in picture quality, given that what I am losing in pixels I am gaining in refresh rate?
It depends on how you're going to use the screen. If you're going to change your viewing distance so that the two monitors are about the same angle of view, then no you won't notice any difference. In fact the larger screen size of the 27" is pointless.

If your viewing distance is fixed, then yes the 27" will project a bigger image on your eye, but it will have lower PPI. Frankly, unless you're uncomfortable with your regular viewing distance for a 24" (focusing closer for extended periods can cause eyestrain), I would base the entire decision on the extra 120 vertical pixels vs the higher refresh rate. Monitor size shouldn't be a factor since you can counteract it simply by moving your seat forward/back a few inches. Leave the larger monitors for those of us with older eyes who can't focus as close anymore, and so need a physically bigger screen for us to be able to see smaller details.
 

Timaphillips

Distinguished
Apr 25, 2008
40
0
18,540
I've been playing on a 27" 144 Hz monitor for about 6 months now and it's awesome. When a game runs at a stable 144 fps it's so clear and fluid that it's really hard to go back to "lower" frame rates. Obviously you'll need a good GPU, and occasionally you'll have to turn things down from their "ultra" settings to get higher FPS, but it's worth it. I had a 24" but have always preferred the 27" size. I honestly never noticed any lower visual quality going from a 24" to a 27". I got that answer a lot when I was looking around too, so just put that to bed now.

Metro 2033 Redux, iRacing, Borderlands, and Assetto Corsa, to name a few, all run at 120+ FPS for me.
 

NiCoM

Honorable


Yes, I know 144 Hz or 120 Hz isn't the solution, and that's definitely not what I said in my post either. ;)

I said "the screen tearing should be smaller", which it definitely is.
I've bought a 144 Hz monitor in the meantime and tested it in my favorite extreme-screen-tearing game, Minecraft. The tearing is close to invisible. Why? Because it's gone before you even get to see it. I personally only noticed it a few times, at both 60 fps and 145 fps, and only when I was really trying to see it.
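To put a number on "gone before you even get to see it" (just the refresh interval arithmetic):

```python
# A tear only persists until the next refresh overwrites it, so the worst case
# is one refresh interval on screen.

for refresh_hz in (60, 144):
    print(f"{refresh_hz:>3} Hz: a torn frame lasts at most {1000 / refresh_hz:.1f} ms")
# ->  60 Hz: ~16.7 ms, 144 Hz: ~6.9 ms
```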



And for andy_93:
I can recommend the BenQ XL2411Z; it's a pretty nice and cheap 144 Hz monitor and I've been testing it out for almost a month now. Tom's also recommended a cheap AOC 24" monitor (I can't remember which of the two right now, and couldn't find it). The AOC is better for gaming but much worse for color reproduction, if I remember correctly. ;)
 

andy_93

Distinguished
Dec 11, 2009
593
0
19,010
Sorry for the late reply, I had man flu over Xmas :( But thanks for all your replies and the monitor recommendations; I will be sure to look those up. I really do have some in-depth info to go on. Thanks again.