trifler :
ZippyPeanut :
And this reminds me of another question that's been rolling around in the back of my mind for a while: At one time it was highly recommended that you play games at the monitor's native resolution. One thing that has kept me from getting a 4K or 2560x1440 (or 1600) monitor is that in some games my two overclocked 670s in SLI won't yield an acceptably high frame rate at such high resolutions. In such cases, I'd like to just drop the resolution down to 1080p, but I'm still haunted by the advice that you should play at the monitor's native resolution. I wonder if that old advice still applies to today's newest monitors.
Modern monitors are better about it, but LCD technology will always have lower image quality at non-native resolutions because the pixels on an LCD are fixed. The old CRT monitor technology could adjust resolution on the fly because it literally drew the image on the back of the screen. My understanding is that when an LCD displays a lower resolution, it combines two or more pixels and draws them as one. Newer monitors have improved at blending the colors of the combined pixels to make them look less blocky.
Nope, running 1920x1200 on a 1600p monitor was still blurry on my HP ZR30w when I had it, because State of Decay (SOD) had a max resolution of 1920x1200...
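For anyone curious why that looks soft, here's a rough Python sketch of the idea (not any monitor's actual scaling algorithm, just simple linear interpolation): when 1920 source pixels have to be stretched across 2560 physical pixels, most output pixels land between two source pixels and get a blend of both, so a hard edge turns into a little gradient - and that's the blur.

```python
def upscale_row(src, out_width):
    """Linearly interpolate a 1D row of pixel values up to out_width samples."""
    scale = (len(src) - 1) / (out_width - 1)
    out = []
    for x in range(out_width):
        pos = x * scale               # position in source coordinates
        left = int(pos)
        right = min(left + 1, len(src) - 1)
        frac = pos - left             # how far between the two source pixels we landed
        out.append(src[left] * (1 - frac) + src[right] * frac)
    return out

# A hard black/white edge in a 1920-pixel-wide source row...
row_1920 = [0] * 960 + [255] * 960
# ...picks up in-between grey values once stretched across 2560 pixels:
row_2560 = upscale_row(row_1920, 2560)
print([round(v) for v in row_2560[1276:1284]])  # 0s and 255s with blended greys at the edge
```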
trifler :
A lot of gamers say they were able to switch off anti-aliasing at 1440p without losing clarity, because the higher resolution made up for it. I haven't tried it myself. Of course, the next time you upgrade your video card(s), it won't be an issue, at least not at 1440p.
I did notice that some anti-aliasing blurs the edges a bit to make them look smoother and less jagged, while turning it off makes the image clearer and sharper, but also more jagged.
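Some rough numbers behind that trade-off, by the way (the panel sizes below are just illustrative, not anyone's actual monitor): on a similar-sized screen, a higher resolution means physically smaller pixels, so each stair-step on an un-anti-aliased edge is smaller and harder to notice.

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

for name, w, h, size in [
    ('1080p @ 24"', 1920, 1080, 24),
    ('1440p @ 27"', 2560, 1440, 27),
    ('2160p @ 27"', 3840, 2160, 27),
]:
    density = ppi(w, h, size)
    print(f"{name}: {density:.0f} PPI, pixel ~{25.4 / density:.3f} mm wide")
```

Higher pixel density doesn't remove the jaggies, it just shrinks them, which is why opinions differ on whether AA can be skipped at 1440p and up.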
trifler :
On another note for the future, for gaming I personally would recommend 1440p at 144Hz over 2160p (4K) at 60Hz. Opinions vary, but I find most people who opt for 2160p at 60Hz haven't even tried 144Hz, much less with G-Sync/FreeSync.
I've tried 144Hz but opted for 2160p @ 60Hz instead. Don't get me wrong, 144Hz is great: less tearing at the higher refresh rate and less input lag to boot, which in turn makes for smoother gameplay - provided your PC hardware can dish out at least 144fps consistently at whatever resolution your monitor can display at 144Hz...
Now, the reason I went straight to 2160p @ 60Hz rather than 144Hz @ 1440p is that I already had a 1600p monitor, so I know what it's like to play at that resolution - and 1440p has slightly fewer vertical pixels, so it'd be similar. Now I just want more screen real estate and to see what it's like to play at native 2160p! Besides, as far as I know, there's no hardware at this point in time capable of playing *any* game at a consistent 144fps @ 2160p (otherwise I'd love a 144Hz 2160p monitor instead, but I couldn't wait for a 144Hz version to come out, so I pulled the trigger and bought it). So I thought, why not - at least I'll be with that group of 4K/UHD'ers, heh.
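Just to put some back-of-the-envelope arithmetic behind that choice (pure pixel counting, not a benchmark of any actual GPU - real game load doesn't scale perfectly linearly with resolution):

```python
modes = {
    "1440p @ 144Hz": (2560, 1440, 144),
    "1600p @ 60Hz":  (2560, 1600, 60),
    "2160p @ 60Hz":  (3840, 2160, 60),
}

for name, (w, h, hz) in modes.items():
    pixels = w * h
    frame_budget_ms = 1000 / hz      # time the GPU has to finish one frame
    pixels_per_sec = pixels * hz     # rough measure of sustained fill-rate demand
    print(f"{name}: {pixels / 1e6:.1f} MP per frame, "
          f"{frame_budget_ms:.1f} ms per frame, "
          f"{pixels_per_sec / 1e6:.0f} MP/s")
```

Driving 1440p at a constant 144fps pushes roughly as many pixels per second as 2160p at 60fps, but with well under half the time budget per frame, so either way the GPU has to be seriously quick.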
bicycle_repair_man :
I have a 2560x1440 25" monitor and play most of my games at that resolution. Anti-aliasing is a non-issue as the higher resolution does indeed keep the jaggies at bay.
Perhaps to you, if you don't mind it or can't see it, but I can, and depending on the situation I may or may not mind jagginess on edges. I'll pick Tomb Raider (2013) as an example. At 2560x1600 I could still see jagginess on edges, but when I turned on 4x SSAA it was all gone and looked beautiful.
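If it helps anyone picture what 4x supersampling is doing, here's a tiny conceptual sketch (not the game's actual implementation): the scene is rendered at twice the width and height, then every 2x2 block of samples is averaged down to one final pixel, so edge pixels that were only partly covered by geometry come out as in-between shades instead of hard steps.

```python
def downsample_4x(hi_res):
    """Average each 2x2 block of a double-resolution render into one output pixel."""
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[0]), 2):
            block = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A diagonal edge rendered at 2x resolution, pure black (0) and white (255) samples...
hi = [
    [255, 255, 255, 255],
    [0,   255, 255, 255],
    [0,   0,   255, 255],
    [0,   0,   0,   255],
]
# ...averages down to greys along the edge instead of hard black/white steps:
print(downsample_4x(hi))  # [[191.25, 255.0], [0.0, 191.25]]
```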
bicycle_repair_man :
Playing games at a lower resolution than the monitor's native one doesn't affect my gaming experience, but playing at a resolution with a different aspect ratio certainly would.
Both of these affect *my* gaming experience... hahaha, I can't stand the blurriness when a game doesn't support the monitor's native resolution... *cough* SOD *cough*