Is it worth going from 1080p to 2560x1440, considering the framerate loss?

Should I upgrade from a Full HD 1080p monitor to a 2560x1440 monitor?

  • Yes, you'll notice a huge difference at 2560x1440 even on lower settings (Votes: 3, 20.0%)
  • No, the higher settings you can achieve at 1080p will make up for the lower res. (Votes: 12, 80.0%)
  • Total voters: 15

TyTonusBurman

Distinguished
Jul 13, 2013
17
0
18,510
Hey everyone,

I've recently been gifted a 1080p monitor, and I'm having a lot of trouble deciding if I should switch it for a 2560x1440 monitor from Korea.

I've been looking at benchmarks for my ATI Radeon HD 7950 card, and I've noticed framerate losses of about 20 to 30 fps when going from 1080p to 2560x1600, and I'm wondering if you guys think it's worth the "upgrade".

My logic is that with a higher resolution, I can lower the quality settings and still get a better picture than I would at 1080p with ultra settings, just from having a higher res. I don't know if this is true, though.

Unfortunately, I haven't been able to find many benchmarks, but the ones I have found have been for Crysis 3, Battlefield 3, and Dirt 3. I don't usually play games THIS intense though. I play things more like Mirror's Edge, BioShock, Steam games, Minecraft, Bastion, etc.

So, what do you guys think? Should I go for a 2560x1600 monitor, or would I get a better gaming experience with 1080p?
 
If you take two monitors of the same size, one 1080p and the other 1440p, the 1440p one would have better image quality. However, when you play a game, you would not notice a huge difference between the two resolutions (well, except the frame drop). So, your 1080p monitor should provide a better gaming experience.
 
It's not worth it. There's a noticeable difference between the two resolutions, but it's not worth losing FPS or having to lower your graphical settings. I would not go above 1080p with a single HD 7950.

The higher settings don't exactly 'make up' for the lower resolution either, though. Only go 1440p if you can really afford it.
 
So, so far people are saying to go with 1080p, which is different from the answers I was getting before, where people were saying that after experiencing 2560x, they'd "never go back". A few more opinions would be nice.
 
I would say Medium/High settings are better on higher res than Ultra on a lower res, but my experience is going from 900p to 1080p. Plus, you need less antialiasing.

Let's see if we can get BigMack to comment - he recently upgraded to 1440p.
 


I upgraded from 1080p to 1440p exactly a year ago and had some time to test it on my HD 6970 before my new cards arrived. In my opinion it is not worth downgrading the visuals for the added resolution; mind you, this is MY opinion on the matter and as such not up for debate.

The claim that you require less AA with a 1440p screen, however, is not a matter of opinion. While it's true that you have a higher pixel density on a 27" 2560x1440 screen than you would on a 23/24" 1920x1080 one, you'll still need 4x MSAA/4x SMAA to effectively cancel out all of the visible aliasing in most games. With 2x, some jaggies will still remain, and although they are subtle, they can still be seen with the naked eye without having to get uncomfortably close to the screen.
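For anyone curious, the pixel-density gap is easy to put numbers on. A quick back-of-the-envelope sketch (my own rough math, using the 27" and 23/24" panel sizes mentioned above):

```python
# Rough pixels-per-inch comparison for the screen sizes discussed above.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return hypot(width_px, height_px) / diagonal_in

print(f'27" 2560x1440: {ppi(2560, 1440, 27):.1f} PPI')  # ~109 PPI
print(f'24" 1920x1080: {ppi(1920, 1080, 24):.1f} PPI')  # ~92 PPI
print(f'23" 1920x1080: {ppi(1920, 1080, 23):.1f} PPI')  # ~96 PPI
```

That works out to roughly 14-19% more pixels per inch at 27" 1440p - noticeable, but nowhere near enough to make aliasing disappear on its own, which fits the "you still need 4x" point above.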

I can only speak from my own experience, and I would not pay over $500 for a monitor, so I have to turn my settings down. You either go there all the way or you don't go at all.
 


Well, the screen I would be getting would be one of the Korean monitors like the Catleap or something similar, which go for about $320 + $80-100 shipping. Either way though, in your opinion, on a single 7950, I should stick with 1080p?

 


I have a 1440p monitor and a 7950 and have no problems running near max on every single game except Crysis 3, which is a mix of high and medium. BF3, ultra modded Skyrim, Far Cry 3, etc. all run at max or with the AA lowered one step, and I have no problems and would never go back to 1080p. I paid $320 out the door for my pixel-perfect Shimian and $260 for my Twin Frozr 7950 back in January of this year.
 
I would not recommend playing at 2560x1440 with a 7950. This resolution is best experienced with an SLI or CrossFire setup, considering that's what it'll take to max games out. There's also much higher VRAM usage at this resolution: most of my games use over 2GB, with some dancing a little above or below 3GB. I can't imagine playing without anti-aliasing either (jagged lines ruin the experience for me). Stick to 1920x1080 with your graphics setup.

As far as "never going back" to 1920x1080, it's really a matter of preference. For games like Battlefield 3, I prefer playing multiplayer on my 1920x1080 144Hz monitor due to the higher refresh rate (2560x1440 monitors are usually 60Hz, good luck pushing 120FPS at that resolution anyways). For games that have a crap-load of detail, I can sacrifice going to 60FPS for some amazing visuals. Once again though, it's a matter of preference.

If you decide on transitioning to 2560x1440, I would recommend a 3GB card at a minimum. I don't consider Battlefield 3 a graphically demanding game by any means, but even it uses over 2GB of VRAM on most multiplayer maps.
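For a rough sense of the scaling involved: 2560x1440 pushes about 1.78x the pixels of 1920x1080, so framebuffers and other per-pixel render targets grow by roughly that factor, while textures (which make up most of that 2-3GB) don't change size with resolution. A hedged back-of-the-envelope sketch, assuming 4 bytes per color sample and only a single color plus depth buffer (real engines use many more render targets, so actual totals are far higher):

```python
# Ballpark arithmetic only: per-pixel render-target memory scales with the
# pixel count and MSAA level; the byte counts below are assumptions.
def render_target_mb(width, height, msaa_samples=1, bytes_per_sample=4):
    # One color buffer plus one depth/stencil buffer, both multisampled.
    return width * height * msaa_samples * bytes_per_sample * 2 / 1024**2

for w, h in [(1920, 1080), (2560, 1440)]:
    print(f"{w}x{h}: {w * h / 1e6:.2f} MP, "
          f"no AA ~{render_target_mb(w, h):.0f} MB, "
          f"4x MSAA ~{render_target_mb(w, h, 4):.0f} MB")

print(f"Pixel ratio 1440p vs 1080p: {2560 * 1440 / (1920 * 1080):.2f}x")
```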

 
Just FYI, those "Korean IPS monitors" are typically for office and productivity use - they can play games, but I remember reading that the response times on those range from moderate to abysmal.

 


In my experience the '1440p 3GB VRAM minimum rule' is based solely on whether you can completely max out games. If you're not using above 4x MSAA/SMAA/TXAA or any form of SSAA/USAA, the only game that actually runs out of 2GB of VRAM at 2560x1440 is Crysis 3. A select handful of games seem to come close, but that has more to do with those games being able to utilize as much VRAM as is available, not with them actually coming close to running out of memory.

 


My speculation is that FXAA/SMAA does a fine job at 1440p, and there's not much reason to kill your framerate or max your VRAM with MSAA...
 


Ubersampling anti-aliasing. It renders every frame several times and stacks them on top of one another to completely get rid of jagged edges. It works just as well as 4x SSAA, but is just as (if not more) power-hungry. I haven't personally tested its VRAM usage though, as you can only find it in The Witcher 2 as far as I know.

My speculation is that FXAA/SMAA does a fine job at 1440p, and there's not much reason to kill your framerate or max your VRAM with MSAA...

FXAA and 2x SMAA are pretty much free in terms of performance and VRAM usage, but FXAA should be avoided as it basically lowers your effective resolution somewhat by blurring the entire image (I know that's not the correct term). There are also different levels of FXAA that vary from game to game, and these are usually not configurable even from .ini files, let alone in-game settings. Skyrim and Tomb Raider, for instance, both have very potent built-in FXAA that gets rid of 90% of aliasing but causes the image to lose some of its sharpness in the process. Games like Crysis 3 and Dishonored, on the other hand, use a lower level of FXAA which doesn't quite get rid of the aliasing.
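To illustrate the 'blurring the entire image' point: post-process AA works on the finished frame, guessing where edges are from contrast and blending neighbouring pixels there, so it can't reliably tell a geometry edge from fine texture detail. A very loose conceptual sketch follows - this is not the actual FXAA algorithm, just the general idea, and the threshold and blur are made up for illustration:

```python
# Loose conceptual sketch of post-process AA (FXAA-style), NOT the real
# FXAA algorithm: find high-contrast pixels in the finished frame from
# luma gradients, then blend them with their neighbours.
import numpy as np

def naive_post_aa(frame, threshold=0.1):
    """frame: (H, W, 3) float image in [0, 1]; returns a softened copy."""
    luma = frame @ np.array([0.299, 0.587, 0.114])            # perceived brightness
    gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))   # horizontal contrast
    gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))   # vertical contrast
    edges = (gx + gy) > threshold                              # crude "edge" mask

    # Crude 3x3 box blur via shifted copies (real FXAA blends along the
    # detected edge direction instead of blurring isotropically).
    acc = np.zeros_like(frame)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
    blurred = frame.copy()
    blurred[edges] = (acc / 9.0)[edges]  # texture detail gets caught too
    return blurred
```

Since the mask triggers on any high-contrast pixel, sharp texture detail gets softened along with the jaggies, which is the loss of sharpness described above.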

SMAA, again, is quite new and too rarely used in games nowadays, and at 4x it's pretty much level with 4x MSAA (although SMAA doesn't just combat aliasing, but can be set to do other things as well). All the while developers throw in ridiculous settings such as SSAA or similar just so the game can be marketed for 'extreme high-end PCs', while games with really low system requirements get crap settings like FXAA, AAA, MLAA...

Funny how that works.
 
Thanks - I've never even heard of that before. How does stacking them up remove aliasing if it's the same frame? Is there any benefit over SSAA in terms of quality?
 


Not an expert on the issue, so don't go quoting me on that - it's just something I've read online. I'm not familiar with the specifics of how it works; you'd have to ask a game dev, 3D graphics designer or the like. I haven't tested it or compared it against SSAA either. All I know is that it takes some pretty massive power from your GPUs to run it at acceptable framerates, with very few noticeable visual benefits.

In other words, I stay away from it.

 
From the sounds of it, I think I will too. MSAA + transparency MSAA is a near-perfect solution with a generally acceptable framerate hit. For more demanding games, high SMAA works very nicely, with the minor drawback of some warping/fuzziness on text. What's AAA?
 
Ahaha, I'm loving all the discussion here.

Anyways, Nikoli's answer interests me. 1440p on a 7950 and you're running everything near-max? See, it's this kind of information that confuses me, because the majority of other people here are saying the opposite. To be honest, I rarely use much AA. Almost every game I play I have set to 2x AA, rarely 4x. I just don't see much use in it. Only in the jaggiest of games do I notice the edges.

Anyways, the Korean monitors I would get have response times of around 5ms, and I'm not a huge multiplayer FPS player, so it doesn't really matter to me.

Unfortunately, I only have today and tomorrow to decide, because the return period on my monitor is only two weeks... and that was 13 days ago, yet I'm still conflicted. Many people say the 7950 with 1440p is a great combo, able to run games well, and others say it's not worth it. Also, my 7950 has 3GB of VRAM.
 


No no no - supersampling and ubersampling doubles/quadruples your native resolution and scales it back down to fit your screen. For me, 4xSSAA renders the game at 2880x1800, then scales it to my (tiny and lovable) 1440x900 screen. And it looks AWESOME. I fully recommend playing Mass Effect like this...
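A minimal sketch of that downscale step, assuming plain ordered-grid 2x supersampling with a box filter (real driver SSAA/SGSSAA uses smarter sample patterns and filters):

```python
# Minimal sketch of ordered-grid supersampling: render at 2x the resolution
# in each axis, then average every 2x2 block down to one output pixel.
import numpy as np

def downsample_2x2(frame_hi):
    """frame_hi: (2H, 2W, 3) float frame rendered at twice native resolution."""
    h2, w2, c = frame_hi.shape
    blocks = frame_hi.reshape(h2 // 2, 2, w2 // 2, 2, c)
    return blocks.mean(axis=(1, 3))  # average the 4 samples per output pixel

# Toy example: a hard black/white edge at high resolution comes out as
# intermediate grey steps at native resolution - that's the smoothing.
hi = np.zeros((4, 8, 3))
hi[:, 3:] = 1.0  # edge sits on an odd column, so one 2x2 block straddles it
print(downsample_2x2(hi)[..., 0])  # rows of [0.0, 0.5, 1.0, 1.0]
```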
 
You need some third-party utility to enable supersampling, right? No control panel option any more unless it's transparency supersampling? I remember cranking it up when playing the original Godfather (from 2006) a couple of years back. Plenty of framerate to spare, so I figured why not 🙂 It's true what they say though about polished turds.
 

Not sure if Nvidia took theirs out or not - theirs is SGSSAA. AMD has SSAA as an option in their driver and it works well. Game compatibility can be spotty, but give it a shot if you're only at 40% GPU usage while holding a constant 60fps.
 
Well, I'm playing Hitman Absolution at the moment so I'll pass 🙂 I'll keep it in mind though. Call of Juarez Gunfighter / Gunslinger / whatever should get over 150fps on my setup (it doesn't seem to be a very demanding game), so I just might give it a spin for that.
 
I know well how SSAA works, just not so certain about USAA, as it's so uncommonly used.

You can force SSAA on games that don't support it natively through the Nvidia Control Panel or AMD CCC by creating an application profile and overriding the in-game settings. As jessterman stated, compatibility can be hit and miss depending on the title. Make sure to turn off in-game AA before attempting this.

There are also SSAA/FXAA/SMAA injectors online. Some are for specific games, others for more general use. It's unofficial and sometimes experimental stuff, so compatibility varies across the board... In my experience Dishonored looked a lot better with an SMAA filter and forced 4x SSAA: no aliasing whatsoever, and it ran at a smooth 60 FPS even on a single HD 7970 at 1440p.