Question: Is the jump from 1080p to 1440p really worth it now that I can run it reliably?

Nov 9, 2020
I've upgraded pretty much everything in my PC over the past two years, and I'm now running a 2700X, an EVGA 1080 Ti SC2, and 32GB of 3200MHz RAM. I've had two matching monitors for a while now; both are Samsung curved 27-inch monitors. One is 60Hz, used for Discord and whatever else I have going on the side. The other is 144Hz with 1ms response time, used mostly for games.

After coming from a 1060, I feel like I can now play games decently at 1440p with moderate framerates. I'd love to find something close to my 144Hz 27-inch curved monitor (used for competitive games), but I'm not sure Samsung makes anything like that at 1440p.

I don't have the most money, so I was debating whether I should just get something cheaper, like a 75Hz 1440p monitor, if that's even a thing. It would mostly be used for visually intense story games that don't rely heavily on FPS. Of course I'd have to save up for something like this, so my budget would only be around $300-350.

But the real question here is: can you even tell the difference between the two that well at just 27 inches? Or would I be better off getting a 4K 60-75Hz monitor?
 

Colif

Win 10 Master
Moderator
I just swapped from 4K 60Hz to 1440p 144Hz, and although my GPU (RTX 2070 Super) can only max out around 100fps in games, it gives me room to grow; maybe my next GPU can run it at 144. The main difference I noticed is how much smoother the mouse is now. The desktop runs at 143.996Hz according to a refresh-rate test site, which is close enough for me.

4K has 2.25 times the pixel count of 1440p, so if you only lower the refresh rate, you're not reducing the work the GPU has to do in the same amount of time by as much as you'd think. That's the main reason I swapped to 1440p: I figured it's less work per frame for the GPU... then I doubled the refresh rate to even it out.

Text at 4K is really hard to read on the desktop at 100%, which is why Windows 10 defaults to 150% scaling, meaning you're looking at an effective 1440p-sized layout most of the time anyway. I just stopped fighting it and let the panel run at native resolution. Some apps ignore scaling, notice you have a 4K display, and render at full size; luckily there aren't many of them. Logitech Gaming Software was one such app, and you needed a microscope to read its menus.
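For anyone curious, the pixel math behind the swap works out like this (a quick sketch, assuming the standard 3840x2160 and 2560x1440 resolutions):

```python
# Pixel-count math for the 4K vs 1440p comparison above.
uhd = 3840 * 2160      # 4K UHD:  8,294,400 pixels
qhd = 2560 * 1440      # 1440p:   3,686,400 pixels

print(uhd / qhd)       # 2.25 -> 4K is 2.25x the pixels of 1440p

# Pixels pushed per second at each refresh rate:
print(uhd * 60)        # 4K @ 60Hz     -> 497,664,000 px/s
print(qhd * 144)       # 1440p @ 144Hz -> 530,841,600 px/s (slightly more!)

# Windows 150% scaling on a 4K desktop gives an effective layout of:
print(3840 / 1.5, 2160 / 1.5)   # 2560.0 1440.0 -> same layout as 1440p
```

So 1440p at 144Hz is actually about the same per-second workload as 4K at 60Hz; the win is that each individual frame is cheaper, so a mid-range GPU can hold higher framerates.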
 
At 27” I always regretted my old 1080p monitor. While it was OK for gaming, anything with text just wasn't sharp, and I found it annoying. I went 1440p 27” a year ago and it's a massive difference for text or working in Excel. It's also a noticeable improvement for games, but not as great as for text. However, 1440p 144Hz is very GPU-heavy. I originally had a 2080 Super and was underwhelmed with its performance: in some games, to hit 120fps I needed medium settings with RT off. I was lucky and got a 3080 on release and have been much happier. Even harder-to-run games can do over 120fps at high/ultra settings with RT on, and that's a big improvement.
 

Colif

Win 10 Master
Moderator
My PC is only 3 months old; I can't justify spending the same amount I paid for the 2070 Super so soon. I can wait, knowing whatever I buy next time will run it better. It's the same reason I don't want to run a 5GHz CPU: it leaves me no way to improve with the next one.
 
Understandable. A lot comes down to expectations and the games you want to run. I was aiming for higher settings and FPS than my original GPU could manage in some games. But if you're OK with compromising on settings and FPS in those games, it can do a good job.
 
