1050 Ti and 1600p monitor

imrazor

Distinguished
I've just found a great deal on an older 30" 1600p DVI monitor. The problem is that my current gaming card is a 1050 Ti. It overclocks nicely up into the 1800 MHz range, but I think it'll still have problems pushing that many pixels. So I guess the question I need to ask is what can I do until I can afford to upgrade my GPU? Should I lower game settings to Low (mostly playing Fallout 4 ATM)? Or would it be a better idea to lower the resolution? Would 1280x800 or 1920x1200 be a better choice?
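For a rough sense of how much less work each of those resolutions would be for the card, here's a quick back-of-envelope pixel-count comparison (just a sketch; the actual fps gain depends on the game and where the bottleneck is):

```python
# Back-of-envelope pixel counts for the resolutions under consideration.
# Fewer pixels generally means less GPU work, but real fps scaling also
# depends on the game, settings, and CPU load.
native = 2560 * 1600

candidates = {
    "2560x1600 (native)": (2560, 1600),
    "1920x1200": (1920, 1200),
    "1280x800": (1280, 800),
}

for name, (w, h) in candidates.items():
    px = w * h
    print(f"{name}: {px:,} pixels (~{px / native:.0%} of native)")
```

1920x1200 is a little over half the pixels of native, and 1280x800 is exactly a quarter, so either one takes a big chunk of load off the GPU.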
 
panathas

I have a 32" 1440p monitor (mainly for work) paired with a GTX 960 GPU. I am a casual gamer, and when I start a new game at the monitor's native resolution, I usually play with the detail settings. If I'm not happy with the result, I'll lower the resolution. Surprisingly, playing at 1080p on such a big monitor isn't as horrible as I had initially expected. Yes, image clarity decreases, but not by much, and you hardly notice it during gaming.

That said, if you are a hardcore gamer playing at the highest detail settings or resolutions, your experience will not be the same. But seeing that you only have a 1050 Ti, I don't think you're in that category, and you'll be fine. So I'd say go with it. Good luck.
 

imrazor

Distinguished
So I figured out how to simulate 1440p with DSR. Fallout 4 was surprisingly playable. Average FPS was about 45-50, with very infrequent dips to 35. However, there was occasional stuttering. I don't have to game at 60 fps (45 fps is what I consider playable), but I can't stand stuttering. If I drop the settings to Medium, I might be able to hit 60 fps and hopefully eliminate the stutter.

panathas, I know you said 1080p was tolerable on a 1440p display, but would 1280x800 with aggressive anti-aliasing scale better on a 2560x1600 monitor?
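Part of my thinking here, for what it's worth: 1280x800 is exactly half of 2560x1600 in each dimension, so in principle every rendered pixel maps to a clean 2x2 block of physical pixels, while 1920x1200 forces fractional interpolation. A quick sketch of the scale factors (this assumes nothing about how a given monitor's scaler actually handles them):

```python
# Scale factor from each candidate resolution up to the 2560x1600 panel.
# An integer factor (exactly 2.0) means pixels could map to whole 2x2
# blocks; fractional factors require the scaler to interpolate, which
# usually looks softer.
native_w, native_h = 2560, 1600

for w, h in [(1280, 800), (1920, 1200)]:
    sx, sy = native_w / w, native_h / h
    kind = "integer" if sx.is_integer() and sy.is_integer() else "fractional"
    print(f"{w}x{h}: scale {sx:.2f} x {sy:.2f} ({kind})")
```

Whether that translates into a visibly sharper picture depends on whether the monitor's scaler takes advantage of it, which is exactly the kind of thing a review or a hands-on test would show.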
 


panathas

It depends on the monitor and how good it is at interpolating various resolutions. Some are better than others. You'd have to search for an online review of that specific monitor. If you get it, you should test it at each available resolution and come to a conclusion about what works best for you (picture quality vs. speed). For example, when I got my monitor I thought that if I lowered the resolution to 1080p the picture would look horrible and my best bet would be 720p (I am a casual gamer after all and I mostly play older games). To my surprise, the opposite was true. In fact, I recently played a game whose max resolution was 1080p, tried it on both of my monitors (one 1440p and one 1080p), and saw almost no difference between them. Finally, don't forget that a high-res monitor will last through many years and upgrades, and you can always get a better GPU down the line. Good luck.
 

imrazor

Distinguished
This monitor was one of the first 30" monitors, so it's already rather old. Fortunately there are no dead pixels, and it's still blindingly bright. It's a Dell UltraSharp 3007WFP-HC 30" 2560x1600 IPS display. Not ideal for gaming with an 8ms response time, but I'm mostly playing RPGs and turn-based strategy games anyway.

Fallout 4 will run at High settings @ 40-45 fps if I drop from Ultra to High textures, switch AA to FXAA, and turn off god rays. I'm still dropping to 35 fps in some areas, but it's quite playable.

I tried Ultra at 1680x1050 and got 60 fps in most spots, though it still occasionally dipped to 35 fps in a few complex areas. Even though the dips were the same as at 1600p @ High, it still felt smoother. However, there was a persistent impression that things were slightly out of focus and blurry. And every once in a while I'd notice jaggies around a character's face. So I think I'll stick with 1600p, though I may try dropping the settings to Medium for smoother gameplay.
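That result is roughly what the pixel math predicts: 1680x1050 is only about 43% of the pixels of 2560x1600, so if the game is mostly GPU-bound, 60 fps at Ultra there works out to somewhere in the mid-20s at native resolution. A very rough sketch (it assumes fps scales linearly with pixel count, which real games only approximate):

```python
# Very rough estimate of Ultra-at-native performance, assuming the game
# is fully GPU-bound and fps scales inversely with pixel count.  Treat
# this as a ballpark figure, not a prediction.
measured_fps = 60           # Ultra at 1680x1050
measured_px = 1680 * 1050   # 1,764,000 pixels
native_px = 2560 * 1600     # 4,096,000 pixels

estimate = measured_fps * measured_px / native_px
print(f"Estimated Ultra fps at 2560x1600: ~{estimate:.0f}")  # ~26 fps
```

Which is consistent with needing to drop to High to stay in the 40-45 range at native.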

Gaming on a 30" monitor is quite an experience. Next paycheck I may invest in a GTX 1060 to help even out the framerate.