Ultrawide or 4k for gaming?

LucasMV

Prominent
Apr 14, 2017
These are the options:
-1440p 144Hz G-Sync (Acer XB271HU 27")
or
-4K 60Hz G-Sync (Acer Predator XB271HK 27")

So it was resolution vs Hz, and then I thought, why not have the best of both worlds? Lower the res a bit to get higher Hz and lower the Hz a bit to get higher res.

34" Ultrawide 3440x1440 100Hz G-sync.
-ASUS ROG Swift PG348Q
or
-Acer x34

It seems like 34" UW at 1440p isn't noticeably worse than 27" 4K, so resolution isn't compromised much, and I get a higher refresh rate in exchange; it's not 144Hz, but 100Hz with G-Sync is still great.

I'd be playing with a 1080 Ti. I mainly play sandbox games, FPS not so much.

My concern with UW is that some games don't support it.

What should I choose?
 
Solution
If you are going to wait a few months, you might want to wait a couple more.
The 100Hz on 3440x1440 and 60Hz on 4K are limits imposed by the current DP 1.2 and HDMI 2.0 standards.
When monitors move to DP 1.4 and HDMI 2.1, there will be higher refresh rates for those panels.
HDMI 2.1 was only recently announced, and while it packs very interesting features, it seems unlikely to reach real products within the next 8-12 months.
DP 1.4, on the other hand, packs fewer features but still supports higher refresh rates, and it already comes with your GPU. Monitors with that connection should appear later this year.
Your concerns about ultrawide are valid: relatively few games currently support that aspect ratio, and although most recent AAA releases are beginning to include it as standard, it's not quite ripe for purchase yet.
4K would be a better choice with the 1080 Ti.
What's the budget for your setup, and do you need headphones/peripherals/OS included?
 
If you pick the 4K 60Hz G-Sync now, you'd be stuck upgrading through a 1080 Ti, then an 1180 Ti, then a 2080 Ti...

In 20 months, when a new demanding game rolls out... at 1440p/ultra, dropping from 100 fps with G-Sync to 80 fps with G-Sync is barely noticeable.

At 4K/ultra, dropping from 60 fps to 40 fps is very noticeable.

4K/ultra at 60 fps is a better choice for sandbox RPGs; Chugalug_ is right.

But the point of going with an expensive monitor is to play at the ultra preset, and if in 20 months a new game rolls out and you're down to 40 fps, what can you do?

Expect to hand over $800 USD to Nvidia every 20 months.

Consider that both G-Sync monitors and Nvidia cards sell at a premium, which is why I have been following AMD's new Nova and FreeSync monitors.
 
4K on 27" monitor is pointless waste of electricity. 40" under 100cm viewing distance makes sense.
You will need a microscope to see the difference between 1440p and 4k at 70-100CM distance.
For a real gamer (not some wannabe) 60Hz is no go regardless of the resolution.
As for UW, they are nice, many games support it, worst case you will be playing with black bars - effectively 2560x1440.
When it works, it looks awesome.
Again, for hardcore (especially competitive) gamers, UW is no go as it's hard to see the minimaps and other things in corners without moving the eyes.
Bottom line:
Hardcore gamer - 1440p@144Hz
Gamer/productivity - 3440x1440@100Hz
Designer/creativity - 4K

P.S.
I have Samsung's 34" 100Hz UW and am quite happy with it, though it does not have G-Sync.
 
Well, you are special.
You can probably also see the difference between a QHD and an FHD 5" smartphone screen.
Normal people with 20/20 vision will not see the difference between 1080p and 4K on a 25" screen at ~1 meter viewing distance. As you can guess, 1440p pushes that threshold to larger screen sizes.
 
Thanks for all your answers so far. I'm inclined towards a UW 3440x1440@100Hz G-Sync monitor right now. Either the Acer x34 or the Asus ROG Swift PG348Q.

I plan not to upgrade this new build for at least 5-6 years, even if that means not playing at max settings at some point. So the fact that 3440x1440 is easier for the GPU to drive is a huge point. Sure, I could run most games at 4K 60fps ultra now with the 1080 Ti, but that's gonna change way before I'm ready to upgrade again.

So I can keep max settings at 3440x1440 longer than at 4K. Besides, comparing a 27" 4K against a 34" 3440x1440 UW, I've never seen them in person, but I'm pretty sure the difference can't be that big. And the higher refresh rate (100Hz) is nice too.
I've searched, and most games I play support 21:9; if a game doesn't support it natively, it looks like there's always a way around it. So that shouldn't be a big problem.
I've got time to change my mind though; I'll get the monitor a few months from now, gotta get the PC together first.
 

One small flaw there; we're talking 27" at about half a metre; who sits 1m away when using a desktop?
Also, given I have experience with both, that's a little better than speculation, don't you think?
 
If you are going to wait a few months, you might want to wait a couple more.
The 100Hz on 3440x1440 and 60Hz on 4K are limits imposed by the current DP 1.2 and HDMI 2.0 standards.
When monitors move to DP 1.4 and HDMI 2.1, there will be higher refresh rates for those panels.
HDMI 2.1 was only recently announced, and while it packs very interesting features, it seems unlikely to reach real products within the next 8-12 months.
DP 1.4, on the other hand, packs fewer features but still supports higher refresh rates, and it already comes with your GPU. Monitors with that connection should appear later this year.
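
For anyone curious roughly where those limits come from, here's a quick back-of-the-envelope check in Python. It only counts raw 8-bit RGB pixel data (no blanking overhead, no chroma subsampling or DSC), so treat the numbers as approximate; with blanking added, marginal cases like 3440x1440@144 also end up over the DP 1.2 link.

Code:
# Approximate effective link data rates in Gbit/s (after coding overhead).
LINKS_GBPS = {
    "HDMI 2.0":      14.40,
    "DP 1.2 (HBR2)": 17.28,
    "DP 1.4 (HBR3)": 25.92,
}

# (label, width, height, refresh rate)
MODES = [
    ("3440x1440 @ 100 Hz", 3440, 1440, 100),
    ("3440x1440 @ 144 Hz", 3440, 1440, 144),
    ("3840x2160 @  60 Hz", 3840, 2160,  60),
    ("3840x2160 @ 120 Hz", 3840, 2160, 120),
]

BITS_PER_PIXEL = 24  # 8-bit RGB, no HDR

for name, w, h, hz in MODES:
    gbps = w * h * hz * BITS_PER_PIXEL / 1e9  # raw pixel data only
    fits = ", ".join(k for k, cap in LINKS_GBPS.items() if gbps < cap)
    print(f"{name}: ~{gbps:.1f} Gbit/s raw (fits within: {fits or 'none'})")

4K@120 needs roughly 24 Gbit/s of pixel data alone, well past DP 1.2's ~17.3 Gbit/s, which is why the higher refresh rates have to wait for DP 1.4 / HDMI 2.1 monitors.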
 
Solution
AMD is rolling out Eclipse and Nova, so the prices of the 1080 and 1080 Ti could change later in June.

I think @Chugalug_ is right for now. But in 20 months, when you're playing a new AAA title and drop to 40 fps, having spent $800 on a monitor, would you settle for the High preset? Can you keep up with 1180 Ti or 2080 Ti upgrades?
 


That little flaw is addressed by mentioning resolutions and screen sizes.
Your subjective experience is confounded by different panels, maybe even different panel types, different color/contrast settings, etc. We've probably all had the experience of looking at displays of the same size and resolution side by side and perceiving them very differently.
Sitting 50cm away from the monitor, for me (and most people with a standard desk), means leaning over the desk (70-75cm standard depth). For me and most people I know who spend hours in front of a computer, 80-100cm is the standard distance.
And before you continue with subjective objectivity: the reason is that normal people want the monitor to stay within ~60 degrees of their field of view so they don't have to move their eyes too much, and 60 degrees is already "near peripheral" vision. At that distance, that works out to roughly a 27" monitor.
The next thing to look at is angular resolution.
With a 27" 2560x1440 panel at 1 meter (for simplicity) we get ~77 pixels per degree. Back to humans: 20/20 vision (6/6 in Europe) means resolving 1 pixel at 60 pixels per degree, while the average adult can identify 1 pixel at around 80 pixels per degree. 77 vs 80... huge difference.
Do you really need more than that for gaming? Or for anything besides the finest "creativity" work?
Maybe.
For productivity (I'm a software dev), screen real estate matters more - I use 3 monitors at work. For gaming, refresh rate/FPS for fluidity and lower input lag matter more.
4K gaming is for "blondes" (regardless of actual sex and hair color), the same way Apple's shiny toys are marketed as the most user friendly - which holds only if you're a blonde using it to open a single tab in Safari or Photoshop.

EDIT:
Forgot to mention that a 34" 3440x1440 panel works out to ~79.2 pixels per degree at a 1m viewing distance.
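
If you want to sanity-check those pixels-per-degree numbers yourself, here's a small sketch. The pixels_per_degree helper is just for illustration: it assumes a flat panel and a centered viewer, and divides the horizontal pixel count by the horizontal field of view, so slightly different assumptions will shift the results by a pixel or two per degree.

Code:
import math

def pixels_per_degree(diag_inch, h_px, v_px, distance_m):
    """Horizontal pixels divided by the horizontal viewing angle in degrees."""
    # Panel width derived from the diagonal and the aspect ratio.
    width_m = diag_inch * 0.0254 * h_px / math.hypot(h_px, v_px)
    # Total horizontal field of view for a viewer centered on the screen.
    fov_deg = math.degrees(2 * math.atan(width_m / 2 / distance_m))
    return h_px / fov_deg

print(pixels_per_degree(27, 2560, 1440, 1.0))  # ~77 ppd (27" 1440p)
print(pixels_per_degree(34, 3440, 1440, 1.0))  # ~79 ppd (34" ultrawide)
print(pixels_per_degree(27, 3840, 2160, 1.0))  # ~115 ppd (27" 4K)

The 27" 4K case lands around 115 pixels per degree at 1m, well beyond the ~80 ppd threshold cited above, which is the basis of the "wasted on 27 inches" argument.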