[SOLVED] 4k 60Hz to 4k 144....worth it?

Been on the fence for a while regarding the high-refresh craze going around. I greatly enjoy my 27" PG27AQ monitor, w/ G-Sync of course. That said, I've never used a high-refresh monitor up to this point. I'm not willing to spend in the $1500+ ballpark on the X27 or PG27UQ. Just too much for me to consider. I'm considering the XB273K to stay at 4K, but I'm not certain. Possibly a high-refresh 1440p, but it would have to be top-notch. I previously also had a 34" ultrawide (non-gaming type), but it just wasn't my thing. I guess what I'm asking is whether high refresh is really worth it. Has anyone made the 4K 60Hz to 4K 144Hz jump? Even 1440p 60Hz to 120Hz+ would work as a comparison at the same resolution. I appreciate any input on this. Thanks in advance.
 
Dunlop0078

Titan
Ambassador
I suppose that depends on the person. Personally, I would rather game on a 1080p 144Hz monitor than a 4K 60Hz one in really any game that would run at over 60fps. Between 1440p 144Hz and 4K 60Hz there is no question at all in my opinion; I would MUCH prefer 1440p 144Hz.

4K 144Hz monitors are extremely expensive, and even your 2080 Ti will struggle to get near 144fps at 4K in graphically demanding games. It's up to you whether that's worth it; in my opinion it's not. Maybe in a few years, when 4K 144Hz monitors get cheaper and GPUs get better.
 
Solution
Thanks for the input. I agree that my 2080 Ti, outside of older games or indie games, would be nowhere near 144 FPS at maxed settings. Though from my understanding it's really 120Hz on these monitors anyway: from what I've seen, going above 120Hz triggers chroma subsampling because of DisplayPort bandwidth limitations. I guess I'm trying to see whether it's a justified "future proofing" purchase for a few GPU generations, until cards can average in that refresh range.
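The subsampling point above can be sanity-checked with a quick back-of-envelope bandwidth estimate. This is a rough sketch, not a spec-exact calculation: the blanking totals below are assumed CVT-R2-style reduced-blanking values (real monitor timings vary), and DP 1.4's usable rate is taken as 25.92 Gbit/s (HBR3, 4 lanes, after 8b/10b encoding overhead).

```python
# Rough check of why 4K 144Hz needs chroma subsampling over DisplayPort 1.4.
# Blanking totals are assumptions (approximate CVT-R2 reduced blanking).

def required_gbps(h_active, v_active, refresh_hz, bits_per_pixel,
                  h_blank=160, v_blank=62):
    """Uncompressed video bandwidth in Gbit/s, including blanking intervals."""
    pixels_per_s = (h_active + h_blank) * (v_active + v_blank) * refresh_hz
    return pixels_per_s * bits_per_pixel / 1e9

DP14_EFFECTIVE_GBPS = 25.92  # HBR3 x 4 lanes, after 8b/10b line encoding

rgb_144 = required_gbps(3840, 2160, 144, 24)       # 8-bit RGB, full chroma
rgb_120 = required_gbps(3840, 2160, 120, 24)       # 8-bit RGB at 120Hz
ycbcr422_144 = required_gbps(3840, 2160, 144, 16)  # 4:2:2 halves chroma data

for name, gbps in [("4K 144Hz RGB", rgb_144),
                   ("4K 120Hz RGB", rgb_120),
                   ("4K 144Hz 4:2:2", ycbcr422_144)]:
    fits = "fits" if gbps <= DP14_EFFECTIVE_GBPS else "exceeds"
    print(f"{name}: {gbps:.1f} Gbps ({fits} DP 1.4's 25.92 Gbps)")
```

With these assumed timings, full 8-bit RGB at 4K comes in just under the DP 1.4 limit at 120Hz but over it at 144Hz, which lines up with why these panels drop to 4:2:2 subsampling above 120Hz.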