Why do people game at higher resolutions like 2K, 4K, 8K? I mean, what's the point? My question is for hardcore FPS gamers.

ambarish kumar

I am primarily an FPS gamer. Why do people play such titles at 4K or 8K? My question is for hardcore FPS gamers who play CS:GO, BF, CoD, PUBG, etc. Do you play at 2K, 4K, or 8K?
If you got your hands on a 1080p 240Hz 1ms monitor and a 4K 60Hz 1ms monitor, which would you prefer, provided you had a GPU that could handle both?
 
Solution
Resolution has less to do with fps (even though it affects it) and more to do with being able to see more detail without a screen-door-like effect (visible pixels when you get close).
The panel refresh rate (Hz) is NOT fps.
Higher Hz means you will see a less blurry image when there is movement, which is why most good TVs run at 120Hz or higher.
Higher fps in games is mainly about lowering input lag (the time between pressing a mouse/keyboard button and seeing the result on screen), which matters most for PvP/competitive gaming.
For gaming you're better off with a G-Sync screen, as it can match the screen's refresh rate to the GPU, so you won't experience stutter when dropping below the screen's Hz, or tearing when going above it.
It also allows for higher-fps gaming when the card can't keep the fps steady.
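To put rough numbers on the input-lag point, here is a minimal sketch (plain Python; the fps values are just common monitor tiers, and real input lag adds mouse polling, game logic, and display processing on top of this) of how the per-frame delay shrinks as fps rises:

```python
# Frame time is the floor on how stale a rendered frame can be;
# everything else in the input chain adds latency on top of it.
def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 144, 240):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# Output:
#  30 fps -> 33.33 ms per frame
#  60 fps -> 16.67 ms per frame
# 144 fps ->  6.94 ms per frame
# 240 fps ->  4.17 ms per frame
```

Going from 60 to 240 fps trims roughly 12.5 ms off the per-frame delay, which is exactly the margin competitive players chase.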
For mainstream gaming I always recommend going for 1080p G-Sync, or at most a 1440p G-Sync + GTX 1080 Ti combo (not suitable for a gaming setup with a GPU below GTX 1080 specs). For now, 4K monitors are better suited for production than for gaming. For gaming on a 4K monitor it is recommended to have a GTX 1080 Ti in SLI to get the majority of demanding games above the 60fps mark.

1080p is best for now.
1440p is also decent.
 
lol.
See if it's fun to play on a screen at 27" or more with 1080p res, and you will see you're (almost) completely wrong,
and even on my (production-grade) 32" monitor, 1440p is only acceptable if the viewing distance is 2ft or more. A quick pixel-density calculation (below) shows why.
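For anyone who wants to check the pixel-density claim, here is a minimal sketch (plain Python; the size/resolution pairs are just the examples from this thread) of the standard PPI formula: diagonal resolution in pixels divided by diagonal size in inches:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: pixel diagonal over physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for label, w, h, d in [
    ('27" 1080p', 1920, 1080, 27.0),
    ('32" 1440p', 2560, 1440, 32.0),
    ('27" 4K',    3840, 2160, 27.0),
]:
    print(f"{label}: {ppi(w, h, d):.0f} PPI")

# Output:
# 27" 1080p: 82 PPI
# 32" 1440p: 92 PPI
# 27" 4K: 163 PPI
```

As a rough rule of thumb (an assumption, not a measured threshold), panels below roughly 90-100 PPI start to show individual pixels at normal desk distances, which is the screen-door effect mentioned in the solution.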

And there are enough screens with 4K res and G-Sync, which renders the need for an SLI setup and/or 60+ fps moot,
as the screen syncs with the GPU and it won't matter if you get 30 or 90fps (experience-wise). The sketch below shows the frame-pacing difference.
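To illustrate why adaptive sync smooths things out when fps drops below the panel's refresh rate, here is a minimal sketch (plain Python; 45 fps on a 60 Hz panel is an arbitrary illustrative case) comparing when frames reach the screen with a fixed refresh versus a refresh that follows the GPU:

```python
import math

# A 45 fps game on a fixed 60 Hz panel (vsync) vs. an
# adaptive-sync panel that refreshes when the frame is ready.
FRAME_MS = 1000 / 45     # new frame rendered every ~22.2 ms
REFRESH_MS = 1000 / 60   # fixed panel tick every ~16.7 ms

done = [i * FRAME_MS for i in range(1, 6)]  # render-complete times

# Fixed refresh: each frame waits for the next panel tick.
fixed = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in done]

def gaps(times):
    """Time between consecutive frames appearing on screen."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz gaps:", gaps(fixed))   # uneven pacing -> judder
print("adaptive gaps:   ", gaps(done))    # even pacing

# Output:
# fixed 60 Hz gaps: [16.7, 16.7, 33.3, 16.7]
# adaptive gaps:    [22.2, 22.2, 22.2, 22.2]
```

The fixed-refresh panel alternates between short and long frame gaps (that alternation is the stutter), while the adaptive-sync panel shows every frame for the same duration.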
 


I never said anything about a 27" 1080p monitor or a 32" 1440p monitor. The reason I consider 1080p the best and 1440p decent is the GPU power available on the market to handle them. At 1080p, a GTX 1080 Ti combo will get you 100+ fps in the majority of games at ultra settings, which is the best performance. 1440p is also decent, as the majority of games can be played at 70+ fps at ultra settings.

The majority of titles are unable to hold 60+ fps at 4K ultra settings. In that case we have to bring the settings down to make the game playable, which is a compromise when you have paid good money to get the best performance. At 4K, SLI is highly recommended to keep fps above 60 in intensive games, but there are still some games that cannot even hit 60fps with SLI unless we compromise on settings heavily. For now there is no GPU on the market powerful enough to run 4K ultra on the majority of titles; the raw pixel counts below show why.
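To make that concrete, here is a minimal sketch (plain Python; the linear-scaling comparison at the end is a simplifying assumption, since real performance also depends on CPU and memory limits) of how many pixels each resolution asks the GPU to shade per frame:

```python
# Pixels per frame at each common resolution, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p pixel count as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / base:.2f}x 1080p)")

# Output:
# 1080p: 2.07 MP (1.00x 1080p)
# 1440p: 3.69 MP (1.78x 1080p)
# 4K: 8.29 MP (4.00x 1080p)
```

If GPU load scaled purely with pixel count, a card holding 100 fps at 1080p would land near 25 fps at 4K; in practice the drop is less brutal than that, but it shows why even a 1080 Ti struggles to hold 60 fps at 4K ultra.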

As NVIDIA is launching BFGDs (Big Format Gaming Displays), I expect them to release a GPU powerful enough to drive them. I have high expectations for the Turing lineup; I hope NVIDIA doesn't disappoint its customers by releasing under-powered and overpriced GPUs.
 
And that's the reason why I said lol.

Not connecting the resolution to a screen size is the first problem, and generically setting games to ultra is almost always a waste of performance, power, and money as well.

R6 Siege is the best example, where you won't gain ANYTHING in IQ (read: image quality) by running half the settings above high/very high, but it can cost up to 30% in fps. Setting the game to ultra makes it unplayable at 4K without a Ti and G-Sync; balancing the settings makes it possible for a 1080 to run it, and ultra won't look any better than those optimized settings.

So generically saying which res is best, or "recommending" cranking the settings to ultra (disregarding the game, etc.), isn't something I expect to be posted by a moni master...

Look at the guide if you need some "outside" proof.
https://www.geforce.com/whats-new/guides/tom-clancys-rainbow-six-siege-graphics-and-performance-guide#tom-clancys-rainbow-six-siege-lod-quality
 
There are games that scale well with the settings. Turned up to ultra they look exceptionally good, and after playing at ultra, people are unable to go back down to very-high settings because the detail drops with it.

For example, try playing Horizon 3 maxed out at ultra, then try playing it a notch down: you can notice the difference easily. Once you have played at ultra, going down feels like losing the game's potential. Many other famous, best-selling titles have the same problem. That is the reason I prefer 1080p or 1440p over 4K.

If the loss of detail is so clearly visible at 1080p, at 4K it becomes even worse.