Hi gurus. How much harder would it be to power a 4K monitor than a 1080p monitor? Also, is 4K harder to drive than 144 Hz?

Politicaleft

Distinguished
Sep 15, 2014
I have a 4K screen and I am looking to get either three 1440p monitors at 144 Hz or a single 1440p at 144 Hz, but I'm wondering whether I'd have the juice to power it reasonably. I love the crispness of 4K and do not want to downgrade.


Thanks guys


I have

R9 290s in CrossFire
i5-2500K @ 4.5 GHz
 

Entomber

Admirable
It's simple math.

A 4K screen has 4 times as many pixels as a 1080p screen, so it'll be 4 times harder to render a single frame.

3 x 1440p has even MORE pixels than a single 4K screen, so rendering a frame across all three (for gaming) is harder than even 4K.

A single 1440p monitor at 144 Hz requires a similar amount of processing power as a 4K monitor at 60 Hz, since again we just math out the number of pixels required per identical time frame.

tl;dr
4K@60FPS ~ 1440p@144FPS
1x 1440p < 4K < 3x 1440p
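The tl;dr above is easy to verify yourself. A quick sketch, assuming the standard resolutions (1080p = 1920x1080, 1440p = 2560x1440, 4K UHD = 3840x2160):

```python
# Pixel counts for the assumed standard resolutions.
px_1080 = 1920 * 1080   # 2,073,600 pixels
px_1440 = 2560 * 1440   # 3,686,400 pixels
px_4k   = 3840 * 2160   # 8,294,400 pixels

print(px_4k / px_1080)  # 4.0  -> 4K has 4x the pixels of 1080p
print(px_4k / px_1440)  # 2.25 -> 4K has 2.25x the pixels of 1440p

# Ordering from the tl;dr: 1x 1440p < 4K < 3x 1440p
print(px_1440 < px_4k < 3 * px_1440)  # True
```

So three 1440p screens (about 11 million pixels) are indeed heavier to render than one 4K screen (about 8.3 million).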
 

N1KKII

Reputable
Jan 10, 2015
For gaming, go with a single monitor! Much better experience! With one screen at 4K you'd get approx. 50 fps at ultra quality (of course it depends on the game).

With a multi-monitor setup, the extra pixels eat a lot of GPU power = much lower FPS.


My suggestion is to go with a single monitor!
 

chenw

Honorable
4K is 4 times the size of 1080p (it's twice the width and twice the height, so 4x the number of pixels), and 2.25x 1440p (1.5x per side). So, assuming render cost scales purely with pixel count, you are looking at 1/4 the fps at 4K compared to 1080p, and 4/9 the fps compared to 1440p.

It would be slightly harder to power a game at 1440p @ 144 fps than at 4K @ 60 fps if counting only pixels and not including any hardware bottlenecks that may occur, though not by much. However, 1440p @ 72 fps is much more playable than 4K @ 30 fps, for example.

Also, it would be much easier for a GPU to render 1080p @ 144 Hz than 4K at the same rate, as 1080p @ 144 fps costs the same as 4K @ 36 fps, again ignoring any hardware bottlenecks in the system.
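The comparisons above can be sketched as a pixels-per-second calculation. This is a rough model: it assumes render cost scales linearly with pixel count and ignores CPU/driver bottlenecks, as noted above.

```python
# Pixels the GPU must shade per second for a given resolution and frame rate.
def pixels_per_second(width, height, fps):
    return width * height * fps

pps_1440_144 = pixels_per_second(2560, 1440, 144)  # 530,841,600 px/s
pps_4k_60    = pixels_per_second(3840, 2160, 60)   # 497,664,000 px/s
pps_1080_144 = pixels_per_second(1920, 1080, 144)  # 298,598,400 px/s

# 1440p @ 144 fps is slightly more demanding than 4K @ 60 fps, but not by much:
print(pps_1440_144 / pps_4k_60)  # ~1.067

# 1080p @ 144 fps costs exactly the same as 4K @ 36 fps:
print(pps_1080_144 == pixels_per_second(3840, 2160, 36))  # True
```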
 
Solution