Can a monitor restrict your gaming performance?

joseph9012

Commendable
Apr 14, 2016
If I get a 1440p or 1080p monitor with 60 Hz G-Sync, paired with a GTX 1070 and an i7-6700K, will performance be restricted?

1. Will it be able to get over 60 fps, or do I need 144 Hz?

2. Is this setup really overpowered?

3. To have a killer PC work at its best, do you need a killer monitor?

4. Will I notice a big difference if I downgrade from a Retina 2K MacBook Pro to 1080p?

Thanks. I have looked around but can't find an answer.
 
Solution

1. Hz = refresh rate. A 144 Hz monitor can display up to 144 fps; a 60 Hz monitor can only show a maximum of 60 fps. Your system can still produce 100+ fps on a 60 Hz panel, but you will experience screen tearing, and the GPU will be generating unnecessary heat rendering frames the monitor can never show. That is where V-sync comes in :)

2. G-Sync is great, but at 1080p 60 Hz you do not need it. At 1440p it could be of benefit, as it removes screen tearing, reduces input lag, and just makes everything look sexier. It is most ideal with a 144 Hz monitor.

3. No, you don't. There is technology within GPUs that lets you render above your monitor's resolution, e.g. Nvidia's Dynamic Super Resolution (DSR). You can enable it in the Nvidia Control Panel and play games at 4K resolution on your 1080p 60 Hz monitor. It does not mean you are playing at native 4K: your GPU renders everything as if it were 4K and then intelligently shrinks it down, giving you a noticeably sharper image on your screen. NOTE: this is not the same as native 4K.
But if you are considering some serious gaming, in my opinion it is becoming the norm to get a 1440p monitor, a 4K monitor, or a 120/144/165/200 Hz monitor, as each one gives you a more enhanced gaming experience than a standard 1080p 60 Hz monitor.

4. 2K resolution is 2048×1080 natively, while a 1080p monitor is 1920×1080, meaning:
2K (2048×1080): 2,211,840 pixels on your screen
1080p (1920×1080): 2,073,600 pixels on your screen
1440p (2560×1440): 3,686,400 pixels on your screen
4K (3840×2160): 8,294,400 pixels on your screen

As you can see, the difference between 2K and 1080p is minimal (about 7% more pixels), so you won't notice any real difference in my opinion! A quick sanity check of the arithmetic is sketched below.
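If you want to verify those pixel counts yourself, here is a minimal Python sketch (nothing vendor-specific, just the standard resolutions assumed above):

```python
# Pixel counts for each resolution (width x height)
resolutions = {
    "2K (DCI)": (2048, 1080),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# How much bigger is 2K than 1080p? Only about 6.7%.
diff = (pixels["2K (DCI)"] - pixels["1080p"]) / pixels["1080p"]
print(f"2K vs 1080p: {diff:.1%} more pixels")
```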
But remember, as you increase resolution, the amount of data the GPU has to push to the screen goes up. Here is the formula:

Pixels × refresh rate × color bit depth

Bear in mind for the data below: 1,000,000,000 bits = 125 megabytes.
1080p (60 Hz) = 2,073,600 × 60 × 32 = 3,981,312,000 bits of data a second
1080p (144 Hz) = 2,073,600 × 144 × 32 = 9,555,148,800 bits of data a second
1440p (60 Hz) = 3,686,400 × 60 × 32 = 7,077,888,000 bits of data a second
1440p (144 Hz) = 3,686,400 × 144 × 32 = 16,986,931,200 bits of data a second
4K (60 Hz) = 8,294,400 × 60 × 32 = 15,925,248,000 bits of data a second
4K (144 Hz) = 8,294,400 × 144 × 32 = 38,220,595,200 bits of data a second

This is just the data needed to display the screen; for context, the GTX 1070 has a memory clock of 8 Gbps. So it is all up to you, but I hope that table helps you choose the better trade-off. If you want to recompute the numbers for other setups, there is a quick sketch below.
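Here is a minimal Python sketch of the same formula, so you can plug in other monitors. The 32-bit color depth and the 60/144 Hz refresh rates are just the assumptions used in the table above:

```python
# Display data rate = pixels × refresh rate × color bit depth
BIT_DEPTH = 32  # bits per pixel, as assumed in the table above

modes = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
}

for name, pixels in modes.items():
    for hz in (60, 144):
        bits_per_sec = pixels * hz * BIT_DEPTH
        # 8 bits = 1 byte, so divide by 8,000,000 for megabytes per second
        mb_per_sec = bits_per_sec / 8 / 1_000_000
        print(f"{name} @ {hz} Hz: {bits_per_sec:,} bits/s ({mb_per_sec:,.0f} MB/s)")
```

For example, 1080p at 60 Hz works out to 3,981,312,000 bits/s, or roughly 498 MB/s, matching the first row of the table.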
 