[SOLVED] Is a 720p monitor good enough for competitive gaming?

Nov 4, 2019
I have a 720p monitor; it's old but works. However, I play games like Rainbow Six Siege, where it's quite competitive and being able to spot and see enemies clearly at mid-to-long range is crucial, especially enemies that camouflage behind desks. Most enemies at long range are just pixels on my screen (not literally, but they're quite blurry). So does that mean I should upgrade to 1080p? I don't really care about visuals in terms of eye candy; I want the best from a competitive standpoint. I've heard some CS pros use 720p monitors as opposed to 1080p because the image is more stretched, so enemies can be seen better, but honestly maybe that's a thing of the past. Before I splash my cash on a brand new 1080p monitor, I'd like to know some info regarding this.
 
I would think it would be harder to see enemies at lower resolutions due to the significant loss of detail compared to 1080p. You can get high-refresh-rate 1080p monitors fairly cheap these days, as 1440p is becoming the new standard.

I know a few years ago some competitive gamers were still using older CRT monitors at 720p, but that was mainly due to refresh-rate limitations on LCD monitors, which have been worked out for the most part.
 
Nov 4, 2019
Are they actually blurry, or are they more like pixelated but difficult to distinguish from other objects?

Blurry vs pixelated are two distinct issues.

Do you feel you're not playing well because of the low resolution?
Well, pixelated to the point that their head looks like an object on the table, which would make me think there's nobody there. Not really blurry, I don't think.
 
Nov 4, 2019
I would think it would be harder to see enemies at lower resolutions due to the significant loss of detail compared to 1080p. You can get high-refresh-rate 1080p monitors fairly cheap these days, as 1440p is becoming the new standard.

I know a few years ago some competitive gamers were still using older CRT monitors at 720p, but that was mainly due to refresh-rate limitations on LCD monitors, which have been worked out for the most part.
Yeah, I agree with you. Thanks for replying <3
 

IhaveworstPc

1080p will give you a better angle and better clarity for sniping and spotting things. Don't go for a 2K or 4K monitor - it's too costly, and you'd need an RTX 2080 or RTX 2070 to keep the frame rates up. Get a 1080p 240Hz monitor according to your graphics card. If you have an Nvidia card, buy a G-Sync monitor; if you have an AMD card, buy a FreeSync monitor. What matters most in competitive games is a high refresh rate, so get 240Hz. And you need a good card to get that FPS.

I'm telling you from my experience: I play on a 60Hz monitor, but when I played on a 240Hz monitor on a friend's PC with an RTX 2070, the in-game smoothness felt four times better, and it makes the game feel really slow-paced compared to 60Hz. Just look at a comparison between 60Hz and 240Hz. I usually get 3-4 kills per match, but that 240Hz smoothness will get you 12 kills per match even when you haven't played in months.
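
Just as rough math (my own back-of-the-envelope numbers, not a benchmark): at 60Hz you get a new frame every 1/60 s ≈ 16.7 ms, and at 240Hz every 1/240 s ≈ 4.2 ms, so the time between frames really is about four times shorter. Whether that turns into four times the kills is another matter.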

Try considering my opinion, because I've felt the difference myself. I'm also planning on buying a 240Hz 1080p monitor.


All the streamers play on 1080p 240Hz (or 2K and 4K 240Hz, which is unaffordable for normal gamers), and so they get wins like a charm. It's all about your FPS and your monitor's refresh rate. You'll see most streamers don't play at max settings, so that they can keep a smooth 240 FPS on the monitor.
 
Don't forget refresh rate and response time. Your 720p resolution may not be a handicap, but your (assumed) 60Hz refresh rate and unknown pixel response time certainly would be.

I'm a firm believer that you'd have to be pretty high up in the professional circuits to take advantage of anything above 165Hz. But a 1080p 144Hz variable-refresh-rate monitor would be just what the doctor ordered.

Nvidia GPUs (GTX 10-series and newer) have been able to run variable refresh on FreeSync monitors since January 2019. There's no point in buying true G-Sync anymore (not to be confused with "G-Sync Compatible").
 
Solution

King_V

Get a 1080p 240Hz monitor according to your graphics card. If you have an Nvidia card, buy a G-Sync monitor; if you have an AMD card, buy a FreeSync monitor. What matters most in competitive games is a high refresh rate, so get 240Hz. And you need a good card to get that FPS.

Absolutely not - this is bad advice, for two reasons:
  1. Nvidia's 10-, 16-, and 20-series cards all support FreeSync. G-Sync costs extra and only ever worked with Nvidia cards. Nvidia eventually saw the light and started letting their current cards work with FreeSync. There is NO REASON to spend extra money on a G-Sync monitor today to get adaptive sync.
  2. The human eye cannot distinguish 240Hz, and the human body can't react that fast. I would be very surprised to find that the human eye can perceive much above 100Hz, if that much. The real reason to get a 144Hz monitor today is that it usually costs just about the same as a 75Hz monitor would.

I'll believe that people can tell the difference between the various 120Hz-and-up refresh rates when a double-blind study is done to prove it.
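
To put rough numbers on the diminishing returns: the jump from 60Hz to 144Hz cuts the time between frames from about 16.7 ms to about 6.9 ms, a gain of nearly 10 ms, while the jump from 144Hz to 240Hz only shaves off another 2.8 ms or so.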
 
Nov 4, 2019
Absolutely not - this is bad advice, for two reasons:
  1. Nvidia's 10-, 16-, and 20-series cards all support FreeSync. G-Sync costs extra and only ever worked with Nvidia cards. Nvidia eventually saw the light and started letting their current cards work with FreeSync. There is NO REASON to spend extra money on a G-Sync monitor today to get adaptive sync.
  2. The human eye cannot distinguish 240Hz, and the human body can't react that fast. I would be very surprised to find that the human eye can perceive much above 100Hz, if that much. The real reason to get a 144Hz monitor today is that it usually costs just about the same as a 75Hz monitor would.
I'll believe that people can tell the difference between the various 120Hz-and-up refresh rates when a double-blind study is done to prove it.
Thanks dude
 
Nov 4, 2019
Don't forget refresh rate and response time. Your 720p resolution may not be a handicap, but your (assumed) 60Hz refresh rate and unknown pixel response time certainly would be.

I'm a firm believer that you'd have to be pretty high up in the professional circuits to take advantage of anything above 165Hz. But a 1080p 144Hz variable-refresh-rate monitor would be just what the doctor ordered.

Nvidia GPUs (GTX 10-series and newer) have been able to run variable refresh on FreeSync monitors since January 2019. There's no point in buying true G-Sync anymore (not to be confused with "G-Sync Compatible").
Thanks a lot, I really should invest in a 1080p 144Hz. Does a higher-refresh-rate monitor put more load on the GPU?
 
The GPU outputs frames as fast as it can, regardless of whether it's hooked up to a 60Hz monitor or a 1,000,000Hz monitor.

Higher RESOLUTION will lower frame rates at the same game quality settings, though (if you're not already CPU-limited, which you probably are at 720p).
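
For scale, just the raw pixel math: 1280×720 is about 0.92 million pixels per frame, while 1920×1080 is about 2.07 million, so at 1080p the GPU has to shade roughly 2.25x as many pixels every frame. That's why a resolution bump costs frame rate while a refresh-rate bump, by itself, doesn't add per-frame work.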
 

King_V

Thanks a lot, I really should invest in a 1080p 144Hz. Does a higher-refresh-rate monitor put more load on the GPU?

For a given resolution and level of detail, yes, a higher refresh rate (and thus a higher number of frames being output) does put more load on the GPU.

The GPU outputs frames as fast as it can, regardless of whether it's hooked up to a 60Hz monitor or a 1,000,000Hz monitor.

Well, yes and no. Typically, if your monitor is, say, only 60Hz, then you'd probably turn VSync on, and thus the GPU would be limited to producing 60 frames per second.

I can always tell when my son has installed a new game and either hasn't turned VSync on or hasn't set a frame-rate cap, because I'll hear the fans whirring as the GPU tries to crank out as many frames as it can. Once he caps it so it doesn't go over 60, the fans quiet down.
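
For anyone curious what a cap actually does under the hood, here's a minimal sketch in Python (render_frame is a hypothetical placeholder; a real limiter lives in the game engine or driver): the loop simply sleeps away whatever is left of each frame's time budget, so the GPU sits idle instead of rendering frames the monitor can never show.

Code:
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # about 16.7 ms per frame at 60 FPS

def render_frame():
    # Hypothetical stand-in for the game's actual rendering work.
    pass

# Run the capped loop for roughly ten seconds.
for _ in range(10 * TARGET_FPS):
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # Idle away the leftover frame time instead of rendering
        # extra frames the monitor would never display. That idle
        # time is why a capped game runs the GPU cooler and quieter.
        time.sleep(FRAME_BUDGET - elapsed)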