Need help choosing a monitor (PC Gaming newbie)

Sep 28, 2018
Hey guys, I've made the step up from PS4 to buying my first gaming PC. Here are the specs:

i5 8600 @ 3.5GHz (overclocked to 4.8GHz)
16GB RAM
Nvidia GTX 1060 6GB STRIX OC

I have my PC hooked up to my Sony Bravia 32" TV, but I know I need a proper monitor.

I've done a lot of research, but as a PC newbie I'm overwhelmed by all the terms, the monitors to choose from, and the range of opinions, so I need some advice on the best 1440p 75Hz monitor to get.

The reason I chose this is that 75Hz will let me see 75fps (from my understanding), and 75fps is more than enough to play games (at a non-competitive level).

I also want my games to look better rather than have 100+ fps at 1080p. I know the 1060 is a beast for 1080p, but after watching YouTube videos it seems it can also run 1440p well.

My Sony TV makes games (on ultra) look pixelated and not much different from my PS4.

These are the games I currently play (with GFX on high/ultra @ 1080p):

Hunt Showdown (avg 62fps)
WreckFest (avg 88fps)
Planet Coaster (avg 90fps)
RB6 Siege (avg 80fps)
PUBG (avg 75fps)

Sorry for the long-winded post, but I just needed to get my thoughts across.

Thanks guys
 
King_V

Depending on the game, a 1060 is suitable for 60fps at max settings at 1080p.

If you're going 2560x1440, or 3440x1440, you would need a more powerful video card. The GTX 1070 is considered a 2560x1440@60fps max settings card.

Consider this:
1920x1080 = 2,073,600 pixels
2560x1440 = 3,686,400 pixels

2,073,600 divided by 3,686,400 = 0.5625.

So, whatever fps you're getting now, assuming you're not CPU-limited, you would multiply that fps number by 0.5625 to get an approximate idea of how many fps you'd get going to a 2560x1440 monitor.
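As a rough sketch of that estimate (my own illustration, not from the thread; it only holds when you're GPU-limited, since fps scales roughly with pixel count):

```python
# Back-of-the-envelope fps estimate when moving to a higher resolution,
# assuming the game is GPU-limited and fps scales with total pixel count.
def estimate_fps(current_fps, old_res=(1920, 1080), new_res=(2560, 1440)):
    old_pixels = old_res[0] * old_res[1]   # 2,073,600 at 1080p
    new_pixels = new_res[0] * new_res[1]   # 3,686,400 at 1440p
    return current_fps * old_pixels / new_pixels

# e.g. PUBG averaging 75fps at 1080p:
print(round(estimate_fps(75)))  # 42
```

Real results are usually a bit better than this, since some of the frame cost (CPU work, draw-call overhead) doesn't scale with resolution.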


That point aside, monitors are EXTREMELY subjective.

For example: I think a 27" monitor at 1920x1080 looks great. Others think the pixels are too big.
I think a 34" 2560x1080 monitor looks great. Others think the pixels are too big.
I think that going past 75Hz, to refresh rates of 140Hz or above, is ridiculous and the human eye can't process that. Others disagree.

etc.

Not to mention that even from one brand to another with similar technology, the screens may look better or worse with color accuracy, sharpness, etc.

e.g.: I had a TN-type monitor for years, then moved to an IPS. My son got a TN monitor, but it was so much better than the TN I originally had that I thought it was almost comparable to an IPS in terms of color and sharpness.


If at all possible, even if it's a bit of a distance, I'd strongly recommend finding a store that has several different monitors of different sizes and resolutions on display, where you can see them and try them out with a little web browsing. See what works best for YOUR eyes.

What I think is great, you may think looks like crap, and vice-versa. Reviews do help, of course, but seeing really is believing.
 

King_V



That may be so, but your monitor can only display 60fps.

If you use Vsync, you'll only get 60fps. If you don't use Vsync, your monitor can still only display 60fps, but you'll get tearing (one section of the screen showing the next frame and another section showing the previous frame, because the GPU is pushing out frames faster than the monitor can display them).
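As a toy model of that behaviour (my own sketch, not from the thread; real Vsync behaviour also depends on buffering):

```python
# Toy model of a fixed-refresh panel, assuming a steady GPU render rate.
def frames_shown(gpu_fps, refresh_hz=60, vsync=True):
    if vsync:
        # Vsync makes the GPU wait for the next refresh, so the output
        # frame rate is capped at the panel's refresh rate.
        return min(gpu_fps, refresh_hz)
    # Without Vsync the panel still refreshes refresh_hz times per second;
    # the surplus frames get spliced into refreshes mid-scan (tearing).
    return refresh_hz

print(frames_shown(90, vsync=True))   # 60
print(frames_shown(90, vsync=False))  # 60 (with tearing)
```

In practice, classic double-buffered Vsync can also drop to half the refresh (e.g. 30fps on a 60Hz panel) whenever the GPU can't hold 60, which is part of why adaptive refresh like G-Sync is so much smoother.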
 
Sep 28, 2018
Hey King_V,

I get the math behind it all, as in roughly half the frame rate, but I've seen many videos on YouTube comparing fps at 1080p to 1440p. The frame rates do drop, but not by half; maybe 33%, which is fine by me.

I don't have any PC warehouses near me to see the difference, unfortunately, but I've seen a lot of comments saying once you go 1440p you can't go back to 1080p.

One question I have: say I do run games that go above 60fps, will it make any difference at all having a 60Hz or a 75Hz monitor? It's all so confusing.

PS: Any links to monitors would be great (Amazon preferred, or anything UK-based).
 
Sep 28, 2018
BenQ GW2765HT LED IPS 27 inch Widescreen Multimedia Monitor (16:9 2560 x 1440, 1000:1, 20M:1, 4 ms GTG, DVI/DP1.2/HDMI1.4 and Speakers)

Has anyone used one of these? It looks perfect for me.
 

King_V

Honestly, about "you can't go back" - I'd personally be more inclined to say that about ultra-wide monitors.

They're not necessarily everyone's cup of tea, but I think going from a 16:9 aspect ratio to a 21:9 (approx) aspect ratio is amazing.

Plus, if you go 2560x1080 instead of 2560x1440, it's less taxing on the GPU.

Dropping or eliminating anti-aliasing, from what I've read, is probably one of the things that'll give a boost, as AA is very taxing on the GPU.


Doing 60fps on a 75Hz monitor: well, if the game tends to play around 60 but under 75, I'd set the monitor's refresh to 60 and then turn on Vsync.

On the other hand, if you get a monitor with G-Sync (they tend to be pricey), then Nvidia's adaptive refresh works wonders, with the monitor's refresh rate adapting itself to whatever the card can output at a given moment.
 
Solution