Question Which monitor is the best?

Thanks for the reply! Also, I play Fortnite. Is it good with Fortnite?
That'll work perfectly for Fortnite. Just adjust the graphics settings to keep FPS high, though keep view distance up there for far-away targets. If you have an AMD GPU, enable FreeSync to make everything smooth, and if you're running an Nvidia GPU, you might be able to use FreeSync with it on the newest GPU drivers.
 
Also, I don't have a DP cable, only an HDMI cable. I heard that HDMI will show a black screen for several seconds in game; will that happen?
 
So if my HDMI cable works, it shouldn't have black screens, right?
Also, will a curved screen make it harder to watch videos?
Well, if you're loading into a game there might be a black or dark screen, but that's normal. While you're gaming there should be no random black screens. I haven't really delved into smaller curved screens (usually 34" UW curved), but if possible I'd highly recommend checking one out in person.
 
What if they don't have the monitor on display? Also, the Asus monitor isn't too bad, right? 75 Hz isn't the worst, and it does have a fast response time. And that monitor I can check out, since it's on display.
 
75hz isn't bad at all, it's what I use on my 21:9 UW.

Have you ever considered UW? Having the extra peripheral view on the side can really help see if someone is trying to flank you. The monitor below is actually 75hz, PCPer made a mistake (I own the 34" version).

PCPartPicker part list / Price breakdown by merchant

Monitor: LG - 29WK600-W 29.0" 2560x1080 60 Hz Monitor ($229.00 @ Amazon)
Total: $229.00
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2019-03-30 15:53 EDT-0400


View: https://www.youtube.com/watch?v=BIuQU0mayFc


Also, IPS will give you way better colors than any TN panel, and with Fortnite's vibrant colors it looks quite good.
 
I like it, but it is just way too wide for my desk. Also, the monitors I chose are ones I can pick up right now, and that's what I want. Is a TN panel or the third monitor's panel better?
 
According to the product description for the MSI Optix screen, it should include a DisplayPort cable in the box...

What's Included: Optix G24C Widescreen LCD Monitor; Power Cord; DP cable

You'll need to use the DP cable if you want to enable adaptive sync with an Nvidia graphics card. And that's probably the screen I would go with out of those three. What graphics card do you have?
 
Thanks for the reply. I have an AMD graphics card but am planning an Nvidia PC build. I will get a DP cable if it doesn't come with one.
 
If you can see the difference between 120fps and 144fps, you have better eyesight than any other human on the planet.

The upside to 120 Hz over 144 Hz is minimum fps. It's far easier to get 120 fps output than 144 fps output for many CPUs in many games, so you stand a good chance of always staying above the refresh minimum for a solid level of performance. At 144 fps, it's easy enough for lesser CPUs to not be able to maintain that or better, so you end up bouncing back and forth across the cap. This can result in micro stutters in fast-paced games or when wiggling the mouse too quickly.
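To put rough numbers on the "bouncing across the cap" point, here is a small sketch of the frame-time arithmetic involved; the fps samples are made up purely for illustration:

```python
# Frame-time arithmetic: a frame rendered at N fps occupies
# 1000 / N milliseconds on screen.

def frame_time_ms(fps):
    return 1000.0 / fps

print(f"120 fps -> {frame_time_ms(120):.2f} ms per frame")
print(f"144 fps -> {frame_time_ms(144):.2f} ms per frame")

# If a CPU can only sustain ~130 fps, output at a 144 fps cap bounces
# between the cap and dips below it; frame delivery times jitter by a
# millisecond or more, which is what reads as micro stutter.
bouncing_fps = [144, 128, 144, 122, 144, 131]  # hypothetical samples
times = [frame_time_ms(f) for f in bouncing_fps]
jitter_ms = max(times) - min(times)
print(f"frame-time swing: {jitter_ms:.2f} ms")
```

The difference between 120 and 144 fps is under 1.4 ms per frame, while a dip from 144 to 122 fps swings the frame time by about that same amount on its own, which is why staying stably above a lower cap can feel smoother than oscillating around a higher one.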
 
Wdym micro stutters?
 
Actually, that's highly subjective, depending on the person, their eyesight, and what exactly it IS they are looking to gain from a refresh rate that high. It doesn't matter whether you can "see" those additional frames or not. What matters is that you CAN see that the difference is smoother. Beyond that, if you are gaming at very high FPS, you will want a display capable of a very high refresh rate in order to minimize tearing, especially if you do not have a monitor with some form of adaptive sync, but probably even if you do.

The information contained in the spoiler below clarifies a lot regarding this or at least offers an alternative opinion since this is something that even highly professional experts cannot seem to agree on.

What framerates can we really see?

“Certainly 60 Hz is better than 30 Hz, demonstrably better,” Busey says. So that’s one internet claim quashed. And since we can perceive motion at a higher rate than we can a 60 Hz flickering light source, the level should be higher than that, but he won’t stand by a number. “Whether that plateaus at 120 Hz or whether you get an additional boost up to 180 Hz, I just don’t know.”


“I think typically, once you get up above 200 fps it just looks like regular, real-life motion,” DeLong says. But in more regular terms he feels that the drop-off in people being able to detect changes in smoothness in a screen lies at around 90Hz. “Sure, aficionados might be able to tell teeny tiny differences, but for the rest of us it’s like red wine is red wine.”

Chopin looks at the subject very differently. “It’s clear from the literature that you cannot see anything more than 20 Hz,” he tells me. And while I admit I initially snorted into my coffee, his argument soon began to make a lot more sense.


Certainly 60 Hz is better than 30 Hz, demonstrably better.
Professor Thomas Busey
He explains to me that when we’re searching for and categorising elements as targets in a first person shooter, we’re tracking multiple targets, and detecting motion of small objects. “For example, if you take the motion detection of small object, what is the optimal temporal frequency of an object that you can detect?”

And studies have found that the answer is between 7 and 13 Hz. After that, our sensitivity to movement drops significantly. “When you want to do visual search, or multiple visual tracking or just interpret motion direction, your brain will take only 13 images out of a second of continuous flow, so you will average the other images that are in between into one image.”


Discovered by researcher Rufin vanRullen in 2010, this literally happens in our brains: you can see a steady 13 Hz pulse of activity in an EEG, and it’s further supported by the observation that we can also experience the ‘wagon wheel effect’ you get when you photograph footage of a spinning spoked object. Played back, footage can appear to show the object rotating in the opposite direction. “The brain does the same thing,” says Chopin. “You can see this without a camera. Given all the studies, we’re seeing no difference between 20 Hz and above. Let’s go to 24 Hz, which is movie industry standard. But I don’t see any point going above that.”
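The wagon-wheel effect Chopin describes is plain temporal aliasing, and it can be sketched in a few lines; the spoke count and rotation speeds below are arbitrary, chosen only to show the sign flip:

```python
# Aliasing sketch of the wagon-wheel effect: sample a spinning
# 8-spoke wheel at a fixed rate and compute the apparent per-sample
# motion, assuming the viewer matches each spoke to the nearest
# spoke position in the next sample.

def apparent_step_deg(rot_hz, sample_hz, n_spokes=8):
    spacing = 360.0 / n_spokes             # angle between spokes
    step = (360.0 * rot_hz / sample_hz) % spacing
    if step > spacing / 2:                 # nearest match is "behind"
        step -= spacing
    return step

print(apparent_step_deg(0.5, 13))  # slow wheel: apparent forward motion
print(apparent_step_deg(11, 13))   # fast wheel: apparent backward motion
```

Once the per-sample rotation exceeds half the spoke spacing, the nearest-spoke match lies behind the true position, so the wheel appears to turn backwards, whether the sampling is a camera shutter or, per vanRullen, the brain's own ~13 Hz snapshot rate.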


Perception and reaction
This article is about what framerates the human eye can perceive. The elephant in the room: how fast can we react to what we see? It's an important distinction between games and film worthy of another whole article.
So why can games feel distinctly different at 30 and 60 fps? There's more going on than framerate. Input lag is the amount of time that elapses between inputting a command, that command being interpreted by the game and transmitted to the monitor, and the monitor processing and rendering the image. Too much input lag will make any game feel sluggish, regardless of the LCD's refresh rate.
But a game programmed to run at 60 fps can potentially display your inputs more quickly, because the frames are narrower slices of time (16.6 ms) compared to 30 fps (33.3 ms). Human response time definitely isn't that fast, but our ability to learn and predict can make our responses seem much faster.
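The frame-slice arithmetic above can be sketched as a deliberately simplified model; it counts only frame quantisation (the timings are hypothetical, and real input lag adds game-logic and display-processing time on top):

```python
import math

# Simplified model: an input landing mid-frame is sampled at the start
# of the next frame and displayed one frame after that. This is only
# the quantisation floor, not total input lag.

def display_delay_ms(input_time_ms, fps):
    frame_ms = 1000.0 / fps
    next_frame_start = math.ceil(input_time_ms / frame_ms) * frame_ms
    return next_frame_start + frame_ms - input_time_ms

print(f"click 5 ms in at 60 fps: shown after {display_delay_ms(5, 60):.1f} ms")
print(f"click 5 ms in at 30 fps: shown after {display_delay_ms(5, 30):.1f} ms")
```

Under this model the same click waits roughly twice as long to reach the screen at 30 fps as at 60 fps, which is consistent with the 16.6 ms versus 33.3 ms frame slices described above.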
The important thing here is that Chopin is talking about the brain acquiring visual information which it can process and on which it can act. He’s not saying that we can’t notice a difference between 20 Hz and 60 Hz footage. “Just because you can see the difference, it doesn’t mean you can be better in the game,” he says. “After 24 Hz you won’t get better, but you may have some phenomenological experience that is different.” There’s a difference, therefore, between effectiveness and experience.

And while Busey and DeLong acknowledged the aesthetic appeal of a smooth framerate, none of them felt that framerate is quite the be-all and end-all of gaming technology that we perhaps do. For Chopin, resolution is far more important. “We are very limited in interpreting difference in time, but we have almost no limits in interpreting difference in space,” he says.

For DeLong, resolution is also important, but only to the small, central region of the eye that cares about it, which comprises only a couple of degrees of your field of view. “Some of the most compelling stuff I’ve seen has been with eye-tracking. Why don’t we do full resolution only for the areas of the eye where we actually need it?” But his real focus is on contrast ratios. “When we see really true blacks and bright whites it’s really compelling,” he says.

What we really know
After all of that, what do we really know? That the brain is complicated, and that there's truly no universal answer that applies to everyone.

  • Some people can perceive the flicker in a 50 or 60 Hz light source. Higher refresh rates reduce perceptible flicker.
  • We detect motion better at the periphery of our vision.
  • The way we perceive the flash of an image is different than how we perceive constant motion.
  • Gamers are more likely to have some of the most sensitive, trained eyes when it comes to perceiving changes in imagery.
  • Just because we can perceive the difference between framerates doesn't necessarily mean that perception impacts our reaction time.

So it’s not a tidy subject, and on top of all of this, we have to also consider whether our monitors are actually capable of outputting images at these high framerates. Many don’t go above 60 Hz, and Busey questions whether monitors advertised at 120 Hz really display that fast (according to some seriously in-depth testing at TFTCentral, they certainly do). And as someone who has also enjoyed games at the 30 frames per second (and often rather less) rendered by my consoles, I can relate to them suggesting that other aspects of visual displays might connect better with my visual perception.

On the other hand, I would love to hear from pro teams about their objective experiences with framerate and how it affects player performance. Perhaps they’ll corroborate or contradict science’s current thinking in this field. If gamers are so special when it comes to vision, perhaps we should be the ones to spearhead a new understanding of it.

In reality, though, I think one of the determining factors for that purchase should be the fact that the ASUS monitor only has HDMI and VGA inputs, while the HP monitor has both HDMI and DisplayPort. Modern graphics cards do NOT have VGA outputs, so having a monitor with a VGA input is both telling (it's an older model) and limiting, in that it cannot be used with a variety of modern outputs on gaming cards currently being sold. PROBABLY any monitor with a VGA input is also using a much older version of HDMI, which might be a factor as well, although probably not so much at 1080p.

Honestly, both of those monitors have very similar specs aside from the inputs and refresh rate, but given the more modern inputs AND the higher refresh rate on the HP monitor, I think I'd choose it over the ASUS, UNLESS you have a fairly old or weak graphics card and don't expect to come anywhere near 144 fps in the games you play, or simply don't care about being able to do that. In that case, the ASUS is probably fine, AND it has adaptive sync, which the HP unit does not seem to have in any form, whether FreeSync or G-Sync.