I have that monitor, it's really nice for 3D gaming.
For the whole fcking 50 to 60fps "gaming" debate, it's just that: BS. The human eye receives a continuous stream of light and uses a chemical sensor to translate it into electrical signals for your brain. It's the brain, not the eyes, that does the processing and determines what you see and don't see.
Since the eye is constantly streaming electrical signals to the brain, you could just as well say the eye "sees" at 1000fps, or whatever number you want. It's your brain that imposes the limit on individual frames, and that limit is MUCH lower than what your eye is chemically capable of sensing. Your brain can't process more than about 20~24 full, distinct images per second; anything faster and the images start to blur together. The brain is built to look for differences in light patterns, not strictly the patterns themselves. Thus high-contrast frames are noticeable where low-contrast frames are not.

If you flashed a queen of hearts at 60fps and on the 49th frame you slipped in a jack of spades, very few humans (virtually none) would be able to tell you what was flashed, only that ~something~ was different in the picture. Now take a 60fps video of the same queen, but this time she's moving around, and flash the same jack of spades on the 49th frame: a significantly smaller portion of the population will even notice something was different. Even though it's 60 frames, 59 of them are identical, so what you really have is effectively 2fps: one image shown for ~983ms and another for 16.6ms. Once things start moving, the frames blend together and the brain can't keep up with such a small difference. This gets even more evident if, instead of a white/black or red/black high-contrast swap, you move to something low-contrast like forest green/lime green. The difference becomes imperceptible to the human brain and the change doesn't even register.
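Here's a quick back-of-the-napkin sketch of that card example in Python; the 60fps figure and the odd frame's position come from the example above, the rest is just arithmetic:

```python
from collections import Counter

FPS = 60
frame_ms = 1000 / FPS  # one refresh at 60Hz is ~16.67ms

frames = ["queen of hearts"] * FPS
frames[48] = "jack of spades"  # the 49th frame is the odd one out

# Total on-screen time per distinct image over the one-second run.
for image, n in Counter(frames).items():
    print(f"{image}: {n} frames, {n * frame_ms:.1f}ms total")
# queen of hearts: 59 frames, 983.3ms total
# jack of spades: 1 frame, 16.7ms total
```

Two distinct images in one second, regardless of what the frame counter says.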
So in the end, the brain can easily detect high-contrast changes, or large changes (fast motion), in light patterns at a high rate of speed, in excess of 50~60Hz (a 16.6ms refresh). But that same brain won't be able to differentiate what the difference was, only that there ~was~ a difference. Even at 42ms per frame (~24fps), the brain still can't tell exactly what the changes are, only that there are changes. Once you move to subtle changes, the brain has an even harder time differentiating between them, or telling that anything changed at all.
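If it helps, here's the same argument boiled down to a toy function. The thresholds (~24fps for identifying what changed, anything faster registering only as "something flickered", low-contrast changes not registering at all) are this post's claims, not established vision-science constants:

```python
def perception(fps: float, high_contrast: bool = True) -> str:
    """Encodes the thresholds claimed in this post, not measured fact."""
    if fps <= 24:
        # ~42ms or more per image: claimed limit for identifying WHAT changed
        return "change noticed AND identifiable"
    if high_contrast:
        # faster than ~24fps: claimed you only register that something changed
        return "change noticed, but not what it was"
    # low contrast at speed: claimed the change doesn't register at all
    return "change not registered"

for hz in (24, 60, 100):
    print(f"{hz}fps: {perception(hz)}")
```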
Conclusion: the brain doesn't use "FPS" as a metric; different light patterns are perceived differently and at different "speeds". In the context of monitors, a 60Hz panel gives you a frame time of 16.6ms, while a 100Hz monitor gives you a frame time of 10ms. To put it in perspective, 16.6ms is 16.6/1000, or ~0.017 of a second; 10ms is 10/1000, or 0.01 of a second. No human reflexes on the planet are that fast. It takes longer for the signal to go from the server to your PC, through your NIC, through the CPU and software, then to your eyes, through your brain and central reasoning system, than the difference between those frame times.
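To put numbers on that, a rough sketch (the ~200ms figure is an assumed ballpark for average human visual reaction time, not a measurement):

```python
REACTION_MS = 200  # assumed ballpark for average human visual reaction time

for hz in (60, 100):
    frame_ms = 1000 / hz
    print(f"{hz}Hz -> {frame_ms:.1f}ms per frame; "
          f"~{REACTION_MS / frame_ms:.0f} frames go by during one human reaction")
# 60Hz -> 16.7ms per frame; ~12 frames go by during one human reaction
# 100Hz -> 10.0ms per frame; ~20 frames go by during one human reaction
```

Either way, a dozen or more frames have already flown past before your hand even starts moving.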
There is ~zero~ competitive advantage in 100Hz vs 60Hz; anyone who tells you otherwise is just blowing smoke up your a$$. Simply put, YOU, the human, and your reflexes are the bottleneck in gaming performance, not the screen's refresh rate.