Let me preface this by saying: when I see a post like this, it normally comes from (and they usually admit it in the end) people who have been playing 'mostly' or 'all the time' on consoles rather than on PC, and who don't see what the 'difference' is. They have 4K TVs, all their games play "quite nicely", and all their Blu-ray videos look "better" than any gaming monitor they have seen.
So let's break the argument down into parts. I've been researching this recently and found there has been an evolution in monitors and TVs in response to customer demand. An HD / UHD TV is fine and well, and even 4K TVs can look pretty sweet, WHEN (key point) NATIVE 4K (not upscaled 4K) content is being displayed on them (cable TV, Blu-ray, the built-in Smart TV YouTube app, etc.). The broadcaster has already done all the heavy 'rendering' work and is just streaming the video output, so if your input is native 4K and matches the panel, the video feed looks beautiful. The same goes for Blu-ray, and NO, I am not saying the Blu-ray player is doing the rendering; the video was already captured and stored on the disc, and all the player does is convert it from stored data back into a 'streamed' output. Again, all the heavy rendering was done when the Blu-ray was authored.
Now what fools a lot of 'console' gamers (as happened at E3 this year, check the news) is UPSCALED gaming. That is to say, the game is rendered as a 1080p 'stream', and the TV's CPU (yes, I was VERY surprised some even have QUAD-CORE CPUs built in) stretches it out to the 4K panel and 'fills in the holes' with copied data from the stream. We also know Jaime's PS4 will perform EXACTLY the same way as Manny's PS4 in ______ (insert game name), because console games are all built to that same fixed standard, so by design there is NEVER a difference in performance. The only variable is which TV Manny buys compared to Jaime, which adds extra 'lag' while the TV does the upscaling, and THAT is the only time they ever complain about how "laggy" it is.
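To make the 'fill in the holes with copied data' idea concrete, here's a minimal sketch of the crudest possible upscaler (nearest-neighbour pixel duplication) in Python with NumPy. Real TV scalers interpolate and sharpen rather than just copying, so treat this purely as an illustration of where the 'extra' 4K pixels come from:

```python
import numpy as np

def nearest_neighbour_upscale(frame_1080p: np.ndarray) -> np.ndarray:
    """Stretch a 1920x1080 frame to 3840x2160 by copying each pixel into a 2x2 block."""
    # Duplicate rows, then columns: every source pixel becomes four identical pixels.
    return np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)

# Fake 1080p RGB frame (height, width, channels).
frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
print(nearest_neighbour_upscale(frame).shape)  # (2160, 3840, 3) -- 4K sized, but no new detail was created
```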
PC gaming, though, is a totally different animal, because the PC itself is rendering and creating the graphics in REAL TIME. That requires a lot of horsepower from the 'broadcaster', and normally 4K upscaling won't cut it because you can see the difference in the textures and 'models' used to create the virtual space. Even a 1080 Ti can't output 4K Ultra graphics (native 4K models and textures) in Mass Effect: Andromeda (http://www.pcgamer.com/mass-effect-andromeda-pc-performance-analysis/), much less the GTA V 4K mod (http://kotaku.com/gta-v-mod-adds-4k-textures-game-looks-utterly-ridiculo-1745334649), without dropping frame rates into the 20s and 30s, which isn't "playable" (playable meaning constant fluid motion and interaction with the game, with no stuttering, delays, or lack of 'response' when the player tries to interact), especially in a competitive (any MMO or online FPS is competitive) gaming model.
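To put a rough number on that rendering load, here's a quick back-of-the-envelope pixel-count comparison (my own arithmetic; actual GPU cost doesn't scale perfectly linearly with pixel count):

```python
# Why native 4K is so much heavier to render than 1080p.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_4k    = 3840 * 2160   # 8,294,400 pixels per frame

print(pixels_4k / pixels_1080p)   # 4.0 -- four times the pixels to shade every frame
print(pixels_4k * 60)             # ~498 million pixels/second at 4K 60 FPS
print(pixels_1080p * 144)         # ~299 million pixels/second at 1080p 144 FPS
```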
So on the monitor side they looked at HOW to improve things, since there is a limit to how much work the PC hardware itself can do. First they increased the refresh rate so the panel could actually display more than the standard 30Hz of TVs, then more than the common 60Hz of CRTs; 75Hz to 100Hz was 'okay' and better quality for more money, but it didn't really impact people until panels were tweaked to a steady 120Hz, then 144Hz. Linus did some videos about the difference (https://www.youtube.com/watch?v=a2IF9ZPwgDM), and once people used 144Hz (adjusted and 'got used to it') they were seriously 'impacted' when dropping back down to 60Hz. The responsiveness just wasn't there; many actions (like in CS:GO) would fail in competitive play because of the gap between WHEN something was displayed on the screen and WHEN your mind / body (hands and fingers) was inputting the data (jump, shoot, etc.), even though the OUTPUT from the PC (144+ FPS) was still the same. Matching the REFRESH of the PANEL so it ACTUALLY DRAWS 144 frames, giving your eyes all that extra 'movement' data in between to process (twitch shots), compared to 60 frames of data, made a dramatic difference.
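For a sense of scale, the underlying arithmetic is just 1000 ms divided by the refresh rate (my own illustration, not from Linus's video):

```python
# How long each refresh stays on screen at common panel rates.
for hz in (30, 60, 75, 100, 120, 144):
    print(f"{hz:>3} Hz -> a new frame every {1000 / hz:.1f} ms")

# 60 Hz  -> ~16.7 ms between refreshes
# 144 Hz -> ~6.9 ms between refreshes, i.e. the panel hands your eyes a
# fresh slice of 'movement' data more than twice as often.
```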
In competitive games like CS:GO the gamer will typically 'drop down' the resolution (1024x768) and graphics settings (low), because they focus on specific 'pixel boxes' of detail to get the "no-scope headshot at 100-200m" kills. They don't care about pretty graphics, just the 'pixel box' difference: the change in individual pixels is the ONLY thing they react to and 'hit' on when playing (solid black to a lesser black - BAM! kill shot). Well, to resolve MORE pixels for even more detail (like landing a HEADSHOT on someone at 800m or more in PUBG), manufacturers rolled out larger displays (19", 21", 25", 27", etc.) and then higher resolutions (1600×900, 1920×1080, 2560×1440 and 3840×2160), and all of that caused MORE LAG (response time) in the LCD reacting to its input, even when the PC was using lower detail settings. So, just like the TVs, they worked on integrating processors into the monitors to handle the incoming data and drive the RESPONSE time down into the 20s, then the 10s, then finally single digits, until today 1-3ms is most common. AKA, what is 'sent' by the PC is displayed on the screen within 1ms, not 10ms or 30ms later (30 times slower to respond), which would leave you 'too slow' to react to the other player (all other things being equal).
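To illustrate why those single-digit response times matter, here's some rough arithmetic of my own, treating 'response time' as just the pixel transition time and ignoring the rest of the input chain:

```python
# How many 144 Hz refresh intervals a slow pixel transition smears across.
frame_interval_ms = 1000 / 144   # ~6.9 ms per refresh

for response_ms in (30, 10, 3, 1):
    smeared = response_ms / frame_interval_ms
    print(f"{response_ms:>2} ms response ~= {smeared:.1f} refresh interval(s) of ghosting")

# A 30 ms panel is still transitioning roughly 4 refreshes after the PC sent
# the new image; a 1-3 ms panel has settled long before the next refresh.
```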
Fast forward to TODAY: with serious gaming PC hardware (i7, SSD, GTX 1080, etc.) pushing data to the 'screen', you run into serious issues like TEARING. That is to say, a 1080p 60Hz display receiving 200 FPS of CS:GO can ONLY use 60 of the 200 frames sent, so what happens to the other 140 frames? They get dropped, and like a 'flood' more keeps coming, so you visually see artifacting (square blocking) and 'tearing' of the images across the screen. If you don't have the 'serious' gaming hardware (a typical A10, 5400RPM HDD, running on the iGPU) and you're only chunking out 30 FPS or so as "good enough" to play, but still using that same 1080p 60Hz display which WANTS, NEEDS, is BEGGING FOR 60 frames of data while you're outputting 18 FPS, then 29, then down to 13, then back up to 30 during loading, you get all sorts of rubberbanding, drops, lagging, stuttering etc. 'on the screen', because the monitor is trying to 'duplicate' the 30 FPS (when it gets it) to 'fill in' the missing 30 frames on the 60Hz display, and it gets worse when you drop to 13 FPS (to fill 60Hz from 13 frames, each frame gets shown 4 times (4x13=52) and 8 of them get one extra copy to 'fill' the remaining refreshes). All of this takes PROCESSING and PROGRAMMING inside the LCD to do that work as quickly as possible (1-3ms) and then jump to the next set of 60 frames, and so on, and it all just gets worse and worse.
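Here's that 13-FPS-on-a-60Hz-panel arithmetic worked out (a simplified sketch; a real panel just keeps repeating the most recent frame until a new one arrives, it doesn't plan the copies in advance):

```python
# How a fixed 60 Hz panel pads out a low frame rate by repeating frames.
def repeats_per_frame(refresh_hz: int, fps: int) -> list:
    """Return how many refreshes each rendered frame ends up occupying."""
    base, extra = divmod(refresh_hz, fps)   # divmod(60, 13) -> (4, 8)
    # 'extra' of the frames get shown one additional time to fill the gap.
    return [base + 1 if i < extra else base for i in range(fps)]

print(repeats_per_frame(60, 30))       # every frame shown twice
print(repeats_per_frame(60, 13))       # 8 frames shown 5x, 5 frames shown 4x
print(sum(repeats_per_frame(60, 13)))  # 60 refreshes accounted for
```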
THIS is where FreeSync and G-Sync come into play, via two different paths. FreeSync (the lesser performer, as noted in many test videos) is an open hardware and software standard that achieves BETTER synchronization than VSync ever could between what the PC says (HEY, I am only sending you 47 frames right now!) and what the LCD does (OKAY, I will drop my Hz down to 47 to match you). G-Sync (which performs better) uses custom hardware integrated into the LCD to 'process' and 'perform' that workload, with software 'controlling' the communication between the PC (Hey, I am only sending 54 frames) and the LCD (okay, processing 54 frames!).
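Conceptually, both boil down to the display refreshing when a frame actually arrives instead of on a fixed clock, within the panel's supported range. Here's a toy sketch of that idea (not FreeSync's or G-Sync's actual protocol, just the 'match the refresh to the frame rate' logic, with an assumed 30-144Hz window):

```python
import random

PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144   # assumed variable-refresh window of the panel

def adaptive_refresh_interval(frame_time_ms: float) -> float:
    """Refresh when the frame arrives, clamped to what the panel can physically do."""
    fastest = 1000 / PANEL_MAX_HZ   # ~6.9 ms: can't refresh faster than 144 Hz
    slowest = 1000 / PANEL_MIN_HZ   # ~33.3 ms: must refresh at least this often
    return min(max(frame_time_ms, fastest), slowest)

# Simulate a GPU whose frame times wobble between ~45 and ~90 FPS.
for _ in range(5):
    gpu_frame_ms = random.uniform(1000 / 90, 1000 / 45)
    print(f"GPU frame took {gpu_frame_ms:5.1f} ms -> panel refreshes after "
          f"{adaptive_refresh_interval(gpu_frame_ms):5.1f} ms (no tearing, no duplicated frames)")
```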
As of this spring / summer's shows, they have now released the 'Holy Grail': 144Hz @ 4K G-Sync, where you get the 'speed' and the tear-free, lag-free processing, all with the PRETTY high resolution (https://www.geforce.com/whats-new/articles/nvidia-g-sync-hdr-35-inch-curved-monitor), also known as G-Sync HDR displays.
So why do the monitors cost so much more? In summary: building a THIN, LIGHTWEIGHT LCD display that supports up to 4K resolution and detail, with a 1-3ms response time, which synchronizes to the output unit's frame rate, on a large (27"+) panel, COSTS a lot to manufacture, especially when using In-Plane Switching (IPS), which is noticeably crisper and more detailed than the old 1980s twisted nematic (TN) LCD technology.