Are "Gaming" Monitors a Scam?

TheIcePhoenix

I am a long-time computer nut and an electronics nerd in general; I even used to own my own computer repair store (Blue Phoenix Custom Computers). I've had many different display setups and display brands, and I've spent a lot of time with cheap products, midrange products, and expensive ($2k) ones. In my opinion, comparing high-priced ($1,000 and up) gaming monitors with similarly priced 4K TVs and projectors, there is ALMOST no tangible difference in quality to justify the cost of gaming monitors. In many cases the TVs actually have the better image, and they support a refresh rate that is more than acceptable for most gaming needs, including competitive FPS sessions. On current-gen TVs you can turn off all the extra processing and get buttery smooth, mostly lag-free input (as much as you can expect from a non-gaming display, i.e. around 8 ms, which is actually very workable). I know gaming monitors offer low 1-5 ms response times, but that does NOT justify what these products cost. I could keep going on and on, but I want to ask the community: what do you think about this?
 
I feel that the same thing can be said about basically any component today being branded as specifically made for "gaming": mice, keyboards, motherboards, cases, RAM modules... the list goes on and on. In some cases, the difference is merely the physical appearance of the component; in others, there are indeed some hardware specifics which benefit gamers. Truth be told, the difference in price is not justified in some cases (some would even argue "in all cases"). However, there will always be a market for such hardware, as long as people are willing to pay more to differentiate themselves or their equipment.

The only real feature of gaming monitors that can be considered useful (again, not for everyone) is the higher refresh rate, or support for FreeSync/G-Sync. Other than that, you will likely have a great experience on any other similar monitor, usually a much cheaper one.
 
It depends on your priorities. Most competitive gamers will chase performance no matter the cost, and that's exactly what gaming monitors are for. I actually find 5 ms+ response times to be ungodly slow, and I won't use a monitor slower than that. Most 1 ms monitors aren't that expensive either, but most gaming monitors also support a 144 Hz refresh rate, which lets the panel actually display more of the frames your GPU renders. Combine that with a 1 ms response time and a high resolution, and there's your price jump.
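Just to put those numbers side by side, here's a quick bit of arithmetic (plain Python, nothing monitor-specific; the refresh rates are common examples):

```python
# Rough frame-time arithmetic: how long each refresh lasts, vs. typical response-time specs.
def frame_time_ms(refresh_hz: float) -> float:
    """Time between two refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per refresh")

# Output:
#  60 Hz -> 16.67 ms per refresh
# 144 Hz ->  6.94 ms per refresh
# 240 Hz ->  4.17 ms per refresh
# A 1 ms vs 5 ms response-time spec is only a 4 ms gap, less than one refresh interval
# even at 144 Hz, which is why the refresh rate itself tends to matter more.
```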

It's really all about priorities. If you're after performance, gaming monitors make sense; it's like paying a premium for a $1,200 computer because you want it fast and efficient. If not, then maybe gaming monitors look like a scam, but they certainly have their place in the display industry.
 


Continuing that point, content creators will spend thousands on Adobe RGB / HDR certified monitors, which are absolutely not a scam if you need one, but which arguably make no sense for the average person.
 
"There is ALMOST no tangible difference in quality to justify the cost of gaming monitors."
And that's why my primary gaming monitor is a 65" 4K TV! At 60 Hz, games running at 60 FPS seem pretty smooth.
It works best for me to turn on V-Sync so there's no tearing (depending on the game) and also turn off all of the TV picture cleanup and autocorrect features.
 
1ms response time is actually a really cheap feature these days. Barely changes the cost of any TN panel. Low response IPS is a little more expensive of course, for the few models that exist. Accuracy was never a priority for gamers, though it's a huge deal for game designers of course. The only really big ticket features are high refresh rates, and adaptive refresh.

High refresh is pretty polarizing, but it's a lot like those color-blindness dot-pattern tests: either it jumps out at you, or it's completely unnoticeable, and each group has no idea what the other is experiencing. That said, even if you clearly see the difference, it doesn't always improve every game. Some compare it to the "soap opera effect," the way video-frame-rate footage looks different from film, but most agree it makes motion smoother.

Adaptive refresh is probably the biggest feature. Again, it's hard to describe the difference until you see it. FreeSync isn't that expensive, but it's also implemented a little haphazardly: manufacturers can support large frequency ranges, or tiny ones, and still slap the FreeSync logo on. G-Sync is, well, stupidly expensive compared to FreeSync, about $200 extra minimum, compared to as little as $30 between a FreeSync and non-FreeSync model. I don't know whether the G-Sync implementation is worth the extra money over FreeSync, but if you've already invested in a GPU, you're locked into whichever standard it supports anyway.
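For what it's worth, here's a rough sketch of why the advertised range matters. The 48-75 Hz and 30-144 Hz ranges and the below-range behaviour are illustrative assumptions, not any particular monitor's spec:

```python
# Illustrative only: what a variable-refresh (FreeSync/G-Sync) panel can do at a given FPS.
# The 48-75 Hz and 30-144 Hz ranges below are made-up examples, not real product specs.
def vrr_behaviour(fps: float, vrr_min: float, vrr_max: float) -> str:
    if vrr_min <= fps <= vrr_max:
        return "refresh follows the GPU (tear-free, no extra wait)"
    if fps > vrr_max:
        return "above the range: capped or tearing unless you limit FPS"
    return "below the range: frame doubling (if supported) or plain fixed refresh"

for fps in (40, 60, 90):
    print(f"{fps} FPS, 48-75 Hz panel:  {vrr_behaviour(fps, 48, 75)}")
    print(f"{fps} FPS, 30-144 Hz panel: {vrr_behaviour(fps, 30, 144)}")
```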
 
1. Gaming monitors at 2K and 4K resolutions are harder to produce, since they pack a much higher pixel density than a 4K TV of the same resolution (see the quick math after this list).
2. Low response time is worth it. 30 ms response is not OK.
3. Nvidia G-Sync / AMD FreeSync let the GPU dictate when the display refreshes. This adds to the overall experience.
4. 144 Hz+ refresh rates are the bee's knees. You can tell the difference. Don't start with that "the human eye can only see 30 FPS" BS.

Those are quite a few good reasons why gaming monitors command a higher price premium. Of course, if your answer is just "oh, but you don't need it," then they're simply not for you.
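On point 1, the pixel-density gap is easy to put numbers on. A minimal sketch, assuming a typical 27" monitor versus a 65" TV, both 3840x2160:

```python
# Quick pixel-density comparison: same 3840x2160 resolution, very different PPI.
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'27" 4K monitor: {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
print(f'65" 4K TV:      {ppi(3840, 2160, 65):.0f} PPI')  # ~68 PPI
```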
 


1.) Very true
2.) 30 ms response time? Do those even exist anymore? The highest I've seen is 10 ms, and the average non-competitive gamer won't notice a difference between that and 2 ms. I personally own a 7 ms and a 2 ms monitor and haven't noticed anything at all between the two performance-wise. In competitive titles such as CS:GO or Battlefield 1, it might be noticeable for some.
3.) G-Sync/FreeSync are very nice features to have, as they most definitely help with screen tearing.
4.) As for 144 Hz+ refresh rates, I honestly have not seen a difference, even with two monitors side by side. That doesn't mean nobody sees it and it's just a placebo effect, though. Each person is different.

I actually thought there would be a noticeable difference, just like the first time I saw a 1080p TV next to a 720p one. Going up to a 4K TV, though, the difference is much more minor, and I suspect gaming monitors are the same kind of thing.

I can't say that 'gaming' monitors are a scam, though. As you just said, I simply don't have a need for one. I will say some 'gaming' products definitely are a scam *cough Razer cough*. The biggest factor is how much you're willing to spend to make sure you have the best, whether you need it or not.
 
Let me preface this by saying that when I see an OP like this, it normally comes from (and they usually admit it in the end) people who have been playing mostly or entirely on consoles rather than PC, and who don't see what the 'difference' is. They have 4K TVs, all their games play "quite nicely," and all their Blu-ray videos look "better" than any gaming monitor they have seen.

So let's break the argument down into parts. I've been researching displays recently, and there has been an evolution in both monitors and TVs in response to customer demand. An HD / UHD TV is fine, and even 4K TVs can look pretty sweet, WHEN (key point) NATIVE 4K content (not upscaled 4K) is being displayed on them (cable TV, Blu-ray, a built-in smart-TV YouTube app, etc.). The broadcaster has already done all the heavy 'rendering' work and is just streaming the video output, so if your input is native 4K and matches the panel, the feed is shown beautifully. The same goes for Blu-ray, and NO, I am not saying the Blu-ray player is doing the rendering: the video was already captured and stored on the disc, and the player just converts it from storage back into a streamed output, so again, all the heavy rendering was done when the Blu-ray was mastered.

Now, what fools a lot of console gamers (as happened at E3 this year, check the news) is UPSCALED gaming. That is, the console sends a 1080p 'stream' of the game, and the TV redraws it at 4K size (a different size, not a different detail level) and 'fills in the holes' with copied data; that load falls on the TV's own processor (yes, I was very surprised that some TVs have QUAD-CORE CPUs built in). We also know that Jaime's PS4 will perform EXACTLY the same way as Manny's PS4 copy of ______ (insert game name), because console games are built to one fixed standard, so by design there is never a difference in performance. The only variable is which TV Manny buys compared to Jaime, and only when the TV adds extra 'lag' while doing the upscaling work do they ever complain about how "laggy" it is.
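To make the 'fill in the holes with copied data' idea concrete, here is the crudest possible version of what a scaler does, sketched with Python/NumPy. Real TV scalers use far smarter interpolation, but the principle is the same: existing pixels get spread over more screen area, and no new detail is created.

```python
import numpy as np

# A fake 1080p frame (random pixels stand in for whatever the console sends).
frame_1080p = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

# Nearest-neighbour "upscale": every source pixel is copied into a 2x2 block,
# so the 4K grid is filled entirely with duplicated data.
frame_4k = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)

print(frame_1080p.shape, "->", frame_4k.shape)  # (1080, 1920, 3) -> (2160, 3840, 3)
```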

PC gaming, though, is a totally different animal, because the PC itself is rendering and creating the graphics in REAL TIME. That takes a lot of horsepower on the 'broadcaster' side, and 4K upscaling normally won't cut it, because you can see the differences in the textures and models that create the virtual space. Even a 1080 Ti can't output native 4K Ultra graphics in Mass Effect: Andromeda (http://www.pcgamer.com/mass-effect-andromeda-pc-performance-analysis/), much less GTA V with a 4K texture mod (http://kotaku.com/gta-v-mod-adds-4k-textures-game-looks-utterly-ridiculo-1745334649), without frame rates dropping into the 20s and 30s, which isn't "playable" (playable meaning constant fluid motion and interaction with the game, with no stuttering, delays, or lack of response when the player tries to interact), especially in a competitive setting (and any MMO or online FPS is competitive).

So on the monitor side, manufacturers looked at HOW to improve things, since there is a limit to what the PC hardware itself can do. First they pushed refresh rates beyond the old 30 Hz TV standard and the common 60 Hz of CRTs; 75 Hz to 100 Hz was 'okay' and cost more for better quality, but it didn't really grab people until panels hit a steady 120 Hz and then 144 Hz. Linus did some videos on the difference (https://www.youtube.com/watch?v=a2IF9ZPwgDM): once people used 144 Hz (and got used to it), they were seriously thrown off when dropping back to 60 Hz. The responsiveness just wasn't there, and many actions (like in CS:GO) would fail in competitive play because of the gap between WHEN something was displayed on the screen and WHEN your mind and hands were inputting the response (jump, shoot, etc.), even though the PC's output (144+ FPS) was still the same. Having the panel actually refresh fast enough to DRAW all 144 frames, with all that extra movement data in between for your eyes to process (for a twitch shot), was a dramatic difference compared to 60 frames of data.
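One rough way to quantify that 'extra movement data' point (the 200 ms reaction window is just an assumed ballpark, not a measured figure):

```python
# How many distinct frames of motion a player sees during a ~200 ms reaction window.
REACTION_WINDOW_MS = 200  # assumed ballpark human reaction time, illustrative only

for hz in (60, 144, 240):
    frames_seen = REACTION_WINDOW_MS / (1000 / hz)
    print(f"{hz:>3} Hz: ~{frames_seen:.0f} frames of motion to react to in {REACTION_WINDOW_MS} ms")

# 60 Hz: ~12 frames, 144 Hz: ~29 frames, 240 Hz: ~48 frames
```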

In competitive games like CS:GO, the gamer will typically drop the resolution (1024x768) and graphics settings (low), because they focus on specific 'pixel boxes' of detail to get the "no-scope headshot at 100-200 m" kills. They don't care about pretty graphics, only about the change in individual pixels they react to (solid black shifts to a slightly lighter black - BAM, kill shot). To resolve more pixels and more detail (like landing a headshot at 800 m+ in PUBG), manufacturers rolled out larger displays (19", 21", 25", 27", etc.) and then higher resolutions (1600x900, 1920x1080, 2560x1440 and 3840x2160), all of which added MORE LAG (response time) between the PC's input and the actual LCD, even when the PC was running at lower detail. So, like the TVs, they worked on integrating processors into the monitors to handle the incoming data and cut the response time into the 20s, then the 10s, then single digits, until 1-3 ms is now most common. In other words, what the PC sends is displayed on screen within 1 ms, not 10 ms or 30 ms later, so you aren't 'too slow' to react to the other player (all other things being equal).
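To put those response-time numbers in context, here is a toy calculation of how far a moving target drifts while the pixels are still transitioning; the one-second screen sweep is an arbitrary assumption for illustration:

```python
# How far a target moving across the screen travels during the panel's response time.
SCREEN_WIDTH_PX = 1920
SWEEP_TIME_MS = 1000          # assume the target crosses the full screen in one second
speed_px_per_ms = SCREEN_WIDTH_PX / SWEEP_TIME_MS

for response_ms in (30, 10, 5, 1):
    drift_px = speed_px_per_ms * response_ms
    print(f"{response_ms:>2} ms response: ~{drift_px:.0f} px of smear/lag on that target")

# 30 ms -> ~58 px, 10 ms -> ~19 px, 5 ms -> ~10 px, 1 ms -> ~2 px
```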

Fast forward to TODAY. With serious gaming PC hardware (i7, SSD, GTX 1080, etc.) sending data to the screen, you run into real problems like TEARING. A 1080p 60 Hz display receiving 200 FPS of CS:GO can only use 60 of those 200 frames, so what happens to the other 140? They get dropped, and as the 'flood' keeps coming you see artifacting (blocking) and tearing of the image across the screen. If you don't have serious hardware (a typical A10 with a 5400 RPM HDD, running on the iGPU) and you're only chunking out about 30 FPS as "good enough," but you're still using that same 1080p 60 Hz display which WANTS, NEEDS, is BEGGING FOR 60 frames of data while your output swings from 18 FPS to 29, down to 13, then back up to 30 during loading, you get all sorts of rubber-banding, drops, lag and stuttering on screen. The monitor has to 'duplicate' the 30 FPS it does receive to fill in the missing frames on the 60 Hz panel, and it gets worse as you drop to 13 FPS (to fill 60 refreshes from 13 frames, most frames are shown 4 times, 4x13 = 52, and 8 of them get an extra repeat to cover the rest). All of this takes PROCESSING and PROGRAMMING in the LCD to do the work as quickly as possible (1-3 ms) before jumping to the next set of 60 frames, and it just gets worse and worse.
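The frame-duplication arithmetic above can be sketched in a few lines. This is idealized (a real scaler works refresh by refresh, not over a whole second), but it shows where the uneven pacing comes from:

```python
# How many refreshes each rendered frame occupies on a fixed 60 Hz panel over one second.
def repeat_pattern(fps: int, refresh_hz: int = 60) -> list[int]:
    base, extra = divmod(refresh_hz, fps)
    # 'extra' frames get shown one additional time to cover the leftover refreshes.
    return [base + 1] * extra + [base] * (fps - extra)

for fps in (30, 13):
    pattern = repeat_pattern(fps)
    print(f"{fps} FPS on 60 Hz: {pattern}  (total {sum(pattern)} refreshes)")

# 30 FPS: every frame shown exactly twice (even pacing).
# 13 FPS: 8 frames shown 5x and 5 frames shown 4x; that uneven pacing is the judder you see.
```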

THIS is where FreeSync and G-Sync come into play, by two different paths. FreeSync (the lesser performer, as noted in many test videos) is an open standard, in hardware and software, that gives far better synchronization than V-Sync ever could between what the PC reports ("Hey, I'm only sending 47 frames at you right now!") and what the LCD does ("Okay, I'll drop my refresh to 47 Hz to match you"). G-Sync (which performs better) uses custom hardware built into the LCD to process the workload, with software controlling the communication between the PC ("Hey, I'm only sending 54 frames") and the LCD ("Okay, processing 54 frames!").
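A heavily simplified model of that difference, with all timings invented for illustration (real panels also have a limited VRR range, which is ignored here):

```python
# Fixed refresh vs. adaptive sync: when does a finished frame actually reach the screen?
def shown_at_fixed(frame_ready_ms: float, refresh_hz: float = 60.0) -> float:
    """Fixed refresh: the frame waits for the next scheduled refresh tick."""
    interval = 1000.0 / refresh_hz
    return (int(frame_ready_ms // interval) + 1) * interval

def shown_at_adaptive(frame_ready_ms: float) -> float:
    """Adaptive sync (within the panel's range): the refresh happens when the frame arrives."""
    return frame_ready_ms

for ready in (21.3, 38.0, 55.5):  # frames arriving at an uneven ~45-60 FPS pace
    print(f"frame ready at {ready:5.1f} ms -> fixed 60 Hz: {shown_at_fixed(ready):5.1f} ms, "
          f"adaptive: {shown_at_adaptive(ready):5.1f} ms")
```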

As of this spring/summer's trade shows, they have now released the 'holy grail': 144 Hz @ 4K G-Sync HDR displays, which give you the 'speed' and the tear-free, lag-free processing along with the pretty, high-resolution image (https://www.geforce.com/whats-new/articles/nvidia-g-sync-hdr-35-inch-curved-monitor).

So why do the monitors cost so much more? In summary: building a THIN, LIGHTWEIGHT LCD display that supports up to 4K resolution, with a 1-3 ms response time, that synchronizes to the output device's frame rate, on a large (27"+) panel, costs a lot to manufacture, especially when using in-plane switching (IPS), which is noticeably crisper and more detailed than the old 1980s twisted nematic (TN) LCD technology.
 