BenQ XL2420G G-Sync Monitor Review

I can understand why the display folks tread lightly here. This price point is a minefield and a mish-mash of tech. This monitor might be a bargain at $300.

Otherwise, $500-$600 is Eyefinity/Surround, HD projector, 60-inch plasma TV territory. An Adaptive-Sync/G-Sync-capable DisplayPort monitor is a tough sell these days, given that we are only moments before the flood.

And the market is not in the high end, at least not for long. It's back where us poor schlubs with 'pitiful' $200 gaming cards :lol: need the help.

 

bogda

Distinguished
Aug 26, 2011
12
2
18,515
Finally, a G-Sync monitor with more inputs than a single DP. Now give me 1440p resolution and a price below $500, and I will consider it.
 
Just throwing it out there: saying the brighter the monitor, the better is absolutely far from the truth. If you're in an office environment under fluorescent lighting, then sure, but a gaming computer is probably in your room or a den, and, well... brighter is only better if you never game at night, or have every possible light on while you do.

From all the gripes about how dim the monitor gets - that's actually a HUGE benefit for me. Nobody ever bothers to test the minimum brightness a monitor can achieve, and many monitors, especially gaming monitors, fail miserably there. I'd much rather have less eye strain than picture-perfect colors when I'm gaming at night.
 

blakphoenix

Distinguished
Nov 11, 2010
47
0
18,540
I'm sorry, but there is no way this monitor could be classed as having "professional-level color accuracy". For starters, it doesn't even cover 100% of sRGB, let alone its terrible sub-70% Adobe RGB performance. Is it good enough for games? It may well be. Is it professional-level color? Not even close!
 
Brightness does have one important use: ULMB. Because ULMB pulses the backlight, these monitors get very dim while it's on. Extra brightness headroom helps to counteract this, but I do agree that testing the minimum would be useful, as some people have to play at nearly the lowest levels due to headaches.
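To put rough numbers on that (hypothetical figures, not measurements from this monitor), perceived brightness under a strobed backlight scales with the pulse duty cycle. A quick sketch in Python:

```python
# Rough sketch of why a strobed (ULMB-style) backlight looks dim.
# All figures are hypothetical and for illustration only.

def effective_luminance(peak_nits: float, duty_cycle: float) -> float:
    """Perceived brightness is roughly the panel's peak output times
    the fraction of each refresh cycle the backlight is actually on."""
    return peak_nits * duty_cycle

steady = effective_luminance(peak_nits=350.0, duty_cycle=1.0)    # backlight always on
strobed = effective_luminance(peak_nits=350.0, duty_cycle=0.25)  # ~25% pulse width

print(f"steady backlight:  {steady:.0f} nits")   # 350 nits
print(f"ULMB-style strobe: {strobed:.0f} nits")  # ~88 nits
```

So a panel rated around 350 nits steady can land well under 100 nits with strobing enabled, which is why the extra brightness headroom matters.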
 
Wow, very sweet for a monitor. I would strongly consider this monitor if I could afford it, since I use my computer for both work and play.

However, for that price I'm disappointed it's only 1080p. I hope they come out with a FreeSync 1440p version that's cheaper than this monitor.
 

uglyduckling81

Distinguished
Feb 24, 2011
719
0
19,060
Why even bother having an option to turn G-Sync and the refresh rate down? Who is spending over $500 on a 24" monitor and then turning all the expensive features off?
It'd be like buying a Ferrari with a button that makes it drive and run like an old Datsun Sunny.
 

mabrams

Distinguished
Nov 27, 2010
4
0
18,520
This is nothing but Nvidia trying to wring every penny of profit out of G-Sync.

Not really understanding this. Are you saying that Nvidia was responsible for BenQ making this monitor? If that were the case, why would BenQ even bother adding a second scaler so you could use sources other than Nvidia video cards?

About the XL2420G itself: I owned one for about a week before returning it due to some dead pixels. Other than that, it was great - easily the most color-accurate TN panel I've ever seen. Motion handling was excellent, with very quick pixel response times and low motion blur, and it was a great monitor overall even without G-Sync. With G-Sync, I'd say it's worth the $550 I spent.

Regardless of the politics behind why Nvidia made G-Sync or why they built their own custom scaler for it... once you see it - once you play games with a dynamic refresh rate locked to the GPU's frame rate, without the noticeable input lag and microstutter of v-sync - you realize it's worth the price premium. It sounds subtle on paper, but the fluidity and immersion it brings are anything but.

In a world where we pay hundreds of dollars for incremental improvements in framerates, complaining about G-Sync seems like missing the forest for the trees, IMO. It does everything it was advertised to do.
 

Eggz

Distinguished


Corporations can actually get in trouble for NOT seeking to maximize shareholder value, whether through profit or otherwise. mabrams' point is also a good one, but I do think that Nvidia was probably happy to land the BenQ contract. That said, increasing shareholder value and customer satisfaction aren't mutually exclusive. In fact, customer satisfaction often increases shareholder value.



G-Sync is pretty kick-ass, and it's also pushing VESA's Adaptive-Sync technology to market faster than it would otherwise arrive. People complain that it's proprietary, but I think a proprietary solution is good when it accelerates an otherwise slow adoption of useful technology. Think of DisplayPort 1.2: it's already getting phased out, but it has been so slow to make its way into the market that most people still don't have it. The most used display connector continues to be HDMI. Why? Well, in large part because it's a proprietary solution - just another example of how certain proprietary tech can work in people's favor.

The corporate marketing funds spent on implementing proprietary tech get the word out, which spurs adoption, increases production, and drives down prices. Maybe I'll change my mind about G-Sync once VESA's Adaptive-Sync becomes a standard feature in monitors, but for now, Nvidia is justified in charging its price premium.
 

LoneGun

Distinguished
Apr 16, 2013
25
0
18,540
Unless you suffer from screen tearing, G-Sync/FreeSync is just a gimmick - spend the money on a faster GPU and dump SLI. The only time I've ever seen tearing was on SLI systems, of which I've owned about three. Proper 4K single-GPU solutions simply can't produce the FPS required for G-Sync to be worthwhile. This was released too soon; 4K monitors are being released too soon; current CPU/GPU configurations can't keep up without TWO or more GPUs, and of course THAT generates a market for G-Sync.
 


...I'm not sure you entirely understand what asynchronous driving of monitors does. Yes, part of it is that it removes screen tearing (which, by the way, is very noticeable on single-GPU setups too; my 670 has noticeable tearing because I absolutely hate v-sync).

However, the other part of what it does is allow a significantly lower framerate to feel just as smooth as a higher one does today. That means a single-GPU setup that can only pull about 45 FPS is going to feel like what we currently get at ~70 FPS. In other words, G-Sync and Adaptive-Sync actually work best when framerates are below the monitor's maximum refresh rate - above that, they do nothing extra for tearing and fluidity.
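To make the pacing difference concrete, here's a minimal simulation (hypothetical frame times, not captured data) of a steady 45 FPS renderer on a 60 Hz panel, with and without adaptive sync:

```python
# Minimal sketch of frame pacing: fixed-refresh v-sync vs. adaptive sync.
# Frame times are hypothetical: a steady ~45 fps renderer (22.2 ms/frame)
# on a 60 Hz panel. Simplified: ignores back-pressure from a full swap chain.
import math

REFRESH_INTERVAL_MS = 1000.0 / 60.0  # one 60 Hz scanout slot = ~16.67 ms
render_times_ms = [22.2] * 8         # GPU finishes a frame every ~22.2 ms

def present_vsync(render_times):
    """With v-sync, a finished frame waits for the next fixed refresh,
    so presentation snaps to multiples of the refresh interval."""
    t, out = 0.0, []
    for rt in render_times:
        t += rt
        out.append(math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS)
    return out

def present_adaptive(render_times):
    """With G-Sync/Adaptive-Sync, the monitor scans out as soon as the
    frame is ready, so presentation tracks render completion exactly."""
    t, out = 0.0, []
    for rt in render_times:
        t += rt
        out.append(t)
    return out

def gaps(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("v-sync gaps (ms):  ", gaps(present_vsync(render_times_ms)))
# -> alternating 16.7 and 33.3 (visible judder)
print("adaptive gaps (ms):", gaps(present_adaptive(render_times_ms)))
# -> uniform 22.2 (smooth delivery at the same average fps)
```

Same average framerate in both cases, but v-sync delivers frames in an uneven 16.7/33.3 ms pattern while adaptive sync delivers them at a uniform 22.2 ms, which is exactly why 45 FPS can feel so much smoother with it.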

I'm not exactly sure who you think is responsible for releasing 4K monitors and G-Sync "too soon" - that's kind of a silly thing to be upset over. Creating scenarios where the technology can't keep up is the only situation in which we, as consumers, are going to see solid improvements in graphics technology.
 