Acer Predator XB271HK 27-inch UHD G-Sync Monitor Review

Status
Not open for further replies.

chumly

Distinguished
Nov 2, 2010
647
0
19,010
17
Bummer about the response times. The other Predators (the X34, Z35, and the smaller 16:9 XB270 variants) were all on point, with less than 15ms absolute input lag.

The world is still not ready for 4k (but getting closer). 2 more years, 2 more years.
 

chumly

Distinguished
Nov 2, 2010
647
0
19,010
17
Actually, now that I'm comparing Tom's review to another review of the same monitor, the response time numbers are not adding up. Lag of 16-32ms is one to two frames of lag at 60Hz; TFT Central calls this Class 2, with Class 1 (less than 16ms) being optimal for gaming. They measured this exact same monitor at only 4ms between signal processing and pixel response (much, much faster). What's up? Your response time numbers seem to be a lot higher than they should be.
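For reference, the frames-of-lag arithmetic above can be sketched with a quick back-of-the-envelope conversion (my own illustration, not TFT Central's methodology): at 60Hz each frame lasts 1000/60 ≈ 16.7ms, so 16-32ms of lag is roughly one to two frames.

```python
def lag_in_frames(lag_ms: float, refresh_hz: float) -> float:
    """Convert absolute input lag (ms) into frames at a given refresh rate."""
    frame_time_ms = 1000.0 / refresh_hz  # one frame at 60Hz is ~16.7ms
    return lag_ms / frame_time_ms

print(round(lag_in_frames(16, 60), 2))  # ~0.96 frames
print(round(lag_in_frames(32, 60), 2))  # ~1.92 frames
```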
 

picture_perfect

Distinguished
Apr 7, 2003
278
0
18,780
0
We think motion quality is more important than resolution. When you move up to 75Hz and beyond, things like motion blur fade into the background. Those kinds of artifacts no longer distract from gameplay. With adaptive refresh, tearing is a thing of the past at any resolution, but we’d still rather have those high framerates. So do those extra pixels make up for this? We’d have to say no at this point.
YES. This is an opinion gamers need to know before buying a 4K monitor and one that has been missing from your previous reviews. KUDOS for finally dishing out some common sense. These resolutions are too high.
 

eklipz330

Distinguished
Jul 7, 2008
2,992
0
20,790
1

That prediction falls in line with mine... but for me, that means it's an excellent time to buy a "stop gap" monitor now.

3440x1440, 120Hz, OLED HDR 34" monitors with low latency should be a thing in 2-3 years, but until we have the hardware to drive that resolution, it makes no sense to wait. I think a 35" 2560x1080 144Hz VA panel is amazing right now (for gaming).
 

Sam Hain

Honorable
Apr 21, 2013
366
0
10,960
60
For me (and this is just my opinion and gaming needs), I cannot justify 4K 60Hz, regardless of price-point, monitor size, response times, manufacturer, etc. at the moment...

Perhaps 4K w/GSync, hitting 100Hz and we have a winner... But of course, then comes that killer price tag.
 

ubercake

Splendid
Moderator
I had this monitor for a day and returned it because of the backlight bleed. It was equivalent to the poor viewing angles you see on a TN monitor: the bottom-right corner was very bleached out unless you physically moved your head to center on it.

I then picked up a PG279Q which has less backlight bleed.

I found both monitors to have great performance, but found the backlight bleed on both to be distracting.

I really don't think either Asus or Acer is where they need to be when they charge $800 for a 1440p IPS monitor. These panels may perform well, but from a backlight bleed standpoint they are not great IPS panels. They should be priced around $500 given how low-end these IPS panels are in that respect. You know what I mean if you've used a good IPS panel.

Hopefully, they'll stop ripping people off one day.
 

Sam Hain

Honorable
Apr 21, 2013
366
0
10,960
60
In your experience, would Asus build quality stand as being "better", or does another manufacturer in this realm of gaming-spec monitors stand out in your opinion? I'm not in the market yet... BUT am getting close. Thanks!

 

AlistairAB

Honorable
May 21, 2014
216
54
10,760
0
At BestBuy.ca this monitor is $1,288 after tax in Canada and the LG 27UD68 is $616. Swap G-Sync for FreeSync and save over 600 dollars, and you get the same quality panel (or is the LG a higher quality one?). I wish I could spread the gospel of the LG monitor faster, as hardly any sites are mentioning that the first inexpensive FreeSync 4K monitor has arrived, making the price points of all the older ones obsolete.
 
LG27UD68->
It's useless for FreeSync though. It has the very problem mentioned in the article: it does NOT support FreeSync below 40Hz.

AMD is really doing themselves a disservice by not enforcing the 2.5x minimum async range needed to ensure FreeSync works properly. For example, 30Hz to 75Hz is okay, because the AMD driver can resend the GPU frame to keep you in asynchronous mode, but 30Hz/40Hz to 60Hz is not okay.

G-Sync does not have this issue on any monitor at all. Even on laptops, where they don't need a G-Sync module (because they know the panel specs), they still ensured a 30Hz to 75Hz range so their driver equivalent of AMD's method works properly.
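The frame-resending trick described above (AMD calls it Low Framerate Compensation) only works when the monitor's maximum refresh rate is a large enough multiple of its minimum. A minimal sketch of the ratio check, using the 2.5x factor quoted in the post and a hypothetical helper name:

```python
def supports_frame_doubling(min_hz: float, max_hz: float,
                            required_ratio: float = 2.5) -> bool:
    """Check whether a VRR range is wide enough for the driver to resend
    (double) frames when the game's frame rate drops below min_hz."""
    return max_hz / min_hz >= required_ratio

print(supports_frame_doubling(30, 75))  # True: 75/30 = 2.5, wide enough
print(supports_frame_doubling(40, 60))  # False: 60/40 = 1.5, too narrow
```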
 

BlueRaidervol

Commendable
May 10, 2016
4
0
1,510
0
I've had one of these monitors for a couple of months now and I'll have to say that I've been extremely impressed. The thing is built like a freaking tank and I've not had any problems with backlight bleed (thankfully). I'm using an EVGA 980 Ti SC+ with a 5820K and I haven't had any problems with low framerates in the games that I play. I was debating between the XB271HK @ 4K and the 1440p @ 144Hz Predator and ultimately decided I wanted to go with 4K because everything is headed there. It's a wonderful monitor for both gaming and working (I use Word a LOT).
 

AlistairAB

Honorable
May 21, 2014
216
54
10,760
0
Photonboy:

LG27UD68->
It's useless for Freesync though. It has the very problem mentioned in the article as it does NOT support Freesync below 40Hz.

-------------

Read what you wrote again and see if it makes sense. So the Acer monitor works better with G-Sync when frame rates drop below 40fps? I never want to play a game below 40fps. I set my settings to average 45-60 and I am perfectly fine. I could get two LG monitors for the price of one of these Acer monitors, or I can pay for a feature I'll never use?

It's important to educate the consumer on the differences as Tom's has done, but also point out when the differences are meaningless for 99% of people. I'd prefer that the LG monitor supported 30-60 of course, but it is not essential. Most people will set their average as close to 60 fps as possible, and the adaptive sync will remove the judders that v-sync would have caused when the frame dips slightly below that.

That doesn't sound like the definition of "useless".
 

AlistairAB

Honorable
May 21, 2014
216
54
10,760
0
It's important to point out that FreeSync really is a free feature, unlike the $672 CAD extra the Acer charges for G-Sync and a better stand. You can't buy a standard 4K IPS monitor for less than the LG one anyway.
 

razorwindmo46

Distinguished
Dec 22, 2007
31
2
18,545
1
Bought this over 6 months ago and it blows my 27" 1080p away. Unless your brain can compute lag, you won't even notice it. I picked this monitor over 4K monitors. Absolutely stunning performance. Running all newer games at 2560x1440 @ 144Hz.
 

razorwindmo46

Distinguished
Dec 22, 2007
31
2
18,545
1
Oops, sorry for my first post. I meant to say that I don't have the monitor in this review; I have the Acer Predator XB271HU 27" WQHD IPS G-Sync monitor.
 

ubercake

Splendid
Moderator

I kept the Asus PG279Q. It had less bleed, but it was still bad compared to non-G-Sync IPS or PLS monitors I've used for work. Both monitors have great gaming performance when the colors are constantly changing around the screen, but the Acer had a silver cast to the bleed (almost mirror-like) that was distracting, especially in dark scenes.

Both of the monitors have the heaviest backlight bleed in the bottom-right corner, with the 2nd most bleed occurring in the top right corner. The top and bottom left also have it, but it's negligible compared to the right side.

Think of it this way... An IPS screen is supposed to offer better viewing angles, better colors and a better picture overall. With the PG279Q and the XB271HK (both IPS), I can honestly tell you the only thing better about them than the PG278Q (TN) monitor I had, when viewing head-on while gaming (aside from the resolution of the HK), is the contrast. Viewing angles become an issue on these IPS monitors because of the backlight bleed you see while viewing head-on, so that's a wash. At least with the TN monitor, I didn't have any color variation until I moved away from center.

Realistically, these monitors are the best in their class because there aren't many others in their class (IPS + G-sync and 4K or IPS + G-sync and 1440p).

At any rate, the backlight bleed is definitely not what should be the norm in an $800 monitor. I'd say wait for gen 2 of these monitors next year or the year after.
 

TheJake

Commendable
Jun 26, 2016
1
0
1,510
0
"an excellent new G-Sync monitor has just appeared from Acer"

The XB271HK has been out for half a year. It was available in December of 2015, which is when I picked one up.

Bottom Line - It's one of the best 4K G-Sync monitors you can get folks. It is plenty fast if you have the right hardware to back it.

I'm going to dig a little deeper here about the industry and how this monitor fits into it. I'll even show you how to game at 8K with this monitor below. This is gonna be long though so I warned you ahead of time. :)

I've been gaming for over 30 years and the industry has changed a lot since then. Right now we have two different camps. I categorize them as one side being about visual "quality" and the other about visual "quantity". For many years we had the introduction of 3dfx, ATI, Matrox and later the old Riva card that changed it all for team green. As the cards continued to become more advanced, we gamers were all about the next graphical enhancements. We paid top dollar for a graphics card that would push the boundaries visually. "I want to see a game as real as a Pixar movie!" That has been how the market grew.

Today, however, monitor types like TN, which can push refresh rates very high, have exploded, and a shift from quality to speed has arisen. I get the 120+ Hz fans. When you see the blades on a zeppelin spinning at a much higher rate than at 60Hz, it's one of those "once you see it you cannot go back" moments. Raise of hands? Yeah, you know what I'm saying. But the same can be said for 4K if you have the right hardware to push it (Titan X, single or SLI). If you are playing the new Doom on a console, for example, at 1080p, you could have teleported from a blue stone teleporter pad into a new room. You go up the curved staircase and look down at the blue stone teleporter pad. It looks like a plain blue stone. Nothing special. At 4K you look down and that stone is loaded with demonic symbols etched deep into it. That's 4K. It brings out the finer details of objects and their environments with a high level of clarity.

If you are running at top speed and you are solely about running and gunning, 4K is not for you; it's all about speed, and mostly about first-person shooters. But do you play other games where you pause to enjoy what the artists have created? Do you soak in, graphically, how much is in front of your eyes? Then 4K is for you. I want to enjoy my shooters, but I also want all the details maxed out to see what the high-end cards can really push. I want to enjoy all the enhancements of The Division maxed at 4K when I'm walking the debris-laden streets, or enjoy the glow and halo of Christmas lights strung throughout a square at night as steam pours out from the gutter, dragging me back to the reality that this is a zone I'd better be wary of. I want to walk into The Vanishing of Ethan Carter and step along the train tracks, watching the small dust and debris particles blowing over those tracks as they fall into the water below, or get up close to the blood-streaked train to examine the scene in fine detail. I can stop by the wide dam, take in its breadth, and snap a few photos for my desktop later, feeling like Ansel Adams capturing the best picture-perfect photos I've ever taken in my life. That is why 4K is for me. In reality, would I walk through life and not visually take anything in, only focusing my attention on objects or the environment if it were in an arthouse? Yeah, too deep for that one. :)

As the market has shifted, a large number have chosen speed at the cost of reduced quality in games. Personally, why spend 400 to 800 dollars on a new video card when developers can just dumb down the quality so there's no need for a new graphics card? Everyone can play at Counter-Strike-quality graphics for the next 10 years; Counter-Strike is still a widely popular FPS today. This is where it gets messy, because they want good speed, but development has to dumb down those great E3 demos you saw before so that we can see these numbers. I feel in a lot of ways it's like the "tablet era" all over again. The market follows the cash. A lot of people jumped on tablets, and Candy Crush-quality games became the new norm for the mass market, setting us back for a time; developers cashed in on the large market share of casual gamers using low-end hardware, and thus created games with low-end graphics. If the PC market continues to shift towards speed, then graphically we are taking a step back in the technology we are trying to advance.

Do we want to continue to deliver speed, or do we want to deliver quality? Will we reach the end of this post? Guess what: NONE OF IT MATTERS! Yeah, I'm Damon Lindelof arriving at the final episode of Lost. Initially, the XB271HK was designed to run at 75Hz, but it didn't work out. Now, with Nvidia's 1080 line, video cards can run 4K at 120Hz and even 5K. Acer will have an exact XB271HK model but at 120Hz. Thus the gap between 4K and 120Hz can close and unify the two camps. Here's the catch: the 1080s still aren't good enough for 4K, as I'm sure many of you have read. Once the Ti and Titan versions come out, we will finally see this become a reality. As of today, though, you can play at 8K if you want to check it out. I'd also like to point out something to take with you for the future of gaming AND home entertainment: beyond 8K you won't see any difference. Unless your eyesight is better than 20/20, your eyes will not be able to differentiate the quality at 16K. Don't get ripped off later by 8K+ televisions or monitors; save your money. Anyway, if you want to check out 8K, you need this monitor and a high-end card, along with Battlefield Hardline. You can double the current "4K" resolution through its video settings and play at "8K". It runs slow, but at the beginning you can plow through the door and get up close to the guy you are about to arrest, close enough to see every tiny hole through the weave of yarn on his sweater. That's the detail of 8K! If you keep running it, the game looks amazing, but it is dirt slow. Since this is a DICE product, I believe you can also crank up 8K in Mirror's Edge Catalyst. I haven't tried it yet, but I think you can do that one too. :)
 
Please stop comparing FreeSync and G-Sync as equivalent technologies where one is free and the other isn't. Yes, of course this monitor doesn't do well in motion tests... a) it's a 4K monitor, and no 4K monitor exists that can do high frame rates... b) I have yet to see a cable that can carry the bandwidth.

No, the FreeSync and G-Sync packages are not the same thing. Aside from the technical issue described above, what G-Sync has that FreeSync doesn't is a hardware module that strobes the backlight to provide motion blur reduction (ULMB). As there is a cost associated with building and installing this module, that cost is passed on to the consumer. The same is true of those FreeSync monitors where the manufacturer has added its own strobing solution; here again, we see a price increase.

Low motion blur and 4K are simply mutually exclusive. FreeSync and G-Sync work very well, and in similar fashion, up to about 70 fps. If you can manage better than that, FreeSync is a dead end because it does not have an MBR hardware module, but on a G-Sync monitor you can turn off G-Sync and operate in ULMB mode. Spend 30 minutes playing on a FreeSync monitor and then 30 minutes on a G-Sync monitor with ULMB, and you will see why G-Sync-equipped monitors (and those FreeSync monitors whose manufacturers have added a strobing module themselves) cost more.

Yes, as long as you are stuck at 60Hz, you are going to be saddled with motion blur and slower lag times. 4K resolution is simply "not ready for prime time". What 4K needed was DisplayPort 1.4, which has now arrived on the scene with Nvidia's 10xx series. But even now that we finally have GFX cards that, in SLI, can drive 4K consistently above 60 fps through a DP 1.4 interface, no monitors or cables exist that can deliver the bandwidth necessary for 4K to deliver the goods.
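As a rough sanity check on the bandwidth claim, here's a back-of-the-envelope sketch (my own numbers; blanking overhead is ignored, and the effective link rates are the commonly quoted figures after 8b/10b coding): uncompressed 4K at 120Hz and 8 bits per channel needs about 24 Gbit/s of pixel data, which overflows DisplayPort 1.2 but fits within the HBR3 link of DP 1.3/1.4.

```python
def pixel_data_gbps(width: int, height: int, hz: float,
                    bits_per_pixel: int = 24) -> float:
    """Raw active-pixel data rate in Gbit/s (ignores blanking overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

rate_4k_120 = pixel_data_gbps(3840, 2160, 120)  # ~23.9 Gbit/s
DP12_EFFECTIVE = 17.28  # Gbit/s, HBR2 after 8b/10b coding
DP14_EFFECTIVE = 25.92  # Gbit/s, HBR3 after 8b/10b coding

print(round(rate_4k_120, 1))             # ~23.9
print(rate_4k_120 <= DP12_EFFECTIVE)     # False: DP 1.2 can't carry it
print(rate_4k_120 <= DP14_EFFECTIVE)     # True: DP 1.3/1.4 can
```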

One would expect that with DP 1.4 cards now in use, 4K monitors with DP 1.4 will arrive right on their heels. When they do, every 4K panel out there today will be obsolete. Getting 4K "for the future" is a bad idea.

It was mentioned that "backlight bleed" led to this monitor being returned. That's called "IPS glow", an inherent feature of **all** IPS panels.
 
G

Guest

Guest
Are there any HDR monitors out yet? I was ready to buy a 10-bit 4K monitor, then learned it wasn't HDR. Supposedly 10-bit HDR 1080p is better than 10-bit sRGB 4K, and I believe it.
 

zthomas

Honorable
Sep 13, 2013
194
0
10,690
3
God, my Acer XB270H replacement has arrived.. in a 27" flat screen.. the Predator had me drooling for the 34" curved screen.. but now the 27" has hit the sweet spot.. the price is a bit high.. backlight bleed, what backlight bleed? One thing, ya gotta fool with the white and tone it way down.. I never play in the 3D mode.. I got the right glasses, but you know..

I got mine in December '15 for around 500.00 US.. the 270 has since jumped up another 200 bucks.. and now the Predator is 800+. I say buy now, why wait..
 

Sam Hain

Honorable
Apr 21, 2013
366
0
10,960
60
Appreciate your input(s), thanks!

 

ubercake

Splendid
Moderator


Thanks for clearing up that terminology for me. The similar (but different) BLB would apply to a TN monitor, eh?

These IPS gaming monitors still exhibit a lot more of the "IPS glow" than their non-gaming IPS counterparts.

Additionally, if anyone wants to get an idea of the "IPS glow" I experienced (also apparently referred to incorrectly as backlight bleed in another thread), check out the image I posted of the HU I had before the HK, which was similar:
http://www.tomshardware.com/forum/id-2982820/backlight-bleed-big-deal.html#18200370

I have a 1080p LG IPS television that ran me $600 and has none of the issues you see in the four corners of the monitor in the photo in the thread I linked.
 