Acer XB280HK 28-inch G-Sync Ultra HD Gaming Monitor Review


Frozen Fractal

Distinguished
May 17, 2015
At first glance I thought I wouldn't have to see the "we will be reviewing the XB270HU soon" quote again. But then I realized this one is the XB280HK. Oh well, guess I'll have to endure that quote a few more times :rolleyes:

Contrast ratio, brightness, chromaticity and gamma tracking are where the XB280HK loses ground, but to be fair, most gamers won't notice much difference at all. It is kind of disappointing, though, to see Planar do better in these areas than Acer while using the same panel. I don't know, maybe the overdrive somehow worsens the results?

But of course, it does well on uniformity and response time. Makes me wonder why the XB280HK doesn't have ULMB if it's supposed to be a bundled feature with G-Sync. That would've helped 60Hz panels more than 144Hz ones.

But anyway, the XB280HK looks promising, although 4K isn't what I'd prefer for gaming plus everyday use (for gaming only, I would).
 

Frozen Fractal

Distinguished
May 17, 2015


It's 1.2a, I presume, since that's the only thing besides HDMI 2.0 capable of 4K@60Hz.
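
Back-of-the-envelope in Python (my own approximate figures, assuming 24-bit color and ignoring blanking overhead) to show why DP 1.2/1.2a covers 4K@60Hz:

    # Pixel data rate for 3840x2160 at 60Hz, 24 bits per pixel
    width, height, refresh, bpp = 3840, 2160, 60, 24
    pixel_data_gbps = width * height * refresh * bpp / 1e9   # ~11.9 Gbit/s

    # DP 1.2/1.2a (HBR2): 4 lanes x 5.4 Gbit/s raw; 8b/10b encoding leaves 80%
    dp12_usable_gbps = 4 * 5.4 * 0.8                         # 17.28 Gbit/s

    print(f"4K@60 needs ~{pixel_data_gbps:.1f} Gbit/s, DP 1.2 offers {dp12_usable_gbps:.2f}")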
 

picture_perfect

Distinguished
Apr 7, 2003
Why do they keep pushing 4K for gaming? True gamers have always regarded fps as king, and 4K pushes four times the pixels of 1080p, so you get roughly one-quarter the frame rate. Gamers don't need expensive 4K monitors driven by expensive cards at ever-lower frame rates (via G-Sync). This is chasing the proverbial tail and counterproductive. Regular 1080p, v-synced at a constant 144 fps, would be better than all that stuff.
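
For reference, the arithmetic behind the one-quarter figure (a rough sketch; the 4x pixel count is exact, but frame rate only scales this cleanly when the GPU is purely pixel-bound):

    # 4K pushes exactly four times the pixels of 1080p
    pixels_1080p = 1920 * 1080        # 2,073,600
    pixels_4k = 3840 * 2160           # 8,294,400
    ratio = pixels_4k / pixels_1080p  # 4.0

    # A purely pixel-bound GPU would see frame rate drop by the same factor
    fps_1080p = 144
    print(fps_1080p / ratio)          # 36.0 fps at 4K for the same workload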
 

spagalicious

Distinguished
Why do they keep pushing 4K for gaming? True gamers have always regarded fps as king, and 4K pushes four times the pixels of 1080p, so you get roughly one-quarter the frame rate. Gamers don't need expensive 4K monitors driven by expensive cards at ever-lower frame rates (via G-Sync). This is chasing the proverbial tail and counterproductive. Regular 1080p, v-synced at a constant 144 fps, would be better than all that stuff.

*Competitive gamers.
People who like to play games also like to play them at Ultra HD resolutions.
 


ULMB uses flickering (backlight strobing) to lower persistence, which reduces motion blur. If you've ever used a 60Hz CRT monitor, you'll know that 60Hz flicker is painful on the eyes. This is why ULMB mode is not offered on 60Hz monitors, and likely won't be offered on anything below 75-85Hz.
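
To put rough numbers on that (illustrative only, assuming an ideal sample-and-hold panel):

    # On a sample-and-hold LCD each frame stays lit for the full frame time,
    # and that hold time is what the eye perceives as motion blur.
    # ULMB strobes the backlight once per refresh, cutting persistence to the
    # strobe pulse width -- but the strobe itself flickers at the refresh rate.
    for hz in (60, 85, 100, 120, 144):
        print(f"{hz:>3} Hz: ~{1000 / hz:.1f} ms hold-type persistence per frame")
    # A 60Hz strobe flickers as visibly as a 60Hz CRT; at 85Hz and up it is
    # far less objectionable, which is why ULMB targets the higher rates.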

 


Top-end GPUs can handle 4K just fine. You just don't play at max settings. Which is better: medium-to-high settings at 4K, or maxed out at 1080p? That is a subjective question, and will vary from person to person.

That said, I prefer refresh rates higher than 60Hz, so I'll be going 1440p before 4K.
 

picture_perfect

Distinguished
Apr 7, 2003
Top-end GPUs can handle 4K just fine

Not really. It depends on what you want. Even a $1k card will stutter half the time (drop below 60 fps) in current single-player games. In multiplayer, where fps matter most, the rule of thumb says expect half the fps, so forget it (for me). G-Sync/FreeSync are great for coping with these crappy, laggy frame rates, but not when used to promote said crappy, laggy frame rates, which is what's happening with 4K. PC-friendly resolutions (with readable icons/text and higher fps) still have benefits right now. You might get them with a 4K setup by dropping some eye candy, scaling the resolution/text, etc., but then why spend the money in the first place?
 


You clearly didn't read my post, which was only four sentences long.

All you have to do is play with reduced settings, or not play the latest AAA games. I don't understand why people ignore that PC games have graphical settings.
 
I have to agree that 4K is not quite ready for prime time and won't be, even on 2-way SLI / CF rigs, until, my guess..... around Xmas 2016. ULMB is kinda worthless at < 60 fps..... G-Sync does the job well up to 60-70 fps, but at that point you'd want to switch to ULMB. Putting it on a 60Hz monitor would serve no benefit. Witcher 3 on Ultra gets 60-80 fps with twin 970s at 1440p, which leaves you on the border between ULMB and G-Sync. Either works well, but I didn't get to play enough when visiting my son (Acer Predator XB270HU) to form an opinion as to which was better.

In order to deliver 4K at 144Hz, you'd need more than DisplayPort can currently offer, so even if today's cards could produce it, we don't have a cable technology to deliver it. Conversely, I don't see monitor manufacturers looking to incorporate DP 1.3 until we have graphics cards that can routinely deliver more than 60 fps at 4K.
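
Rough link-budget math behind that (my own ballpark figures, 24-bit color, blanking overhead ignored):

    # Pixel data rate for 4K at a given refresh, 24 bits per pixel
    def four_k_gbps(refresh_hz):
        return 3840 * 2160 * refresh_hz * 24 / 1e9

    # Usable rates after 8b/10b encoding, 4 lanes each
    dp12 = 4 * 5.4 * 0.8   # 17.28 Gbit/s (HBR2)
    dp13 = 4 * 8.1 * 0.8   # 25.92 Gbit/s (HBR3)

    print(f"4K@60:  ~{four_k_gbps(60):.1f} Gbit/s (fits DP 1.2's {dp12:.2f})")
    print(f"4K@120: ~{four_k_gbps(120):.1f} Gbit/s (needs DP 1.3's {dp13:.2f})")
    print(f"4K@144: ~{four_k_gbps(144):.1f} Gbit/s (over even DP 1.3 at full color)")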



 

soldier44

Honorable
May 30, 2013
Yet 28 inches is too small for 4K. People keep blabbing about 144Hz this and that, but not everyone wants to play games that look like soap operas. Some of us prefer larger, higher-res displays. I'm currently looking at 40-43 inches at 4K with two 980 Kingpins to push it. I couldn't give a rip about 144Hz; those displays are just too small.
 
The TN panel is a problem as I see it. It's 60Hz, so why not at least a VA panel? I recently went from a TN panel to a VA panel and would never go back to TN. And those saying 4K is too much, or requires expensive graphics cards, are a bit wrong. Benchmarks at 4K are often run at Ultra detail presets, which aren't necessary at 4K, especially not AA. And if a game doesn't work well at 4K, who cares; it scales perfectly to 1080p, so you just run the game at 1080p.
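
The "scales perfectly" point is just integer math (a trivial check; actual scaler behavior varies by monitor):

    # Each 1080p pixel maps to an exact 2x2 block of 4K pixels, so in
    # principle the panel can upscale with no fractional-scaling blur.
    sx, sy = 3840 / 1920, 2160 / 1080
    print(sx, sy)   # 2.0 2.0 -> clean integer scale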
 

picture_perfect

Distinguished
Apr 7, 2003
278
0
18,780
You clearly didn't read my post, which was only four sentences long.

All you have to do is play with reduced settings, or not play the latest AAA games. I don't understand why people ignore that PC games have graphical settings.

I read it. If you had read mine, the last sentence sums up the irony in your argument. The trade-offs with 4K are not for me (or competitive gaming), but weigh them according to your needs.
 

It is a subjective choice, but you do realize that playing at a lower resolution to get the FPS you want is no different from lowering a few settings. Both affect image quality, and let's be honest here, the difference between high and ultra on most games these days is hardly noticeable.

I mostly objected to your claim that 4K means you are going to have a stutterfest, not that your choice isn't a reasonable one. 4K is not a stutterfest if you just lower a few settings in the few games that need it. Which is another point: most games are not the ones you see in the benchmarks. Those are just the most demanding ones, which are great for testing the limits of a GPU. The average game doesn't require nearly as much GPU power.
 


1. There's no such thing as a 4K 144Hz display....

2. I am not aware of a single 4K monitor over 32".

3. A total of 7 people in 10,000 game at 4K (0.07%).




Generalizations about panel types cannot be made w/o considering which panel it is and what's in the monitor besides the panel. The Acer Predator is an IPS panel that is gorgeous in gaming, but w/o G-Sync, ULMB and 144Hz it would not be the same monitor..... When we say IPS, the panel in the Predator is a far cry from the ghost-heavy $300-ish panels. Just because the Predator is great doesn't make every IPS monitor great.

And when we say TN, for example, are we talking the 6-bit color found in most TN panels or the 8-bit in the Asus Swift?

As for using a 4K monitor at 1080p..... if you're gonna be playing at 1080p, why not just get 1080p? Kinda like buying a Ferrari to commute to the city from the suburbs.
 
This review compares G-Sync monitors of many resolutions. The monitor left out of the comparisons in this article, even though Tom's already has measurements and a review for it, is the Asus ROG Swift PG278Q.

Just about every aspect of the Asus ROG Swift measures better than this monitor (other than resolution), yet it earned no accolade from Tom's (e.g. "Recommended").

Why no love for the one that started it all?
 
The Swift was **THE** gaming monitor until the Predator arrived, well, the G-Sync version anyway. While the benefits of IPS almost always include more accurate color, little was ever said about the fact that IPS panels generally had at least 8-bit color while the typical TN had 6. The Swift had 8-bit, so that erased much of the difference right there. The Predator was the first IPS panel I ever recommended for gaming and, from what I have seen since, there's little that competes with it other than the Asus MG279Q (no ULMB).

 
There's no "perfect" gaming monitor yet, but IMO the best gaming monitor is the Acer Predator.

1440p, 27", 4ms, IPS, GSYNC

It's not 4K, but it can go over 60Hz, which is better, and you won't run into issues such as hitting the 60FPS ceiling (guess what happens then...).

IPS can cause ghosting, but the 4ms response time minimizes that. I'll take a small amount of ghosting for the benefits of IPS.

Adding in light strobing at the SAME TIME as GSYNC is the next step...

*4K gaming at 60Hz might be okay if you can lock to 60FPS while still using GSYNC, but I'm fairly sure that's not easy to do.
 

deuce_23

Distinguished
Nov 18, 2009


As Tim Allen said, more power! Ruff ruff.
You can never have too much power.
 

deuce_23

Distinguished
Nov 18, 2009


Is there any idea when this perfect monitor might come out? I want to start putting money away.
How far away is 4K at 120Hz, or is it the DisplayPort connector holding it back?
Once Pascal hits next year, it will be 4K gaming for me.
 


Going from 60Hz to 120Hz at 4K is going to be problematic. Unless you're putting in multiple panels and joining them, which brings its own issues for quality and for working with GSYNC, I think you'll be waiting a while.

I have a 1440p IPS display, and while 4K in some games might be slightly sharper, it's definitely not worth the cons associated with it.

Seriously, I'm in Civ5 looking at tiny, sharp text at 1440p. Spending more on a GPU to run 4K, or making other sacrifices like refresh rate, or turning down the quality to reach a high enough refresh, makes no sense to me.

I recommend you get the best panel for gaming, and that's likely to remain a 1440p, IPS, low-response-time, GSYNC monitor. Take the Acer Predator, add light strobing or some other method to reduce blur a bit more, and I think that's likely to be the ideal monitor for the next two years or so.
 