Acer Predator XB321HK 32-inch Ultra HD G-Sync Monitor Review


Bartendalot

Distinguished
Apr 18, 2010
The note about 4K@60Hz becoming obsolete soon is a valid one, and it makes the purchase price even harder to swallow.

I'd argue that my 1440p@144Hz monitor is a more future-proof investment.
 

Yaisuah

Commendable
Jul 28, 2016
I have this monitor and think it's great and almost worth the money, but I want to point out that nobody on the internet seems to realize there's a perfect resolution between 1440p and 4K that looks great and runs great; I think it would be considered 3K. Try adding 2880x1620 to your resolutions and see how it looks on any 4K monitor. I run Windows and most less-intensive games at this resolution and consistently get 60 FPS with a 970 (around 30 FPS at 4K). You also don't have to mess with Windows scaling on a 32-inch monitor. After seeing how great 3K looks and runs, I really don't know why everyone immediately jumped to 4K.
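For a rough sense of where 2880x1620 lands, here's a quick pixel-count comparison (a minimal sketch, not from the original post; GPU load doesn't scale perfectly linearly with pixel count, but it's a reasonable first approximation):

```python
# Pixel counts of the three resolutions being compared. "3K" (2880x1620) adds
# about 27% over 1440p, while full 4K more than doubles the 1440p workload,
# which roughly lines up with the poster's ~60 FPS at 3K vs. ~30 FPS at 4K.
resolutions = {
    "1440p (2560x1440)": (2560, 1440),
    "3K    (2880x1620)": (2880, 1620),
    "4K    (3840x2160)": (3840, 2160),
}

base = 2560 * 1440
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x the 1440p load)")
```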
 

mellis

Distinguished
Jun 17, 2011
I am still going to wait before getting a 4K monitor, since there is still no practical solution for 4K gaming. In a couple more years, hopefully, 4K monitors will be cheap and midrange GPUs will be able to support gaming on them. I think trying to invest in 4K gaming now is a waste of money. Sticking with 1080p for now.
 
I would hazard a few guesses. The first would be that 2880x1620 is so close to 2560x1440 that no manufacturer wants to complicate its product lines like that.

Second, 2160 is the least common multiple of 720 and 1080, meaning it's the lowest resolution that is a perfect integer multiple of both. So with proper upscaling, a 720p or 1080p source picture can be displayed reasonably well on a 4K display. These panels are made for TVs as well as computer monitors, and the majority of TV signals (at least in the US) are still in either 720p or 1080p.

Upscaling 1080 to 1620 is the same as upscaling 720 to 1080 (both are a factor of 150%). Upscaling by non-integer factors means you need a lot of pixel interpolation and anti-aliasing, which looks very fuzzy to me (I bought a 720p TV over a 1080p TV years ago because playing 720p PS3 games and 720p cable TV on a 1080p display looked horrible to me). So it may be that the powers that be settled on the 4K resolution so that people could adopt the new panels and still get decent picture quality with older video sources (at least until, or if, those get upgraded). If so, I can agree with that.
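The integer-scaling point is easy to check (a minimal sketch, not part of the original post):

```python
from math import lcm  # Python 3.9+

# 2160 is the least common multiple of 720 and 1080, so both broadcast formats
# map onto a 4K panel by whole-number factors (3x and 2x respectively), while a
# 1620-line target forces a fractional 1.5x scale and therefore interpolation.
print(lcm(720, 1080))           # 2160
print(2160 / 720, 2160 / 1080)  # 3.0 2.0  -> clean integer scaling
print(1620 / 1080, 1080 / 720)  # 1.5 1.5  -> non-integer, needs interpolation
```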
 

michalt

Distinguished
Jun 20, 2010
I have one and have not regretted my purchase for a second. I tend to keep monitors for a long time (my Dell 30 inch displays have been with me for a decade). When looked at over that time period, it's not that expensive for something I'll be staring at all day every day.
 

gulvastr

Commendable
Jul 29, 2016
RivaTuner has a great global FPS lock option. I set mine to 95 FPS on my X34 and stay in G-Sync at all times.
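The reason a cap slightly below the refresh rate works (a rough illustration, assuming the X34's 100Hz mode): as long as each frame takes longer than one refresh interval, the monitor stays inside its variable-refresh range instead of handing off to V-Sync or tearing.

```python
# Frame-time sketch: a 95 FPS cap on a 100 Hz panel keeps every frame time
# above the 10 ms refresh interval, so G-Sync stays engaged.
refresh_hz, cap_fps = 100, 95
print(f"refresh interval:  {1000 / refresh_hz:.2f} ms")  # 10.00 ms
print(f"capped frame time: {1000 / cap_fps:.2f} ms")     # 10.53 ms
```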
 

Sam Hain

Honorable
Apr 21, 2013
4K at 60Hz for $1,000+ does not "compute" and is not one-size-fits-all; I'm speaking about gaming here. For graphics applications and photo/video editing, it's all yours!

A 1440p 120Hz/144Hz gaming monitor of your particular flavor (IPS, TN, AMVA, flat, curved, single, multiple, etc.; you get the idea, folks) is a more logical and performance-oriented choice for the foreseeable future for gaming, and has been since the first 4K display that debuted as a "gaming" monitor.

1440p gaming monitors traditionally come in at less than 4K "gaming" monitors, deliver BETTER performance, no questions asked, and are less taxing on your GPU(s). For those with higher-end GPUs, crank those settings up to Ultra and Max, ESPECIALLY with the GTX 1080 now in action and the soon-to-debut MONSTER, the Pascal Titan X.

No matter how you slice it or SLI it (your rig)... 4K at 60Hz is still 4K stuck at 60Hz. No thank you.
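One way to frame that trade-off (a back-of-the-envelope sketch, not from the post): the raw pixel throughput of the two targets is in the same ballpark, so the choice is really about spending it on motion versus resolution.

```python
# Pixels per second each target asks the GPU to deliver
# (resolution x refresh only, ignoring everything else).
targets = {
    "2560x1440 @ 144 Hz": (2560, 1440, 144),
    "3840x2160 @  60 Hz": (3840, 2160, 60),
}
for name, (w, h, hz) in targets.items():
    print(f"{name}: {w * h * hz / 1e6:,.0f} Mpixels/s")
# 2560x1440 @ 144 Hz: 531 Mpixels/s
# 3840x2160 @  60 Hz: 498 Mpixels/s
```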
 

Sam Hain

Honorable
Apr 21, 2013

It does not make sense to "downscale" from 4K to any resolution... I see what you are saying, but, like RedJaron stated, it's so close to 1440p that it's negligible.

The optimal way for someone to use DSR is with a lower-resolution monitor going up... otherwise, it's money ill spent on a $1,000 4K monitor. Just my opinion.
 

Yaisuah

Commendable
Jul 28, 2016
1440 and 1620 might seem close, but when you do the calculations, it's actually almost 1 million more pixels on the screen. The difference between 1080 and 1440 is 1.6 million, so I don't see how 1 million could ever be considered negligible. Plus, my point was more about how companies jumped straight from 1080/1440 to 4K as the next goal, when 3K was the logical next step. On this monitor, 3K looks like the native resolution because the pixels are so small, and I really have to get up close to tell the difference between 3K and 4K. That might sound unbelievable, but it's true, and the performance makes it worth the very slight decrease in quality. This monitor has made me realize 4K is very unnecessary right now; just try 3K out on any 4K monitor and you'll see.
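The absolute numbers check out (a quick verification sketch):

```python
# Pixel-count differences quoted above.
p1080 = 1920 * 1080  # 2,073,600
p1440 = 2560 * 1440  # 3,686,400
p1620 = 2880 * 1620  # 4,665,600
print(f"1440p -> 3K:    +{p1620 - p1440:,} pixels")  # +979,200   (~1 million)
print(f"1080p -> 1440p: +{p1440 - p1080:,} pixels")  # +1,612,800 (~1.6 million)
```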
 
I know the math, but I didn't say it was negligible. I said manufacturers probably consider it too small a jump to be worth cluttering their product lines with. Yes, 1 million extra pixels sounds like a lot when 1080p only has a little over 2 million. However, 1440p has 3.7 million, so the raw number of pixels alone doesn't tell the whole story. Look at the increase from a percentage standpoint: 2880x1620 only has about 25% more pixels than 2560x1440. That's not a big difference considering that going from 1080p to 1440p gives you more than 75% more pixels. It's about the same as moving from 1600x900 to 1680x1050. Most significant resolution jumps give you at least 30% more pixels, and that's on the low side.

I get what you're saying: 480, 720, and 1080 were all a factor of 1.5 from each other, so why break the trend now? And I'm not saying the downscaled picture doesn't look good to you. Just remember that some people have sharper eyes than others, and what looks fine to one can be a fuzzy mess to another.
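The percentage framing is just as easy to verify (a quick sketch; the ~25% figure above is a slight rounding down of ~26.6%):

```python
# Relative pixel increases for the resolution jumps discussed above.
def pct_increase(a, b):
    return (b[0] * b[1] / (a[0] * a[1]) - 1) * 100

print(f"2560x1440 -> 2880x1620: +{pct_increase((2560, 1440), (2880, 1620)):.1f}%")  # ~26.6%
print(f"1920x1080 -> 2560x1440: +{pct_increase((1920, 1080), (2560, 1440)):.1f}%")  # ~77.8%
print(f"1600x900  -> 1680x1050: +{pct_increase((1600, 900),  (1680, 1050)):.1f}%")  # ~22.5%
```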
 
I can't see buying a monitor where you pay a premium for the G-Sync hardware module but the panel is too slow to use the ULMB feature that the module provides.

Until monitors arrive with DP 1.4 and panels fast enough to support it, I can't see investing $1,200 for the "long term" when the technology will be obsolete in a matter of months.

And let's please make a very important distinction: while IPS panels do have better viewing angles and color than TN panels, very few of them are fast enough for gaming.
 

picture_perfect

Distinguished
Apr 7, 2003
DisplayPort 1.3 (approved in 2014) supports 3840x2160 signals up to 120Hz. Why we have yet to see monitors and video cards with this spec is anyone’s guess.

But even when graphics cards add DP 1.3, they’ll still need more processing power to enable those higher speeds

I think you answered your own question: graphics power is limited. It's true, it's true. Otherwise people would be barfing much less in their VR headsets, right?
And I don't think 4K 60Hz monitors will become obsolete soon.
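A rough bandwidth check backs up the quoted spec (a sketch that ignores blanking overhead and assumes standard 24-bit color; the payload figures are the usual 4-lane rates after 8b/10b encoding):

```python
# Uncompressed video data rate vs. DisplayPort payload bandwidth.
def gbit_per_s(w, h, hz, bits_per_pixel=24):
    return w * h * hz * bits_per_pixel / 1e9

dp12_payload = 17.28  # Gbit/s (HBR2, 4 lanes)
dp13_payload = 25.92  # Gbit/s (HBR3, 4 lanes, same rate for DP 1.4)

print(f"3840x2160 @  60 Hz: {gbit_per_s(3840, 2160, 60):.1f} Gbit/s")   # ~11.9 -> fits DP 1.2
print(f"3840x2160 @ 120 Hz: {gbit_per_s(3840, 2160, 120):.1f} Gbit/s")  # ~23.9 -> needs DP 1.3+
```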
 
The next thing we'll see will be DP 1.4... the graphics cards are already here. Asus already showed its DP 1.4 monitor at Computex; it should be on the shelves by January at the latest.

http://www.144hzmonitors.com/monitors/asus-computex-2016-27-inch-4k-144hz-gaming-monitor/
 

_MOJO_

Honorable
Jan 30, 2014
I have the ASUS ROG PG279Q - a hefty investment of cash, but I can play my favorite titles at rates up to 165Hz. The new 1070 and 1080 will push those buttery-smooth frames to the maximum with an IPS display, G-Sync, and a solid 4 ms response time. I also use the monitor for art applications in tandem with a Wacom tablet on my workstation. I could not be happier with this versatile monitor, which was almost half the price with twice the frame rate. 4K is adequate for multitasking or graphic design, but it's not synonymous with gaming at the moment, in my opinion. The GPU manufacturers are almost there technologically, but the market only caters to the top-tier financial demographic. A 10-series card and this monitor would set one back around $2,000?! To push 4K to 60 FPS? I'll pass.
 

rauf00

Commendable
Aug 11, 2016
Coming from an X34 (21:9, 3440x1440, G-Sync, IPS), I will get the next Predator ASAP once it offers 100+ Hz at 4K 21:9.
 

_MOJO_

Honorable
Jan 30, 2014


Agreed. I have the ROG Swift, which I use mostly for CS:GO and dogfights in War Thunder. I use my LG 3440x1440 ultrawide for graphic design, video production, and Sid Meier's Civilization V or Total War: Warhammer.

I think the ultimate gaming setup would include a 144Hz - 165Hz ultrawide 3440x1440, personally. I have a feeling we will see it within the next year *fingers crossed*.

4K at that frame rate is a ways off at the moment.
 

maddogcollins

Distinguished
Nov 20, 2011
It's a gaming monitor with 63 ms of lag, and they say that shouldn't be a problem even for skilled players? Paying over $1,000 for 63 ms of lag is a problem for me, a very average player.
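To put that figure in perspective (a quick conversion, taking the review's 63 ms total-lag number at face value):

```python
# How many 60 Hz refresh intervals fit inside 63 ms of input lag.
lag_ms, refresh_hz = 63, 60
frame_ms = 1000 / refresh_hz
print(f"{lag_ms} ms is about {lag_ms / frame_ms:.1f} frames at {refresh_hz} Hz")  # ~3.8 frames
```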
 