Acer XR341CK 34-Inch Curved FreeSync Monitor Review


ohim

Distinguished
Feb 10, 2009
1,195
0
19,360
Why do they make screens brighter than 300, or let's say 350 cd/m², max? I tested an Iiyama 27" at 350, and at max brightness I could feel the heat from the screen on my hands and face 30 cm away... you can't even use the blessed thing at max. I finally settled on around 20-30% brightness, 50% max for presentation purposes...
 

rene13cross

Honorable
Apr 2, 2012
485
0
10,810
Better that the screen is too bright and can be turned down than too dark with no way to turn it up. Nothing to complain about, in my opinion.
 

DoDidDont

Distinguished
May 27, 2009
81
0
18,640
Seriously considering two of these for work/play, but holding off as the Asus ROG PG348Q 34" curved monitor looks more promising: G-Sync, IPS, and 3440x1440 @ 100 Hz. It all comes down to when Asus will release it, and how long I can wait...
 

Larry Litmanen

Reputable
Jan 22, 2015
616
0
5,010
I saw a similar monitor in a store recently, and I have to say: yes, it is wide, but the display is too short. They need to add a few more inches of height to make the experience truly immersive.
 

PlanesFly

Distinguished
Mar 28, 2013
8
0
18,510
Great review, guys. Just one thing: you need to remove the reference to this being a Predator monitor. It has absolutely no tie to the Predator line-up of products; it is an XR Series monitor. Only the G-Sync version falls under the Predator series.
 

cknobman

Distinguished
May 2, 2006
1,123
267
19,660
$1000? LOL, no. The TV market has already proven that this curved crap adds nothing to the experience, and even detracts from it. It is not worth any kind of premium whatsoever.

$500? Yes.
 

PlanesFly

Distinguished
Mar 28, 2013
8
0
18,510
I saw a similar monitor in a store recently, and I have to say: yes, it is wide, but the display is too short. They need to add a few more inches of height to make the experience truly immersive.
You can't just add height, or the aspect ratio wouldn't be 21:9 anymore. The height is equal to that of a 27" 16:9 monitor, with roughly 30% more width.

You're looking for a monitor at 3840x1600, essentially a 4K monitor with the vertical chopped off.
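For anyone who wants to check that geometry, here's a quick back-of-the-envelope sketch in Python; the panel sizes are nominal spec-sheet figures, so real bezels and rounding will shift the numbers slightly:

[code]
import math

def panel_dimensions(diagonal_in, width_px, height_px):
    """Return (width, height) in inches for a given diagonal and pixel aspect ratio."""
    aspect = width_px / height_px
    height = diagonal_in / math.sqrt(aspect ** 2 + 1)
    return aspect * height, height

# 34" 3440x1440 ultrawide vs. a 27" 2560x1440 16:9 panel
uw_w, uw_h = panel_dimensions(34, 3440, 1440)
sq_w, sq_h = panel_dimensions(27, 2560, 1440)
print(f'34" 21:9: {uw_w:.1f} x {uw_h:.1f} in')   # ~31.4 x 13.1
print(f'27" 16:9: {sq_w:.1f} x {sq_h:.1f} in')   # ~23.5 x 13.2
print(f"extra width: {uw_w / sq_w - 1:.0%}")     # ~33%
[/code]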
 

obababoy

Honorable
Jul 24, 2013
55
0
10,640
Seriously considering two of these for work/play, but holding off as the Asus ROG PG348Q 34" curved monitor looks more promising: G-Sync, IPS, and 3440x1440 @ 100 Hz. It all comes down to when Asus will release it, and how long I can wait...
What GPU are you running? If you have AMD, get this; if you have Nvidia, get the ROG... but be prepared to wait however long that takes.
 

PlanesFly

Distinguished
Mar 28, 2013
8
0
18,510
What GPU are you running? If you have AMD, get this; if you have Nvidia, get the ROG... but be prepared to wait however long that takes.
That's silly. Why should someone wait for the ROG? It will use the exact same panel, and the Acer X34 has already passed both of the reviews I've seen so far with flying colors. The only difference is if you desperately want the slightly different aesthetic of the ASUS version over the Predator.

The X34 should be out in a couple of weeks... the ASUS won't be out until possibly January 2016.
 

obababoy

Honorable
Jul 24, 2013
55
0
10,640
Why do they make screens brighter than 300, or let's say 350 cd/m², max? I tested an Iiyama 27" at 350, and at max brightness I could feel the heat from the screen on my hands and face 30 cm away... you can't even use the damn thing at max. I finally settled on around 20-30% brightness, 50% max for presentation purposes...
Most calibrations lower a screen's brightness, but if you work in an office or another bright space, I can see people using a higher setting. Better to have the option than not, no?
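For a rough sense of the numbers: assuming (naively) that output luminance scales linearly between the panel's minimum and maximum, you can estimate the OSD setting needed for a given calibration target. The 40 cd/m² floor below is an illustrative guess, not a measured figure:

[code]
# Approximate OSD brightness setting for a target luminance, assuming
# output scales linearly between the panel's min and max (a naive
# assumption -- real monitors are rarely this linear).
def osd_setting(max_nits, min_nits, target_nits):
    return 100 * (target_nits - min_nits) / (max_nits - min_nits)

# A 350 cd/m2 panel with a hypothetical 40 cd/m2 floor, aiming for the
# common 120 cd/m2 calibration target:
print(f"{osd_setting(350, 40, 120):.0f}%")  # ~26% -- right in that 20-30% range
[/code]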
 

obababoy

Honorable
Jul 24, 2013
55
0
10,640
That's silly. Why should someone wait for the ROG? It will use the exact same panel, and the Acer X34 has already passed both of the reviews I've seen so far with flying colors. The only difference is if you desperately want the slightly different aesthetic of the ASUS version over the Predator.

The X34 should be out in a couple of weeks... the ASUS won't be out until possibly January 2016.


Which is why I asked what type of GPU he has... the Asus is a G-Sync monitor and the Acer is FreeSync. They use the same panel but different scalers, and each is (or should be treated as) proprietary to either AMD or Nvidia.
 

gamertaboo

Distinguished
Sep 23, 2015
65
2
18,665
I guess what I don't get is why they are making so many FreeSync monitors. Aren't the majority of gamers with dedicated GPUs using Nvidia? It's just kind of frustrating... can't they find a way to make a monitor support both? It seems like the technology should be fairly similar...
 

Grognak

Reputable
Dec 8, 2014
65
0
4,630
They can't make a monitor with both FreeSync and G-Sync because the first is an open VESA standard while the other comes with a licensing fee; the two are legally incompatible. The technologies aren't entirely similar either, since G-Sync needs (needed?) dedicated hardware, hence the $100-$200 premium on those monitors.
 

obababoy

Honorable
Jul 24, 2013
55
0
10,640
I guess what I don't get is why they are making so many FreeSync monitors. Aren't the majority of gamers with dedicated GPUs using Nvidia? It's just kind of frustrating... can't they find a way to make a monitor support both? It seems like the technology should be fairly similar...
Or, better yet, maybe Nvidia could stop being so stingy and restricting its G-Sync scalers to approved monitors... FreeSync is an open standard and G-Sync is not; Nvidia could support FreeSync if it enabled it in its drivers. But at the end of the day I'll get called an AMD fanboy for wanting monitors and games that aren't split up by Nvidia's pile of proprietary features...
 

WhyFi

Distinguished
Apr 12, 2006
114
0
18,680
Why do they make screens brighter than 300, or let's say 350 cd/m², max? I tested an Iiyama 27" at 350, and at max brightness I could feel the heat from the screen on my hands and face 30 cm away... you can't even use the damn thing at max. I finally settled on around 20-30% brightness, 50% max for presentation purposes...

There's an off chance that someone is working in an environment where the additional brightness is useful... but I think the main reason is so they can slap insane (read: unrealistic) dynamic-range figures on the spec sheet and the box.
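To illustrate the spec-sheet incentive: static contrast is just peak white over black level, so pushing max brightness inflates the headline number even when the black level (and the viewing experience in a dim room) is unchanged. A toy calculation with made-up but plausible numbers:

[code]
# Static contrast ratio = peak white luminance / black luminance.
def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

print(f"{contrast_ratio(300, 0.3):,.0f}:1")  # 1,000:1
print(f"{contrast_ratio(350, 0.3):,.0f}:1")  # 1,167:1 -- a "bigger" box number
[/code]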
 
Guest

Guest
$400-600? I'd buy it. $1000? No way.

The screen ratio is incorrectly listed as 16:9. I wish Tom's would hire some proofreaders instead of relying on its readers to correct the many mistakes that appear in practically every article.
 

picture_perfect

Distinguished
Apr 7, 2003
278
0
18,780
No good for gaming. Photography, maybe. Anybody planning to game: keep in mind you'll average 40 fps with the very best card in current games. The resolution is too high with this monitor... again.
I'm going to say this now: this is FreeSync/G-Sync ABUSE, straight up. Why? G-Sync and FreeSync were designed as cosmetic fail-safes for the occasional frame-rate drop. Most gamers know you never want a low frame rate; G-Sync is just there to make it look good when it happens. This monitor ASSUMES 40 fps and depends on FreeSync to make you think otherwise. It's an abuse of the technology and a feeble gaming experience. Yes, I think very feeeeeble. Resolution too high.
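For what it's worth, that 40 fps figure is roughly what you'd predict from pixel count alone. A crude first-order sketch, assuming frame rate scales inversely with resolution (real games won't scale this cleanly, and the 1080p baseline below is hypothetical):

[code]
# Crude estimate: assume achievable fps scales inversely with pixel count.
pixels_1080p = 1920 * 1080   # ~2.07 MP
pixels_uw    = 3440 * 1440   # ~4.95 MP

fps_1080p = 96  # hypothetical single-GPU result in a demanding title
print(f"{fps_1080p * pixels_1080p / pixels_uw:.0f} fps at 3440x1440")  # ~40 fps
[/code]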
 

alextheblue

Distinguished
No good for gaming. Photography, maybe. Anybody planning to game: keep in mind you'll average 40 fps with the very best card in current games. The resolution is too high with this monitor... again.
I'm going to say this now: this is FreeSync/G-Sync ABUSE, straight up. Why? G-Sync and FreeSync were designed as cosmetic fail-safes for the occasional frame-rate drop. Most gamers know you never want a low frame rate; G-Sync is just there to make it look good when it happens. This monitor ASSUMES 40 fps and depends on FreeSync to make you think otherwise. It's an abuse of the technology and a feeble gaming experience. Yes, I think very feeeeeble. Resolution too high.

If you can afford a $1,000 monitor, I'd bet you can afford a couple of top-of-the-line graphics cards to run in tandem. Using the Acer monitor in this review as an example on the FreeSync side, I figure a couple of Fury or Nano cards, or a future dual-GPU Fury, would do the trick.
 