Acer XR341CK 34-Inch Curved FreeSync Monitor Review


DoDidDont

Distinguished
May 27, 2009
Seriously considering two of these for work/play, but holding off as the Asus ROG PG348Q 34" curved monitor looks more promising: G-Sync, IPS, and 3440x1440 @ 100Hz. It all comes down to when Asus will release it, and how long I can wait...
What GPU are you running? If you have AMD, get this; if you have Nvidia, get the ROG... but wait for however long it will take.

I am using 4x GTX Titans. These are for production rendering in 3ds Max using iray, so I need the extra VRAM. Sometimes I am using three apps at once (3ds Max, Mudbox and Photoshop, or a DWG viewer), so three 34-inch curved monitors will greatly improve my workspace.

I won't be gaming in surround, so maybe I will just buy one Asus ROG for gaming and two cheaper Dell U3415W monitors.

3ds Max/iray detects any GPU with a monitor plugged into it as "Used by Windows", which hurts rendering performance and the responsiveness of the other apps, so I render with only three cards and leave one to run the OS so I can continue working. Work is a much bigger priority for me than gaming in surround.
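For anyone who wants to sanity-check which card Windows is actually using before a render, here's a minimal sketch with NVIDIA's NVML Python bindings (the nvidia-ml-py / pynvml package); it just lists the GPUs and flags the ones driving an active display so they can be left out of the render pool. This is generic NVML, not anything iray-specific.

# Minimal sketch (assumes the nvidia-ml-py / pynvml package is installed).
# Lists the installed NVIDIA GPUs and flags which ones are driving an active
# display, so the display card can be excluded from rendering.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
    nvmlDeviceGetHandleByIndex, nvmlDeviceGetName, nvmlDeviceGetDisplayActive,
)

nvmlInit()
try:
    for i in range(nvmlDeviceGetCount()):
        handle = nvmlDeviceGetHandleByIndex(i)
        name = nvmlDeviceGetName(handle)  # may come back as bytes on older bindings
        # Non-zero means a display is attached and active on this GPU.
        role = "display/OS" if nvmlDeviceGetDisplayActive(handle) else "render only"
        print(f"GPU {i}: {name} -> {role}")
finally:
    nvmlShutdown()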

The reason I would choose the ROG over the Acer is the improved G-Sync module and the 100Hz refresh rate. Four GTX Titans can easily sustain that rate at ultra settings.
 

Tanquen

Distinguished
Oct 20, 2008
I guess what I don't get exactly is why they are making so many FreeSync monitors. Aren't the majority of gamers with dedicated GPUs using Nvidia? It's just kind of frustrating... can't they find a way to make the monitor support both? It seems like the technology should be fairly similar...
Or better yet, maybe Nvidia shouldn't be so stingy about only putting their G-Sync scalers in approved monitors... FreeSync is an open standard and G-Sync is not. Nvidia could support FreeSync if they allowed it in their drivers. But at the end of the day I'll get called an AMD fanboy because I don't want everyone to have to split up monitors and games just because Nvidia develops a bunch of proprietary game features and monitors...

Agreed. Nvidia needs to stop trying to turn the PC into their own console or fiefdom and ruining PC gaming. If they want to make a console, go for it. They make video cards and should compete by making the fastest DX or OpenGL or whatever video card, not by fixing benchmarks, buying into game devs to hobble performance on their competitors' hardware, and locking display features to their own hardware. People buy PCs for a reason, and it's not to get locked into a console brand with games and hardware that only work with that brand.
 
GSync vs Freesync:
To the comments above... you simply can NOT do the same things with FreeSync as you can with GSync. Not now, and this also extends into the future, such as trying to integrate light strobing simultaneously with asynchronous (variable refresh) mode.

As NVidia said, the only way to do it was to redesign the scaler, which is now the GSync module.

FreeSync is based on an existing standard which doesn't appear to have the flexibility to do what GSync can. It looks similar, and it is, but it's not exactly the same. You have to delve into how the GSync lookaside buffer works and what happens below the minimum refresh rate (currently 30Hz) to understand the differences.
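To make that below-minimum behaviour concrete, here's a rough illustrative sketch (my own simplification, not the actual module logic, and the 30/75Hz figures are just example numbers): once the frame rate drops under the panel's floor, the previous frame has to be scanned out again at some multiple of the frame rate so the panel stays inside its supported refresh window.

# Illustrative sketch only, not the real GSync module logic: below the panel's
# minimum refresh rate the last frame is re-scanned at a multiple of the frame
# rate so the panel never drops out of its supported refresh window.
PANEL_MIN_HZ = 30.0   # example floor
PANEL_MAX_HZ = 75.0   # example ceiling

def refresh_for_frame_rate(fps):
    """Return (panel refresh in Hz, scans per frame) for a given frame rate."""
    if fps >= PANEL_MIN_HZ:
        return min(fps, PANEL_MAX_HZ), 1       # normal variable-refresh range
    repeats = 2
    while fps * repeats < PANEL_MIN_HZ:        # double, triple, ... the scan rate
        repeats += 1
    return fps * repeats, repeats

for fps in (60, 40, 24, 12):
    hz, n = refresh_for_frame_rate(fps)
    print(f"{fps} fps -> panel runs at {hz:.0f} Hz ({n} scan(s) per frame)")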

Sure, a single standard would be nice; however, NVidia had to invest a LOT of money into this, which they would not have done without a reason (selling GPUs). AMD certainly wasn't in a position to spend any money, so trash NVidia all you want, but with GSync they put forward what is, IMO, the absolute BEST thing gaming has ever seen.

Other:
And yes, before it comes up: GSync laptops do not have a GSync module, so people point and scream "look, we don't need one for desktop monitors either!" Yes, we do. Laptops do not have a scaler because the manufacturer knows the exact specifics of the panel and can drive it directly from the GPU.

That's because a laptop screen is attached to the GPU directly. A desktop monitor can be attached to lots of different GPUs, so to do it right you need an integrated scaler or you get issues. Thus, you need to replace that scaler to do something radical like asynchronous mode (done right, that is).
 

somebodyspecial

Honorable
Sep 20, 2012
^^ Bingo... FreeSync is free because it's cheap to be less than optimal ;) It's cheaper NOT to fix all the issues the tech is supposed to address. Nvidia is DOING their job by addressing as much as they can to fix gamers' issues. By AMD's own admission, they hope to "better" control the future selection of components to give a better user experience. The question is, will that even help? Until we get something that shows none of the issues found in many reviews, the jury is still out and GSync wins. $100-200 means nothing to me over the 7 years I own a monitor if the experience is REALLY better (and in this case it's proven).

Maybe AMD should stop spending on CONSOLES so they have R&D money for stuff like PC drivers, etc.? :) Oh wait, they already did consoles. Hence the 82% Nvidia market share while AMD plummeted to under 20%.

The whiners are complaining because Nvidia is ACTIVELY seeking to woo PC gamers by throwing money at them with R&D, while AMD ignored us for a few years, which is now showing up as bad drivers (it took 8 months for a new driver after Dec 2014!), a second-hand G-Sync solution, falling behind in GPUs, slapping on HBM (blue crystals, in Intel speak) to try to make up for a slacker GPU, and so on. If the console money they spent on R&D for the PS4/Xbox One chips had been spent on GPUs/CPUs ~4 years ago, we wouldn't be talking about AMD being dominated (probably) today on both the CPU and GPU fronts. They wouldn't have needed HBM (as NV shows), and there wouldn't have been a shortage of the new GPUs costing them yet another quarter of sales (or more?).

I really hope Zen is good, as it's looking more and more like NV will win my next GPU (which means a G-Sync monitor sold to me too). I already upgraded one PC with Intel (Devil's Canyon), but I hope to pass that to my dad and buy myself a Zen, assuming it's at least as good as Intel's offering at that moment. If it loses, sorry AMD, I'll go Intel AGAIN. You need to win, or I have ZERO love for your chip. It's inexcusable to design a CPU from the ground up and only TIE when you have 2x the die space to dedicate to DOMINANCE once the GPU is stripped out (i.e., half of Intel's die is GPU!). It sounds like they aimed at where they thought Intel would be (a TIE) when releasing Zen, but that is a mistake. They should have made the die the EXACT size of Intel's CPU+GPU and dominated with pricing power. Instead, they aimed for a tie (I hope I'm wrong), which Intel will just price to death for a year or two until they put out a winner again. If they put out a die anywhere under 1.5x Intel's CPU side, management should be fired PRONTO.

AMD is their own problem these days. I blame management, not the engineers. Management is making stupid decisions. If they made a die half the size of Intel's Devil's Canyon (meaning basically the size of JUST the CPU side), they deserve to lose yet again. There is no reason to go small when you could STILL use less material (1.5-2x the CPU side of an i7) and WIN decisively. I'd like to buy AMD stuff, but they just keep forcing me not to. If this CPU ends up in a tie, I'll take the watt winner, no matter which side it is. AMD should force me to buy them by winning on performance while matching the watts of Intel's whole CPU/APU, i.e. an 80W PURE CPU. That would force Intel to bolt two of their CPU dies together (8 cores) and end up where AMD is now, at 125W for 8 cores (like the FX-8350). An Intel 8-core would win in VERY few situations for a few years, until Intel could R&D a REAL solution at 4 cores again. AMD could price all their chips on TOP of Intel's instead of Intel being able to price down a tie. Even slapping two CPU dies together would take Intel a while (and again, it would be 125W then and only win in stuff like video ripping). They would lose for a year, slap two together (without GPUs), and have a real solution 3 years later, giving AMD 3 years (like AMD's last victory) to price high and profit. Unlike last time, when they were limited to ~20% of the market due to production limits, AMD could really do some damage this time with a WINNER and a larger die. Even 1.5x Intel's CPU side saves on material and should be a winner, but it sounds like they went with basically the size of Intel's CPU side only, which is STUPID. The material cost is stupidly cheap if you can win for the same amount or not much more.

I hope the rumors are wrong and AMD really went BIG on die size. Intel would have margin problems if AMD went big and forced price cuts while Intel is already losing $4B+ on mobile; they can only absorb that because of their margins on desktop/server parts. I would spend $350+ on AMD if they decisively beat Intel's chip at that time next year. If they pocketed a billion+ for the 3 years it took Intel to engineer a large CPU-only core, they could pay off their debt completely ($200M lost yearly on interest on that debt!). Management should be fired if they told the CPU engineers "do whatever you want from the ground up, but make sure it's small, like just the size of Intel's CPU side"... ROFL. That would defeat the whole point of letting the engineers do what they wanted. The engineers were more than capable of making a winner (Keller, Papermaster, etc.); the only question is whether a small die-size limit was put on those engineers. So again, Nvidia/Intel are not AMD's problem. The employees are not the problem at AMD either. Management is the problem. Dirk Meyer said in 2011 they needed a KING and they fired him for it... LOL. I hope they made a KING with Zen.
 

Tanquen

Distinguished
Oct 20, 2008
“The whiners are complaining because Nvidia is ACTIVELY seeking to woo PC gamers by throwing money at them with R&D…”

Good grief, be real for a second. No one should care about the brand name, and I don't care what company's name is behind a particular behavior. All the companies lie, and that sucks and should not be the case. "Oh, our new video card is 5000% more powerful than any card ever before." Then you read the reviews and it averages a few more frames in most games, or some such nonsense. What Nvidia has been doing is wrong, and that's it. If another company is doing it, then they are wrong too. How is misrepresenting performance, actively making games run poorly, locking out game features on your competitor's hardware, and locking consumers into displays with features that only work with Nvidia hardware "wooing PC gamers"? Other GPUs could and can do the same cool physics and graphics effects and use adaptive sync; Nvidia is just encapsulating the features and effects so they only work with their hardware. There is no reason there can't be a physics or hair-effects or whatever standard that is part of DX. Are you not going to be happy until you get a "No Nvidia hardware detected. Press OK to exit." message when running a PC game?
 

eriko

Distinguished
Mar 12, 2008
Seriously, I've been hunting all week for a 34" Ultrawide.

But I haven't been able to 'pull the trigger'.

The game I play most is FPS-limited to 91fps. Another is limited to 120fps, I think.

So I'm holding out for a 120Hz equivalent of the monitor reviewed here.

Had this been FreeSync AND 120Hz, I'd have bought it without hesitation.

I know we are a fickle lot, but it is my money, and I tend to hold onto monitors for years, so I need to get it just right.
 

gamertaboo

Distinguished
Sep 23, 2015
If they pocketed a billion+ for the 3 years it took Intel to engineer a large CPU-only core, they could pay off their debt completely ($200M lost yearly on interest on that debt!). Management should be fired if they told the CPU engineers "do whatever you want from the ground up, but make sure it's small, like just the size of Intel's CPU side"... ROFL. That would defeat the whole point of letting the engineers do what they wanted. The engineers were more than capable of making a winner (Keller, Papermaster, etc.); the only question is whether a small die-size limit was put on those engineers. So again, Nvidia/Intel are not AMD's problem. The employees are not the problem at AMD either. Management is the problem. Dirk Meyer said in 2011 they needed a KING and they fired him for it... LOL. I hope they made a KING with Zen.

I'm not trying to argue with you, but you do know that Intel makes the 5820K and 5930K, which are 6 cores / 12 threads, as well as the 5960X, which is 8 cores / 16 threads, none of which have integrated graphics and all of which destroy AMD... right?
 

ozicom

Distinguished
Jun 21, 2012
I think Acer won't stay in the monitor business for long, because they actually stated which HDMI revision their product has. They didn't have to mention the revision or even the port type; they could just say "our product has a bunch of connectors on the back, we're sure one will fit you" and we'd buy it without question. That's a joke, of course.
I'd like to congratulate Acer for this product, and especially for stating which HDMI revision it has. At 1440p it doesn't really matter, but they chose to state it anyway. I wish all the other manufacturers (e.g. Sony, Samsung, LG, Panasonic...) would figure out that stating the revision makes customers happy.
Thank you
 

somebodyspecial

Honorable
Sep 20, 2012


http://www.tomshardware.com/reviews/intel-core-i7-5960x-haswell-e-cpu,3918-6.html
You do realize they don't perform much better in any game compared to a 4790, right? The problem with 6-8 cores is that they use more power and run at lower clocks, so unless you have something that REALLY uses all 6 or 8 cores (or 12/16 threads with HT), you don't get much more than you would from 4.

http://www.tomshardware.com/reviews/intel-core-i7-5960x-haswell-e-cpu,3918-12.html
Thermals and watts go through the roof at 4.5+ GHz, while my 4790 can do that at 68W or so (GPU off, but otherwise the whole CPU package). No fan noise and low heat on mine too, and I'm not even pushing things. It's also tough to overclock 6 or 8 cores, as shown.
"Our findings are summarized in the graph below, which primarily shows one thing: overclocking Intel's Core i7-5960X up to 4 GHz isn’t a problem. Between 4 and 4.5 GHz, power consumption and thermals rise much faster though."

The 5960X is pulling 2x my CPU package power at merely 4GHz and would easily be smoked in most games by my chip at 4.8GHz. More cores are only a great thing in PRO apps and the like that can REALLY use them; above, it actually loses some games (take Thief, for example, vs. the 4790). This is why I hope AMD went with a BIG and FAST quad. Intel's answer, in your mind, is an expensive system with 6 or 8 cores at high watts (we're talking Haswell-E here, and all that goes with it, vs. cheap AMD AM4 boards, etc.). Intel can't just flop Haswell-E into Z97 boards with DDR3 or something. The answer would take time if AMD goes to about the size of Intel's CPU+GPU on a then-current quad at 14nm. There is no good and QUICK answer to a HUGE AMD quad. We might see a return to 1999-2001 until Intel could fix it, but this time around AMD would not be limited to supplying about 20% of the market (not with TSMC/GF and possibly Samsung, who shares with GF). This is AMD's last chance to make a dent and pay off some debt to get back to being a fundamentally sound company.

The same logic of fewer cores that do more work (due to apps, clock speeds, IPC, etc.) applies in phones now, which is why a dual or tri core can easily beat quads and octas (see Apple for years). Until devs catch up to your massive core counts, they're pretty useless for most stuff. This is why AMD needs a BIG QUAD clocked as high as they can get it at <100W. Enthusiasts will pay extra for a FAST quad (high IPC, which AMD claims is up 40%, and high clocks, which remain to be seen). I'd easily pay a $50 premium to AMD if it WINS like back in ~1999-2001, IIRC, vs. Intel (or like Intel vs. AMD now... LOL). It can be done, considering the people that worked on the chip. It's just a matter of size and using it to WIN, not TIE ;) That said, my guess is bad management told them "about the same size as Intel's CPU side, please"... ROFL. That means a tie, maybe a win here or there, but basically I'll probably buy Intel again and there will be no profits for AMD, as Intel will just price down if it is anywhere near even with their chip. No win = no pricing power = no margins = no profits. Intel can answer (they have money), but it will be the wrong answer for a while, just like the last time AMD wiped the floor with them in most stuff for 3 years.
 

somebodyspecial

Honorable
Sep 20, 2012
Checking the apps in that article, you can see some of them lose there too (Sony Vegas, for example, and iTunes). Again, a 6-8 core would not be the "correct" answer to a large AMD Zen CPU die (assuming it's a QUAD). The general audience would be pretty happy with a big, fast quad.
 