Benchmarked: How Well Does Watch Dogs Run On your PC?



Yeah, the stock A10 coolers are notoriously ill-equipped to handle an overclocked APU.

Invest the $20 into a Rosewill RCX-ZAIO. It will change your life:

http://www.newegg.com/Product/Product.aspx?Item=N82E16835200056&cm_re=rcx_cooler-_-35-200-056-_-Product

If you can spare $5 more, the Hyper TX3 is a better budget option:

http://www.newegg.com/Product/Product.aspx?Item=N82E16835103064&cm_re=hyper_tx3-_-35-103-064-_-Product



 


Ironically, I'm using a Hyper 212X, and it doesn't do its job, my poor A10 🙁
 


Make sure that bad boy is seated properly, something sounds off. A Hyper212x should be overkill for an A10.
 


Yep, this badass mofo is seated like a baws inside my glorious PC master race rig. In the BIOS it showed around 40-50°C, but if I open HWMonitor, it shows about 70°C :ouch: (I can't post a screenshot, idk why)
 


You need to use AMD Overdrive to monitor AMD CPU temperatures, not third-party software (AMD gives them incorrect values). Also, what is your room temperature, and what are the temperatures with the case open?
 


I don't have AMD Overdrive :/ (I use a 750 Ti), and I don't even know my room temperature.
 


You can still install AMD Overdrive to get legitimate APU temps.
 
Man, game optimization is way out of whack these days. I'm sitting on a 4-5 year old build running the game great while the new hardware is struggling. Personally, I think it's a driver issue with Nvidia when it comes to the Titan. My friend has a Titan, and my old-ass GTX 480 outperforms him in a lot of games for some reason. Either that, or the Titans are prepped for next-gen (PC, not console) games and they didn't consider current gen, which would be sorta silly. Either way, I don't see why such a powerful piece of hardware should be held back at all, unless something isn't going right that they aren't telling us.
 


I'm curious what kind of benchmark run they invented. Ours was a surprisingly repeatable, consistent 90-second run. I wouldn't have done this article unless I found a consistent method.

Is there a YouTube video of the benchmark run you've created?
 
The game relies a lot on having at least 4 cores and very good single-threaded performance (high IPC).
Periodically during gameplay, a thread that is otherwise idle (it is mainly active only while Uplay is launching the game) will wake up.

If you use a program like Process Explorer to delve further into the threads it is running, during normal use you will see the main graphics thread (nvwgf2um) using the most overall CPU time and almost completely occupying a single core. During a lag spike, though, one of the normally idle MSVCR100.dll threads goes active and lands on the same core the nvwgf2um thread is using. The nvwgf2um thread then drops to very low usage, GPU usage drops with it, and the game runs at low frame rates until that MSVCR100 thread quiets down.

I am not sure exactly what that thread is doing, but Alice: Madness Returns had the same issue, and in that case it was caused by the DRM performing memory checks to detect certain no-CD memory patchers (which were popular back then because they allowed pirated games to run without editing the game files; the game simply believed the mounted ISO was the original CD or DVD). The lag issue was eventually fixed when the game files were edited to remove the DRM check, and the company later released an official patch that removed that specific check.

High clock speeds paired with really good IPC minimize the effect of that check.
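
Not from the article, just a rough illustration of the kind of per-thread watching Process Explorer lets you do by hand: a minimal Python sketch using psutil that samples per-thread CPU time for the game process and flags any normally idle thread that suddenly starts burning CPU during a lag spike. The process name, sampling interval, and spike threshold are all assumptions; adjust them to your setup.

```python
# Sketch: sample per-thread CPU time of a process and flag threads that spike.
# Assumes the game process is named "watch_dogs.exe" (adjust as needed).
import time
import psutil

PROC_NAME = "watch_dogs.exe"   # assumed process name
INTERVAL = 1.0                 # seconds between samples (assumption)
SPIKE_THRESHOLD = 0.5          # CPU-seconds per interval to count as a spike (assumption)

def find_process(name):
    # Look up the first process whose name matches.
    for p in psutil.process_iter(["name"]):
        if p.info["name"] and p.info["name"].lower() == name:
            return p
    return None

proc = find_process(PROC_NAME)
if proc is None:
    raise SystemExit(f"{PROC_NAME} is not running")

# Remember each thread's cumulative CPU time so we can compute deltas.
prev = {t.id: t.user_time + t.system_time for t in proc.threads()}
while True:
    time.sleep(INTERVAL)
    for t in proc.threads():
        total = t.user_time + t.system_time
        delta = total - prev.get(t.id, total)
        if delta > SPIKE_THRESHOLD:
            print(f"thread {t.id}: +{delta:.2f}s CPU in the last {INTERVAL:.0f}s")
        prev[t.id] = total
```

Run it alongside the game and note which thread ID lights up when the frame rate drops; that is the one worth inspecting in Process Explorer.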
 


Not yet, although it's something I'll probably get around to after I get some time-sensitive work off of my plate.

We're exploring the possibility of including this game in our benchmark suite, so I'll probably put together a video for our other writers to see how it's done.
 


@InvalidError - I don't know why it quoted me in your reply... I didn't say that.
 
4GB-8GB-16GB has zero effect in this game. System RAM rarely affects frame rates; it usually affects load times.


Your inability to run it at 720p on low settings smoothly is really disturbing. Even a Radeon 7770/R7 250X could handle 1080p/low details in our tests, and your 6850 should be on par with that at least. Something seems off.

If the RAM solves your problems I'll be surprised, but please let us know if it does. I would have prescribed a better GPU.

OK, new RAM is in ($150 from Amazon for 16GB of CAS 8, installed one day after being ordered thanks to a $3 one-day shipping upgrade; a shame you guys don't favor them over Newegg)! I'm enormously pleased to state that the game's performance has improved DRAMATICALLY. I jacked my resolution back up to 1600x900, put all the settings on medium, and I can't detect any hitching, stuttering, or even frame rate dips. I'll probably continue inching details upward until I find a ceiling, but for now suffice it to say that the 6GB RAM requirement is legit.

I am happy about the upgrade, but I do wish that I could've seen some more robust benchmark data before making the plunge. There are also rumblings (including official dev. posts recommending SSDs) that the level and texture streaming can cause stuttering in some cases when using slower drives, so I think it would greatly benefit readers for benchmarks to be made from slower, mechanical hard drives. There are plenty of SSD owners reluctant to copy every 20GB game install over. Or you could just rename the piece "Is Your Three Month Old Graphics Card Enough for Watch Dogs?"
 

If you start writing a reply but then change your mind and decide to reply to something else, the board often re-quotes all the previous replies that were never submitted on top of the new message being replied to. Since quotes are identified by a mess of numbers, it is quite easy to remove the wrong ones while cleaning up that junk.

That's usually how I end up with names getting mixed up in quotes.
 


No, "Coming out" = not quite happened yet = Living In The Present
#LivingInOnePossibleFuture


Put simply:

$190
FX-8350
OK game performance
Great app performance

$190
Core i5-4460
Great game performance
Great app performance

So you're saying "spend your money on the FX because someday it might be as good as the Core i5 that costs the same price".

How does the possibility of someday equaling Core i5 performance in some games make it a better buy than something that is already better in games, guaranteed, with no need to wait years for a potential advantage that is hypothetical and in no way proven?

What is your logic path?
 
"Also, the texture detail setting is kept at Medium across our benchmarks, ensuring graphics memory doesn't affect our results too severely."

3GB VRAM or greater cards should be tested at Ultra. Less than 3GB should be tested at High. An explanation as to why should be given.
 


You just quoted the explanation. "ensuring graphics memory doesn't affect our results too severely."

Texture memory can be adjusted to accommodate whatever memory your card has. If you choose the right amount, it doesn't affect frame rates.

Nobody with 1GB is going to use a 3GB memory setting, and if you have 4GB of onboard graphics memory, it doesn't make a performance difference to select Ultra textures.

It makes a lot of sense if you consider the point and the implication of the variables.

 
Texture memory can be adjusted to accommodate whatever memory your card has. If you choose the right amount, it doesn't affect frame rates.

Nobody with 1GB is going to use a 3GB memory setting, and if you have 4GB of onboard graphics memory, it doesn't make a performance difference to select Ultra textures.

Like the assertion that 4GB->8GB->16GB shouldn't make a difference, this one should be tested. I've seen a lot of people complaining that it does in fact cause a bunch of stuttering. There are a lot of folks with uber graphics boards having to drop textures down to high or even medium to get a smooth experience.
 


I can agree with that. I didn't have the time to do those tests, though, and now I'm in the middle of other work, but it's something I may revisit. We're looking into using this benchmark as part of our suite, so I may be able to have a closer look.