Benchmarked: How Well Does Watch Dogs Run On Your PC?

cleeve

Illustrious
Watch Dogs is one of the most anticipated games of 2014. So, we're testing it across a range of CPUs and graphics cards. By the end of today's story, you'll know what you need for playable performance. Spoiler: this game is surprisingly demanding!

Benchmarked: How Well Does Watch Dogs Run On Your PC? : Read more


******* EDIT: More CPU Results On The Way *******

1- I'm seeing a lot of requests for more CPUs. I totally understand.
Truth be told, I would have liked to add a lot more configurations for publication, but we were really under the gun to get something out as quickly as possible. I just didn't have the time.

I will try to run FX-6300 and Core i5-3550 benchmarks today and add the results to the charts, though. Stay tuned. :)

[UPDATE May 28] FX-6300 and Core i5-3550 added to CPU benchmark charts [/UPDATE]


2- As for the simulated 780 Ti: I simply don't have a real one here. I asked Nvidia, but they ignored the request. Still, it's far more relevant than the $1,000 Titan I have on hand, and I'd rather have an extremely close approximation of the 780 Ti than nothing at all. As for the VRAM, I was clear in the test setup that I benchmarked at the medium texture setting to keep VRAM out of the equation as a variable (see the quick polling sketch below). The game makes it clear which texture setting should be used with the amount of VRAM you have.

3- 4K results would be nice, and we're working on sourcing monitors for our labs. Having said that, 4K adoption is less than a tenth of one percent. It's not an important inclusion yet, but it'll get there. We're working on it.
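For anyone who wants to double-check that a given texture setting isn't spilling past a card's VRAM during a run, here is a minimal polling sketch. It assumes a single Nvidia card with the nvidia-smi utility on the PATH; the interval, duration, and log file name are arbitrary:

```python
import subprocess
import time

def poll_vram(interval_s=1.0, duration_s=60.0, log_path="vram_log.csv"):
    """Log GPU memory usage once per interval while a benchmark runs."""
    start = time.time()
    with open(log_path, "w") as log:
        log.write("elapsed_s,vram_used_mib\n")
        while time.time() - start < duration_s:
            # nvidia-smi reports used memory in MiB with these flags
            # (single-GPU system assumed).
            used = subprocess.check_output(
                ["nvidia-smi",
                 "--query-gpu=memory.used",
                 "--format=csv,noheader,nounits"],
                text=True,
            ).strip()
            log.write(f"{time.time() - start:.1f},{used}\n")
            time.sleep(interval_s)

if __name__ == "__main__":
    poll_vram()
```

If the used figure sits hard against the card's capacity, texture swapping is the likely culprit for any stutter you see.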

- Don



 
Running on my system at ultra with the highest settings and FXAA, it's pretty steady at 60-70 fps, but with weird random drops almost perfectly to 30 and then back up to 60, almost like adaptive vsync is on. Currently playing it with textures at high, HBAO+, and SMAA, and it's a pretty rock-steady 60 fps with vsync, still with the random drops.
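Those snaps straight from 60 to 30 are the classic signature of double-buffered vsync: a frame that misses the 16.7 ms refresh deadline is held for the next vblank, so the display rate falls to an integer divisor of the refresh rate. A minimal sketch of the arithmetic, assuming a 60 Hz panel (the sample render times are illustrative):

```python
import math

REFRESH_HZ = 60.0
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def displayed_fps(render_ms):
    """Effective frame rate under double-buffered vsync: a frame that
    overruns one refresh interval waits for the next vblank, so output
    snaps to 60, 30, 20, 15... fps on a 60 Hz display."""
    intervals = math.ceil(render_ms / REFRESH_INTERVAL_MS)
    return REFRESH_HZ / intervals

# A 16 ms render displays at 60 fps; 17 ms just misses the deadline
# and displays at 30 fps, which matches the 60-to-30 drops above.
for ms in (10.0, 16.0, 17.0, 34.0):
    print(f"{ms:5.1f} ms render -> {displayed_fps(ms):.0f} fps")
```

Triple buffering or adaptive vsync avoids the hard halving, which may be why the drops feel random rather than sustained.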
 

jonnyapps

Honorable
May 12, 2013
111
0
10,690
What speed is that 8350 tested at? It seems silly not to test it overclocked, as anyone on here with an 8350 will have it at 4.6 GHz or higher.
 

Patrick Tobin

Honorable
Jun 18, 2013
72
0
10,630
Most 780 Ti cards come with 3GB of RAM; the Titan has 6GB. This is an unfair comparison, as the Titan has more than ample VRAM. Get a real 780 Ti or don't label it as such. HardOCP just did the same tests, and the 290X destroyed the 780 because FSAA + ultra textures pushed past 3GB and started causing swapping.
 

tomfreak

Distinguished
May 18, 2011
1,334
0
19,280
If you don't have a 780 Ti or a 780, just show us the stock Titan speed. Why would you rather show us an overclocked Titan than a stock one, and all that without showing the 290X overclocked? In fact, an overclocked Titan does not represent a 780 Ti, because it has 6GB of VRAM, and VRAM is a big deal in Watch Dogs. So your OC'd Titan looks like neither a 780 Ti nor a real Titan.
 

AndrewJacksonZA

Distinguished
Aug 11, 2011
599
124
19,160
Hi Don

Could you please include tests at 4K resolution, using a real 780 Ti and also a 295X2? Could you ask another lab to do it, or get one shipped to you?

+1 also on what @Patrick Tobin said.

I can appreciate that you might've spent a lot of time on this review, and we'd really appreciate you doing this final bit. I know that not a lot of gamers currently game at 4K, but I am definitely interested in it.

Thank you!
 

That_Guy88

Distinguished
Dec 31, 2011
76
0
18,660
I usually use Tom's as my definitive site for performance benchmarks, but the lack of variety in both CPUs and GPUs here is really disappointing, especially with this being the first "next-gen" game.
 
This needs to be redone to see if the game actually needs an 8-thread-capable CPU, so an i5 and an i7 on LGA 1150/1155 should have been included!
TechSpot did include those, and there was no difference between the i5 and the i7, not even the LGA 2011 hexa-core!
 

wtfxxxgp

Honorable
Nov 14, 2012
173
0
10,680
You guys are being a bit unreasonable regarding the inclusion of the Titan OC'd to simulate the 780 Ti - he simply used what he HAD. I think the choice to use medium textures renders the 6GB vs. 3GB VRAM difference mostly moot. This was just to give us an indication; why do people have to get so darned technical all the time? You should really try to wrap your heads around the various scenarios to be tested, and the time it takes to test them, before you give the authors grief about "limited this and limited that". The game looks good. Thanks for the brief review.
 

Empyah

Reputable
Mar 25, 2014
27
0
4,530
First you put the R9 290X (Catalyst 14.6?) without OC against the Titan with OC,
and then the FX-8350 against a freaking i7-3960X and NO OTHER Intel CPU. [edited for language]
For freak's sake, I am really trying to follow you as a serious tech site without bias;
please do not make it any freaking harder for me.
 
I would have liked to have seen more CPUs tested, in particular three that are widely discussed and recommended in the forums: the i5-4670K (or i5-3570K), the FX-6300, and the 760K.
I hope there is a follow-up article focusing on some specific details, including VRAM limitations and more tweaking to see which setting changes most affect not only raw FPS but also smoothness. It looks like some settings lead to a very distracting experience, and it would be nice to know which those are.
Edit: Thanks, Don, for adding the FX-6300 and i5-3550; those are useful numbers to have. Here is one title where the FX clearly beats the i3, so core count must matter.
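On quantifying that smoothness point: one common approach is to look at frame-time percentiles rather than the fps average, since two runs with identical averages can feel completely different. A minimal sketch (the sample traces are made up for illustration):

```python
def frame_time_stats(frame_times_ms):
    """Summarize a frame-time trace: average fps plus the 99th-percentile
    frame time, which exposes the stutter an fps average hides."""
    times = sorted(frame_times_ms)
    avg_ms = sum(times) / len(times)
    # Nearest-rank 99th percentile.
    p99_ms = times[min(len(times) - 1, int(0.99 * len(times)))]
    return {
        "avg_fps": round(1000.0 / avg_ms, 1),
        "p99_frame_time_ms": p99_ms,
    }

# Two traces with the same ~60 fps average but a very different feel:
steady = [16.7] * 100              # every frame on time
stuttery = [14.5] * 99 + [234.5]   # one long hitch per 100 frames
print(frame_time_stats(steady))    # p99 ~16.7 ms: smooth
print(frame_time_stats(stuttery))  # p99 234.5 ms: visible stutter
```

A settings comparison that reported the 99th-percentile frame time alongside average fps would show exactly which options cause the distracting hitches.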

 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
Take a look at the links the OP gave with the 780 vs. the 290X. The 290X lost. Not sure what all the whining is about. The 290X has more RAM than a 780, right? Who cares about anything above this res when only 2% are using over 1080p?

Claiming something wins where 98% of us NEVER play is ridiculous. You want to know who wins in 98% of users' cases. Those fps are too low for me anyway; barely breaking a 30 fps minimum is not enough, and you will see dips even on AMD while playing. They're only showing a snapshot here. HardOCP dropped textures to high in their second test, and NV won. So yeah, if you push things to where we probably wouldn't enjoy it anyway, AMD wins. Yay. But if you play at 1080p, the links above show NV winning, and I think FAR more people are worried about 1080p. Having said that, this game would laugh at my PC... ROFL.
 

InvalidError

Titan
Moderator
Interesting how Don recommends a minimum of an i5 or FX-6300 but did not include those in the CPU scaling benchmarks. There should be an LGA 115x i7 in there too, for a smooth progression from 2C4T to 6C12T - the Extreme CPU has over 3X the i3's raw performance but only manages twice the score, and it would have been interesting to see how much of its benefit comes from extra threads vs. extra cores.
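That over-3X-raw-but-only-2X-realized gap is roughly what Amdahl's law predicts when part of each frame's work is serial. A minimal sketch, where the 90% parallel fraction is an illustrative assumption (not a measured figure for Watch Dogs) and thread count stands in for raw throughput:

```python
def amdahl_speedup(parallel_fraction, n_threads):
    """Speedup over a single thread when only part of the workload
    scales with thread count (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

# Illustrative assumption: 90% of the frame work parallelizes.
P = 0.90
i3_like, extreme_like = 4, 12  # 2C4T vs. 6C12T thread counts
ratio = amdahl_speedup(P, extreme_like) / amdahl_speedup(P, i3_like)
print(f"12 threads vs 4 threads: {ratio:.2f}x realized speedup")
# -> ~1.86x realized from ~3x the threads, in line with the
#    "over 3X raw performance but only twice the score" observation.
```

The same curve is why a smooth 4T -> 8T -> 12T progression would be so informative: it would pin down the game's actual parallel fraction.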
 
I must say, I much prefer Tom's video game reviews to what HardOCP does. As much as I enjoy HardOCP's PSU reviews and believe they are well done, their video game reviews seem devoted strictly to [near] top-of-the-line hardware running at UltraMaxOhWOW! settings that are absolutely irrelevant to the average gamer. I want to see a lot more data points than that. Yes, more would be nice here too, but at least we do get enough data for some interpolation/extrapolation to alternate hardware.
 

neon neophyte

Splendid
BANNED
How are four-core Intel processors, both with and without Hyper-Threading, NOT in the CPU benchmarks? It has been claimed that Hyper-Threading is required for ultra, but this wasn't even tested? For all we know, the high-end Intel processor you did test performs no better than an i5.

Considering this game is CPU-bound, how is there not more comprehensive CPU benchmarking being done? What a waste.
 
I'm on AMD, but there are a LOT of gaming rigs and gamers out there using the i5-3570K/4670K and i7-3770K/4770K. I was surprised not to see these tested; that would have been very useful information for a lot of people. :(

Is there any way the benchmark charts could be updated to show results for these parts?