Benchmarked: How Well Does Watch Dogs Run On Your PC?


InvalidError

Titan
Moderator

Since your chart clearly shows all the top CPUs GPU-bound at 82fps, that severely undermines its value as a CPU benchmark. It is quite likely the 6350, 8350, and i3 results were at least partly GPU-bound as well.

And charts cannot be compared directly unless they cover extremely similar sets of scenes with identical game, driver, and hardware configurations, which is practically never the case. Your chart maxes out at 82fps while max's chart maxes out at 60fps, so there is clearly a major difference in scene and/or settings.
 

boytitan2

Honorable
Oct 16, 2012
Just stop labeling your Titan as a 780 Ti; this is not the first article I have seen it in. For one, it hurts this site's credibility, and for two, you guys know you shouldn't be doing this. The Titan and the 780 Ti have different strengths and weaknesses. I hope to never again see "we OC'd a Titan, so now it's a 780 Ti."
 


Not really. At Ultra preset, R9 270 with 2GB has 33fps min and 38fps avg while R9 280 with 3GB has 31fps min and 40fps avg.
[Image: High-1920-FR.png]

I don't see any benefit, do you?
 
Guest


The chart you've posted is from a rather poorly done benchmark by Techspot, which fails to mention "GPU bottleneck" or "CPU-bound performance", which is obviously what the chart in question shows. This is further emphasized by the following chart from the same page:
[Image: CPU_02.png]

So, with all due respect, his chart offers a much more realistic performance comparison than yours.

 

maxalge

Champion
Ambassador


[Image: Watch_Dogs-CPU-Benchmarks-pcgh.png]


"We settled on the start of the fourth mission (titled "Singapore") which begins on the US vessel Valkyrie as the team walks to an inflatable rib where they have a brief discussion and then jump in before being lowered down. Although the test takes place in the Valkyrie's launch bay, "

I guess on your chart they forgot that a CPU bench needs the CPU to actually be stressed.

There is no gameplay at all in that scene, which explains those pathetic bunched numbers.


I tend to trust pclabs; they do actual benchmarking.

 

Zepid

Honorable
Jan 14, 2014
I have a 780 Ti and my results fall right in line with the Titan ones. However, I'm running a 3770K.

The performance isn't the frustrating part about this game; it's the fact that it BSODs on boot or "randomly" in game. Meanwhile, my system/GPU can pass a 6-hour FurMark stress test and a day's worth of Prime95 without crashing.
 

cleeve

Illustrious


No, that's not the case at all. You *are* getting the nuanced evaluation; I think the problem is that you may be predisposed to believe otherwise.

I'm saying, clearly and definitively, that the FX-8350 will perform on par with a Core i3 (and usually slightly below) in the vast majority of games that *are* processor-bound. I consider GPU-bound results somewhat useless for comparing CPUs, because they only show the margin of error, really. A 2 FPS advantage at 120 FPS isn't a 'win'; it's the margin of error.
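To put numbers on that margin-of-error point, here's a quick sketch (the 2-FPS-at-120 figure is cleeve's example; the comparison spread uses the 82 vs. 60 fps chart maximums mentioned earlier in the thread):

```python
def relative_gap_pct(fps_a, fps_b):
    """Percent advantage of fps_a over fps_b."""
    return (fps_a - fps_b) / fps_b * 100.0

# A "2 FPS win" at ~120 FPS is under 2% -- inside typical
# run-to-run benchmark variance, i.e. the margin of error.
print(round(relative_gap_pct(122, 120), 2))  # -> 1.67

# A genuinely meaningful spread is an order of magnitude larger.
print(round(relative_gap_pct(82, 60), 2))    # -> 36.67
```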

I say this because I've written our game performance analysis articles for almost 10 years now, and Watch Dogs is an extremely rare anomaly. Games tend to favor IPC over multithreading, and Intel's IPC is significantly superior to AMD's.

Multithreading may become more important over the next few years, thanks to the new consoles that have to leverage eight of AMD's Jaguar cores at a relatively low clock, so devs are getting better at it. Watch Dogs may be the harbinger of that. But until it's the rule and not the rare exception, we obviously can't base our recommendations on a single game.
 

cleeve

Illustrious


If you understood how GPUs work, you would know that your statement is untrue.

Our OC'd Titan has *identical* memory bandwidth to the 780 Ti, over the identical memory interface, and with the GPU overclocked appropriately it has practically identical performance.

When it comes to gaming, the functional difference is the size of the frame buffer, which is irrelevant if you're not running ultra-high resolutions or high levels of MSAA, which we didn't for this article. You'll be able to buy a 6GB 780 Ti soon, by the way.

The significant difference between the Titan and the 780 Ti is compute, and we didn't test that here. Bottom line: it's a meaningful comparison, and I'm glad I included it for people who understand its worth. Of course, you have the option to ignore those results on the chart.

 

cleeve

Illustrious


Nope. This game decouples the video memory setting from the detail setting.

You can run the ultra preset at medium textures, which we did across our benchmarks to remove texture memory size as a variable. You can set the texture memory to whatever your card can handle and it won't affect performance as long as you don't overreach.

 

cleeve

Illustrious


- The Titan has the exact same memory interface as the 780 Ti. The only difference is speed, as you say: 6000 MHz vs. 7000 MHz. We overclocked the Titan's VRAM to 7000 MHz. Absolutely, completely identical memory bandwidth.

- The 780 Ti has 2880 CUDA cores, the Titan has 2688 (about 7% more for the 780 Ti). To compensate, we raised the Titan's clock by the same percentage over the reference 780 Ti clock (about 7% higher). The result is practically identical processing throughput.

It's close to perfect.
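As a sanity check, the parity argument can be worked through numerically (a sketch; the 384-bit bus width is the published spec for both cards, and the other figures come from the post above):

```python
BUS_WIDTH_BITS = 384  # both GK110 cards use a 384-bit GDDR5 interface

def mem_bandwidth_gb_s(effective_mt_s):
    """Bandwidth in GB/s from the effective memory transfer rate (MT/s)."""
    return effective_mt_s * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9

# Titan VRAM overclocked from 6000 to 7000 MT/s matches the 780 Ti:
print(mem_bandwidth_gb_s(7000))  # -> 336.0 GB/s for both cards

# Shader throughput scales with cores x clock, so the Titan's core
# clock must rise ~2880/2688 (about 7.1%) over the 780 Ti reference.
TITAN_CORES, TI_CORES = 2688, 2880
print(round((TI_CORES / TITAN_CORES - 1) * 100, 1))  # -> 7.1
```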

Yes, you most definitely can do this in applicable cases if you have an intimate understanding of the products, their relationship, and what you're doing. It's a method we've proven in the lab time and time again by comparing slightly different models, and it works in appropriate cases if you test carefully, which we did.

Once again, we've made it clear it's an overclocked Titan; we never hid that from you. If you don't agree with the principle, you certainly have the option to consider it an overclocked Titan result and nothing more, but for those of us who understand its value, it's a great data point.
 

InvalidError

Titan
Moderator

The amount of memory the game thinks it is using does not necessarily match the amount the drivers are actually allocating. If I were writing drivers to optimize GPU performance, I would duplicate the most frequently accessed textures across multiple memory channels whenever I had spare memory, to maximize availability. So if a game requests 1.5GB of texture memory and I happen to have 6GB available across three channels, I would end up duplicating most textures at least once, and the back-end VRAM usage might actually be 3-4GB depending on how much RAM is reserved for other stuff.

I would be very surprised if neither AMD nor Nvidia did this, since it is a very old trick.
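InvalidError's hypothetical can be sketched like this (illustrative only; the policy and function are invented for the example, not taken from any real driver):

```python
def backend_vram_gb(requested_gb, total_vram_gb, channels, reserved_gb):
    """Hypothetical driver policy: duplicate hot textures across memory
    channels whenever spare VRAM allows, up to one copy per channel, so
    the back-end allocation can be a multiple of what the game asked for."""
    spare = total_vram_gb - reserved_gb - requested_gb
    extra_copies = min(channels - 1, int(spare // requested_gb))
    return requested_gb * (1 + max(0, extra_copies))

# The example from the post: game requests 1.5GB, card has 6GB over
# 3 channels. The back-end figure depends on how much is reserved:
print(backend_vram_gb(1.5, 6.0, 3, reserved_gb=2.0))  # -> 3.0
print(backend_vram_gb(1.5, 6.0, 3, reserved_gb=1.0))  # -> 4.5
```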
 

cleeve

Illustrious


You will be fine at the low detail preset.

At that res you might even be able to bump it up to the medium preset and still keep the minimum FPS above 30. Medium looks much nicer than low.
 
Aug 23, 2013

Thanks! :D
but, how many fps will i get at high settings (not ultra)?
 

Achoo22

Distinguished
Aug 23, 2011
Will my pc handle this at 900p (1440x900)? and at how many fps on what settings?
Intel Core i5 3470
8GB RAM
GTX 650 1GB GDDR5

I have an i5 2500 w/ 4GB RAM and a HD6850, and the game runs like an absolute dog for me. Even dropping down from my normal resolution of 1600*900 to 1280*720 with everything set as low as it will go leaves chase scenes feeling awful. Will report back tomorrow after upgrading to 16GB RAM, if I see major improvement (though I'm honestly not really expecting it).
 
Aug 23, 2013

well, you'd need everything new (CPU, GPU, motherboard, and RAM), you know. but if you can manage that with 16GB of RAM, tell me your specs :D
 

Achoo22

Distinguished
Aug 23, 2011


4GB-8GB-16GB has zero effect in this game. System RAM rarely affects frame rates; it usually affects load times.

If you think you need more info, tell me your specs and monitor and I can tell you what you need to run it smoothly at max settings.

That is a very generous offer, thank you. I'm running an HD 6850 on an i5-2500. Upgrading to the beta drivers you mentioned in the article, dropping everything to the lowest possible settings at 720p, and shutting down every nonessential background task made the game playable, but I'm still seeing severe hitching. Especially while driving, the action periodically grinds almost to a complete standstill.

I notice that when this happens, my HD activity light is glowing like the sun, and firing up Resource Monitor does show a ton of hard page faults, so I decided to bite the bullet and pick up more RAM to see if it helps. I do wish I had seen your message before placing my order, but maybe the extra headroom is still a good idea; I originally planned to hold off on any upgrades at all until rebuilding my next system.
 

cleeve

Illustrious


Probably unplayable. Keep in mind, though, that this game looks OK on low; medium is the biggest step up in graphics.

High and Ultra are relatively subtle improvements in comparison, but they use a lot more processing power.

 
Aug 23, 2013

okay. thanks! gonna buy this game after school year 2014-2015 (i'm only 15 :p )
 

Achoo22

Distinguished
Aug 23, 2011


I considered a full platform upgrade, too. But my CPU still has life in it, I think, and my 6850 generally performs pretty well on my sub-1080p display. Are you suggesting that after bumping the RAM and perhaps grabbing a 760 or 770, I wouldn't be able to blaze through any game likely to release in the next couple of years?
 

cleeve

Illustrious


1. Well, your CPU is great. An i5-2500 is practically indistinguishable from a new i5-3650 from a gaming perspective in a real-world scenario.

2. Your graphics card is definitely the obvious weak link, but...

3. Your inability to run it at 720p on low settings smoothly is really disturbing. Even a Radeon 7770/R7 250X could handle 1080p/low details in our tests, and your 6850 should be on par with that at least. Something seems off.

If the RAM solves your problems I'll be surprised, but please let us know if it does. I would have prescribed a better GPU, but I never ran the game with less than 8GB of system RAM, so that might do it. If not, I'd say something is wrong with your system, and I'd try a graphics card swap (even if only temporarily, to test) to see if there's something not quite right with your Radeon.



 
Aug 23, 2013

you can still blaze with a 760; it maxes out any game at 1080p (and especially sub-1080p like mine). for future-proofing (lasting almost 10 years, i guess) a 770 would be better, unless you upgrade to a new monitor.
 

somebodyspecial

Honorable
Sep 20, 2012


The game prefers NV, where 98% of us play. It prefers AMD only when you go so far that NV's memory becomes an issue, but at that point the fps isn't fast enough to really play anyway. Then again, NV would probably say their driver is better where we all play; they've been bragging about this driver and DX11 for a while now, and that driver is now WHQL. If AMD improves performance in the game and catches them, I guess it's the driver. If that never happens, I guess the game just loves NV (which shouldn't be a surprise, since they bundle the game with their hardware).

Personally, I'd like to see the drivers tested against each other in some games (780 Ti vs. 290X) and compared to an older driver from both vendors, to see who's really doing their homework in the driver department. That is more interesting to me than ONE game. We don't get enough driver-version tests yearly, and it seems pretty important when they're claiming a lot in a particular rev. If true in this case, NV has taken out Mantle's best-case scenario (Star Swarm) with a driver update, and they claim a lot of games improved, so why not test that rather than a beta game at launch?
 

Achoo22

Distinguished
Aug 23, 2011


Thanks for the input. I agree with your assessment, and was pretty shocked that performance only picked up after dropping the resolution and details so low (the 6850 isn't great, but it's a cut above a budget card, and for sub-1080p it is generally quite sufficient). I'll report back tomorrow after installing more RAM, but the only other thing I can think of is the difference in streaming from disk: both my swap and the game files reside on a hybrid drive versus the blazing-fast SSD you guys advocate.

I'd love to see some input (or better yet, benchmarks) from someone else with a similar system (Sandy Bridge + 6850/6870 + 4GB /can't/ be that unusual a config, can it?).
 

h2323

Distinguished
Sep 7, 2011
Running a rig with a 4670K @ 4.0GHz, 2x 270X Toxic, and 16GB of 2133 RAM.
I can play on Ultra (1080p) at around 30 fps, hitting the 40s with CrossFire enabled -__-. The 14.6 update didn't really help. Monitoring GPU usage, it sits between 50-70% for each GPU, so scaling is bad.
And DAMN, the stuttering is bad. I'm considering waiting for more updates to play this game smoothly, but I may get impatient and play anyway lol.


Because in the majority of modern games, it does not perform anywhere near a Core i5. It usually performs in the neighborhood of the Core i3.

Watch Dogs is an outlier, and the Crysis II link you provided is obviously bottlenecked by the GPU, so we're talking about insignificant differences within the margin of error there. In that same chart, the Core i5-2500 beat the 8350 by a couple FPS out of about 120. It's limited by the graphics card.

If Watch Dogs were the only game on earth, the recommendations would be different, but we have to generalize.
In the future, if more heavily threaded games arrive, that might change too.


Actually, the majority of modern games are now multithreaded, along with new software in general. The FX-8350 has been remarkably resilient to time; its standing has improved as it gets included in benchmarks for new hardware, since new software and new games are used to bench new hardware. For those who have put in the time to find accurate information, the FX-8350 is on par with an i5-3570K and can compete with an i7-3770K, especially when overclocked, and it is also a superior multi-tasker. Not putting it on the charts alongside i7s up to the 3770 is an opinion that has more to do with it not being what people want it to be than with seeing it for what it is.