News: Watch Dogs Legion Benchmarked: Seriously Demanding

Just seems like a highly bloated engine that was rushed to release to counter-program Cyberpunk.
 
Seems more like seriously unoptimized.
I really dislike claims like this.

It's an open-world game with hundreds (perhaps thousands) of active vehicles and people running around the city, following schedules. I'm sure there's some behind-the-scenes stuff to keep things from bogging down too much, but the more realistic a simulation becomes, the more demanding it is on the CPU. Visually, there's also a lot going on with a relatively faithful recreation of London. It's not photo-realistic, but all the textures, shadows, lighting, reflections, etc. all tax your GPU -- and that's before enabling ray tracing.

What 'unoptimized' part of the game needs to be fixed? Should Ubisoft cut down the number of AI entities to reduce the CPU load? Drop the geometry detail to make it run better? It could, but this isn't the type of game that needs 240 or even 144 fps. It's very playable at anything above around 50 fps, with 60 being more than sufficient. Basically, if you think it's "unoptimized" because it can't run at max details and 150 fps or whatever, that's not the intended design. 4K60 is possible on a variety of cards and settings, just not on midrange stuff. My expectation is that, with the next generation of consoles, this will basically become the new baseline for graphics and CPU requirements.
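To make the CPU-scaling point concrete, here's a minimal toy sketch in plain Python. The agent structure and update rule are invented for illustration and have nothing to do with Ubisoft's actual engine; the point is simply that every active entity needs at least one update per frame, so per-frame cost grows linearly with the entity count:

```python
# Toy sketch (NOT Ubisoft's code): per-entity simulation cost scales
# linearly with the number of active NPCs/vehicles.
import random

def make_agents(n, seed=0):
    """Create n agents with random positions and headings."""
    rng = random.Random(seed)
    return [{"x": rng.uniform(0, 100), "y": rng.uniform(0, 100),
             "vx": rng.uniform(-1, 1), "vy": rng.uniform(-1, 1)}
            for _ in range(n)]

def update_agents(agents):
    """Advance every agent one tick: move along its heading, then make
    a cheap 'schedule' decision. Cost is O(n) per frame, so doubling
    the entity count roughly doubles the CPU time spent here."""
    for a in agents:
        a["x"] += a["vx"]
        a["y"] += a["vy"]
        # stand-in for schedule/AI logic: occasionally pick a new goal
        if random.random() < 0.01:
            a["vx"], a["vy"] = random.uniform(-1, 1), random.uniform(-1, 1)
    return agents

crowd = make_agents(1000)
update_agents(crowd)  # one simulated frame for 1,000 entities
```

Doubling the crowd size doubles the work in that loop, before any rendering happens, which is why entity count hits the CPU first.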
 

I totally agree. Although I'd rather 60 over 50fps 😛.
 
My point was that 60+ was the target, rather than some extreme framerate like 200 fps. It's not an FPS, basically, so 60 and above are sufficient. And dropping a few settings (to stay under your GPU's VRAM limit) definitely helps the 2060 and 5600 XT. I just didn't show every possible configuration and result.
 
Problem is, the performance often tanks while your CPU and GPU are nowhere near fully utilized.
I guess you could argue it's buggy instead of poorly optimized.
 
I actually haven't noticed much in the way of major drops in performance, but then I'm running a relatively high-end setup. There are so many factors that can cause issues. Is it loading data off an HDD and stuttering, is it running out of RAM, is it driver cruft, some other background application, or one of dozens of other possibilities? I fully expect a game like this to struggle on a 4-core CPU, particularly a 4-core/4-thread i5 (7600K and earlier). I realize just throwing hardware at the problem isn't sound advice either, but it's amazing how often that fixes any issues. Major games are always buggy at launch, though, and things will likely improve quite a bit in the coming months. That's the problem with the holiday rush: everyone trying to get games released before Christmas / Black Friday, which inevitably leads to crunch, bugs, and other issues.
 
Any idea on why the game is using between 85-90% of my CPU all the time in the game? Here's my config: i9-9900K 3.6GHz, EVGA RTX 2080 Super XC Ultra 8GB, 32GB of RAM at 3000MHz. Everything is set on high or very high and I'm getting between 70-90 fps. I checked the task manager and it looks like only the game is using the CPU when I'm playing. I am using Windows Game Bar to check the performance of the CPU, GPU and RAM. Also, Performance mode is enabled on Windows 10. I would really like to play the game, but I don't want to risk my CPU. Do you think the Game Bar readings are incorrect? Also, ray tracing is turned off, and the problem is really not my GPU nor RAM but the CPU; it's really weird. It's the only game it happens on. Sorry for my English btw
 
I'd look at what's using the CPU. I just did a quick check, and with 2080 Ti and 9900K running 1440p very high settings, I'm seeing average CPU load of just 45%. CPU temperatures are also 47C, with an AIO cooler, and clockspeed on the CPU is a steady 4.7GHz. If your CPU is clocking down that would increase the load, so check that as well.
 
Nothing is using the CPU other than the game. I checked the task manager and only Watch Dogs is using the CPU at max %. It doesn't do that on other games. I tested Destiny 2 and it was using only 20-40% of the CPU with all of the settings at ultra. Why would the CPU clock down, and how can I prevent it from doing so? Also, I saw that I could set my PC to direct all of its resources to the game when I am playing. I tried it again and my CPU isn't clocking down; it's always at 4.5GHz. When I'm in the menu everything is fine, and the temperature of the CPU is fine too. For some reason it starts when I am in the game: the workload jumps to 100%, but the game is the only thing running, even when it's set at high priority.
 
Downclocking would happen either because the CPU is getting too hot, or possibly because you're running a balanced or power-saving power profile. The best way to figure it out is to run something like HWiNFO64 and log all activity while playing the game. That's what I did to get the numbers above.
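The log-while-playing approach can also be sketched in a few lines of Python. This is a minimal sketch, not a replacement for HWiNFO64: it uses the third-party psutil library if present (psutil.cpu_percent and psutil.cpu_freq are real psutil calls) and falls back to dummy values otherwise, so only the logging structure is being illustrated:

```python
# Sketch: periodically sample CPU load/clock and append to a CSV,
# so you can correlate spikes with what was happening in-game.
import csv
import time

try:
    import psutil  # third-party; pip install psutil

    def sample():
        freq = psutil.cpu_freq()  # may be None on some platforms
        return {"cpu_percent": psutil.cpu_percent(interval=None),
                "cpu_mhz": freq.current if freq else None}
except ImportError:
    def sample():
        # placeholder values when psutil is unavailable
        return {"cpu_percent": 0.0, "cpu_mhz": None}

def log_samples(path, n_samples, interval_s=1.0, sampler=sample):
    """Write n_samples rows of (timestamp, cpu %, MHz) to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["t", "cpu_percent", "cpu_mhz"])
        writer.writeheader()
        for _ in range(n_samples):
            writer.writerow({"t": time.time(), **sampler()})
            time.sleep(interval_s)

# e.g. log 600 one-second samples (~10 minutes of gameplay):
# log_samples("wdl_log.csv", 600)
```

Afterward, look for rows where cpu_percent spikes while cpu_mhz drops, which would point at thermal or power-limit throttling.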
 
I added this part to the last message: I tried it again and my CPU isn't clocking down; it's always at 4.5GHz. When I'm in the menu everything is fine, and the temperature of the CPU is fine too. For some reason it starts when I am in the game: the workload jumps to 100%, but the game is the only thing running, even when it's set at high priority. The game crashes now when I'm running it in windowed mode with HWiNFO64, so I can't get the readings. I'm sure the game is the problem. I've tried like 5 heavy-workload games and it's the only one that does that. I don't understand why it's doing this, because when I am in the game's menu everything is fine, but when I load it up, that's when the problem starts appearing.
 
I don't understand why it's doing that, because when I am in the game's menu everything is fine, but when I load it up, that's when the problem starts appearing. The CPU isn't clocking down; it's a stable 4.5GHz, and all cores are enabled. Is it possible for the game to be running only on 4 cores instead of 8 or something?
 
What are your full system specs, are you running anything overclocked, and have you done a proper clean driver installation? By that last, I mean running the latest version of Display Driver Uninstaller, wiping out all Nvidia GPU drivers from the PC, and then downloading and installing the latest 457.09 drivers. (No, installing those drivers and checking the "clean install" option isn't quite the same -- that can leave behind some bits and pieces that may impact performance.) And besides drivers and CPU load, you need to look at temperatures and clock speeds. Verify the CPU and GPU clocks are running as expected, and that temperatures aren't exceeding 75-80C at most (and hopefully less than 70C for at least a while). It's also possible some other task running on the PC is interfering in some unforeseen way.
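On the "only 4 cores" question: per-core utilization (Task Manager's logical-processor view, or HWiNFO64's per-core graphs) shows whether the load is spread across all 16 threads of a 9900K or pegged on a few. Here's a tiny sketch of that check, with made-up load numbers; the function and the 0.9 threshold are illustrative, not a standard diagnostic:

```python
# Sketch: is CPU load spread evenly, or concentrated on a few cores
# (which would suggest a main-thread bottleneck rather than true
# all-core usage)?

def load_concentration(per_core_loads, top_n=4):
    """Fraction of total CPU work done by the top_n busiest cores.
    Values near 1.0 mean the load is concentrated on few cores."""
    total = sum(per_core_loads)
    if total == 0:
        return 0.0
    busiest = sorted(per_core_loads, reverse=True)[:top_n]
    return sum(busiest) / total

# A 9900K (8C/16T) with 4 threads pegged and the rest nearly idle:
skewed = [100, 100, 100, 100, 5, 5, 5, 5, 0, 0, 0, 0, 0, 0, 0, 0]
# The same total-ish load spread evenly across all 16 threads:
even = [55] * 16
```

If your per-core readings look like `skewed` (concentration near 1.0), the game's heavy threads are the limit; if they look like `even`, it genuinely is using the whole CPU.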
 
Demanding games are good. I like seeing all that high-end PC hardware put to use -- especially the CPU for controlling all those NPCs. Any title that tries to advance the capabilities of NPC AI deserves praise. The PC platform desperately needs a new Crysis-like game. Looks like the 10-series cards should also handle Legion well...assuming a 1060 or better.

However, I just cannot get into Ubisoft open world games anymore. They all feel exactly the same to me. Yeah, they look different, but the core gameplay loop is identical.
 

Sorry man, but this is way off base. The optimization issues with 3000-series cards are already widely documented. Like many journalism outlets, you've written an article based off the in-game benchmark, which is highly unrepresentative of what people are seeing in-game.

Driving scenes on 3080s are regularly tanking FPS down into the low 30s to 40 fps. GPU utilization shows that the system is being inefficient. You might want to check some other threads on Reddit to see what users are seeing.

View: https://www.reddit.com/r/pcgaming/comments/jk21s8/warning_watch_dogs_legion_currently_has_terrible/


This game is incredibly unoptimized, and users are seeing poor performance no matter what ray tracing settings they are using. It's almost hilarious that Nvidia decided to bundle this game with their 3000-series cards considering the unoptimized performance.

On a side note, I recommend you take your readers' comments to heart as opposed to getting defensive about them. Do some research. People who are reporting poor performance on 3000-series cards aren't seeing the same problems with plenty of other demanding open-world games.
 
And yet, on my test PC, the game runs pretty much as expected based on the built-in benchmark. Which is to say, it's "seriously demanding" and will require balancing your choice of settings with your hardware. Yes, performance while driving around is lower on average than in the built-in benchmark. That's because it's an open-world game and it has to stream in new assets -- not just new objects, but new AI entities for everything that comes into view, which means higher CPU use. At maxed-out settings with ray tracing, even the 3090 can dip below 60 fps at times.

The question is, does that make the game unoptimized, or is it just settings that are too taxing to hit high fps? Which aspects specifically need optimization? The AI? The ray tracing? The level of detail? All of those can be turned down to improve performance. The idea that every game should be able to run at maximum settings and 4K on hardware that's available today is a philosophical stand and not one that has anything at all to do with being "optimized" or not. Many games have settings that are well beyond what current PCs can handle. The solution of course is to not use those settings until your PC has been upgraded.

Ubisoft could easily cap maximum quality so that nothing beyond the current high settings exists. That would boost performance by around 25%. Is the game now optimized? Nope: the settings are just different. Which is why, as I said in the post you responded to, I hate blanket claims of poor optimization. It's an easy claim to toss out, and it's basically impossible to prove or disprove: to prove a game is poorly optimized, you'd have to actually fix the performance so that it looks the same but runs much better, and only the devs can do that.

It's more of an opinion that can't be wrong or right. "Blue is the best color!" "All games should run at over 100 fps with maxed out settings!" "Brownie root beer is the best tasting root beer on the planet!" You're not wrong if you have any of those opinions, but people who disagree are also not wrong.

Claiming a game is unoptimized is right in the same class as saying I need to take reader comments to heart and not be defensive. What am I defending in the original post? Ubisoft? The article? I said I didn't like blanket claims of 'unoptimized' -- particularly when there's almost no backing data. I've provided data in the article, with explanations of where that data comes from. I took a stance and am debating that stance, which you didn't try to respond to; instead, you went on the offensive.

Again: Poor performance at max settings does not inherently equate to a game being unoptimized. It equates to the game being demanding.
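One way to quantify the difference between "demanding but playable" and "tanking while driving" is to look at 1% lows instead of averages. A small sketch of that arithmetic (defining the 1% low as the fps implied by the mean of the slowest 1% of frames is one common convention, not the only one):

```python
# Sketch: average fps hides occasional hitches; 1% lows expose them.

def fps_stats(frame_times_ms):
    """Return (average fps, '1% low' fps) for a list of frame times.
    '1% low' = fps implied by the mean of the slowest 1% of frames."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)  # slowest 1%, at least one frame
    low_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms, 1000.0 / low_ms

# 99 smooth 16 ms frames plus one 50 ms hitch (e.g. streaming in new
# assets while driving): the average barely moves, the 1% low craters.
smooth = [16.0] * 100
hitchy = [16.0] * 99 + [50.0]
```

Running `fps_stats` on `hitchy` still averages above 60 fps, but the 1% low drops to 20 fps, which is exactly the "benchmark looks fine, driving feels rough" pattern being debated.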
 
Looking at that thread some more and seeing people complain about other Ubisoft titles having performance issues made me scratch my head a little. I've put 40+ hours into The Division 2. I've played a lot of Watch Dogs 2 (at least 10+ hours). I've put a lot of time into Ghost Recon: Wildlands (didn't pick up the new one, though). If people's expectations are high, then I don't know what to tell them. But if their expectation is 60 fps at 1440p, then I've been able to maintain that and more, consistently.

So I don't know, maybe I'm the real Tech Jesus since I never seem to have problems.
 