Sorry man but this is way off base. The optimization issues with 3000-series cards are already widely documented. Like many journalism outlets, you've written an article based on the in-game benchmark, which is highly unrepresentative of what people are actually seeing in game.
Driving scenes on 3080s regularly tank FPS down to the low 30s to 40 fps. GPU utilization shows the system is being used inefficiently. You might want to check some other threads on Reddit to see what users are reporting.
https://www.reddit.com/r/pcgaming/comments/jk21s8/warning_watch_dogs_legion_currently_has_terrible/
This game is incredibly unoptimized, and users are seeing poor performance no matter what ray tracing settings they use. It's almost hilarious that Nvidia decided to bundle this game with their 3000-series cards considering the unoptimized performance.
On a side note, I recommend you take your readers' comments to heart as opposed to getting defensive about them. Do some research. People on 3000-series cards who are reporting poor performance aren't seeing the same problems in plenty of other demanding open-world games.
And yet, on my test PC, the game runs pretty much as expected based on the built-in benchmark. Which is to say, it's "seriously demanding" and will require balancing your choice of settings against your hardware. Yes, performance while driving around is slower on average than in the built-in benchmark. That's because it's an open-world game that has to stream in new assets: not just new objects, but new AI entities for everything that comes into view, which means higher CPU use. At maxed-out settings with ray tracing, even the 3090 can dip below 60 fps at times.
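For what it's worth, here's a quick sketch (made-up frame times, not data from my testing) of why a benchmark-style average can hide exactly the kind of dips people are describing:

```python
# Illustrative only: invented frame times, not measured data.
# An average fps number can look fine while traversal hitches
# drag the 1% lows down into the 30s.

frame_times_ms = [14.0] * 950 + [33.0] * 50  # mostly smooth, plus streaming hitches

def avg_fps(times_ms):
    return 1000.0 * len(times_ms) / sum(times_ms)

def one_percent_low_fps(times_ms):
    # average fps across the slowest 1% of frames
    slowest = sorted(times_ms, reverse=True)[: max(1, len(times_ms) // 100)]
    return 1000.0 * len(slowest) / sum(slowest)

print(f"average: {avg_fps(frame_times_ms):.1f} fps")              # ~66.9 fps
print(f"1% lows: {one_percent_low_fps(frame_times_ms):.1f} fps")  # ~30.3 fps
```

Both numbers come from the same run; which one you quote changes the story, which is why I report how the data was gathered.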
The question is, does that make the game unoptimized, or are the settings just too taxing to hit high fps? Which aspects specifically need optimization? The AI? The ray tracing? The level of detail? All of those can be turned down to improve performance. The idea that every game should be able to run at maximum settings and 4K on hardware available today is a philosophical stance, not one that has anything to do with being "optimized" or not. Many games have settings that go well beyond what current PCs can handle. The solution, of course, is not to use those settings until you've upgraded your PC.
Ubisoft could easily dial back the maximum quality preset so that nothing beyond the current high settings exists. That would boost performance by around 25%. Is the game now optimized? Nope: the settings are just different. Which is why, as I said in the post you responded to, I hate blanket claims of poor optimization. It's an easy claim to toss out, and it's basically impossible to prove or disprove, because to prove a game is poorly optimized, you'd have to actually fix its performance so that it looks the same but runs much better, and only the devs can do that.
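To put rough numbers on that hypothetical 25% (my ballpark, not a measured figure), the frame-time math looks like this:

```python
# Hypothetical arithmetic for the rough 25% figure above; nothing measured.
fps_before = 48.0                        # a hypothetical dip while driving
fps_after = fps_before * 1.25            # a 25% fps boost lands at 60 fps

# The same boost in frame-time terms: 25% more fps = 20% less time per frame.
frame_time_before = 1000.0 / fps_before  # ~20.8 ms
frame_time_after = 1000.0 / fps_after    # ~16.7 ms

print(f"{fps_before:.0f} -> {fps_after:.0f} fps")
print(f"{frame_time_before:.1f} -> {frame_time_after:.1f} ms per frame")
```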
It's more of an opinion that can't be right or wrong. "Blue is the best color!" "All games should run at over 100 fps with maxed-out settings!" "Brownie root beer is the best tasting root beer on the planet!" You're not wrong if you hold any of those opinions, but people who disagree aren't wrong either.
Claiming a game is unoptimized is in the same class as saying I need to take reader comments to heart and not be defensive. What am I defending in the original post? Ubisoft? The article? I said I didn't like blanket claims of 'unoptimized', particularly when there's almost no backing data. I've provided data in the article, with explanations of where that data comes from. I took a stance and am debating that stance, which you didn't try to respond to; instead you went on the offensive.
Again: Poor performance at max settings does not inherently equate to a game being unoptimized. It equates to the game being demanding.