I may have to carry a PC downstairs -- or carry a monitor upstairs -- to do UW testing. I've got an ultrawide monitor, just not in my office. Usually, UW performance is about 15% lower than at the equivalent WS resolution (give or take 5%).
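For a rough sense of why, assuming the comparison is 3440x1440 ultrawide against 2560x1440 widescreen (my assumption for illustration, not a statement of the exact test setup), the pixel-count math works out to:

$$\frac{3440 \times 1440}{2560 \times 1440} = \frac{3440}{2560} \approx 1.34$$

So UW pushes roughly 34% more pixels, but frame rates rarely drop in direct proportion to pixel count, since some of the per-frame work doesn't scale with resolution. That's consistent with the 15% or so figure.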
The idea that a game must do 60 fps at 1080p medium or high or whatever settings on whatever hardware people arbitrarily define is ludicrous. So is the idea that Nvidia is intentionally trying to reduce performance on older GPUs. I've looked into that before, and I've never found a clear instance where an existing game's performance on older generation hardware dropped over time with newer drivers. That doesn't mean new games won't perform worse relative to old games, but that's because the technology marches onward.
Turing supports concurrent FP32 and INT32 execution, which can be a big deal. In fact, it is a big deal in a lot of games. Older games might only improve by 10-15 percent thanks to concurrent FP + INT, but some games run up to 35 percent faster. The more complex the shader code becomes, the more likely the FP+INT support is to help. Now add in the fact that this is a DX12 game, and Pascal was never quite as good at generic DX12 code as AMD's GPUs or Turing. Turing has architectural updates that help it do better in DX12 mode, in part because DirectX Raytracing requires DX12 (the oddball Crysis Remastered being the exception), so the architecture had to be built with DX12 in mind.
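To make the concurrent FP + INT point concrete, here's a minimal CUDA sketch of the kind of mixed workload involved. It's a hypothetical kernel I made up for illustration -- not anything from the game or from Nvidia -- showing integer index/address math interleaved with floating-point math. On Pascal, the INT32 instructions compete with FP32 work for the same execution resources; Turing has a separate INT32 datapath, so the two can run side by side.

```cuda
// Minimal sketch (hypothetical kernel, purely for illustration) of a mixed
// FP32 + INT32 workload, the kind of pattern where Turing's separate INT32
// pipe can help: integer index hashing feeding floating-point math.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void mixed_fp_int(const float* __restrict__ in,
                             float* __restrict__ out,
                             int n, int stride, int mask)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // INT32 work: index/address arithmetic, common in complex shaders
    int idx = ((i * stride) ^ mask) & (n - 1);

    // FP32 work: the math that actually produces the result
    float v = in[idx];
    v = v * 1.5f + 0.25f;
    v = v * v - 0.125f * v;

    out[i] = v;
}

int main()
{
    const int n = 1 << 20;  // 1M elements, power of two so the & (n - 1) trick works
    float *in, *out;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = i * 0.001f;

    mixed_fp_int<<<(n + 255) / 256, 256>>>(in, out, n, 7, 0x5A5A5A);
    cudaDeviceSynchronize();

    printf("out[123] = %f\n", out[123]);
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

The more a shader leans on that kind of integer bookkeeping (buffer indexing, bitfield unpacking, and so on), the more the concurrent execution pays off, which fits the 10-35 percent spread above.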
Those two aspects are more than enough reason for Turing generation GPUs to perform better than Pascal generation GPUs. Could further code optimizations improve performance on Pascal? Absolutely. But each GPU architecture needs somewhat different code to reach maximum performance, and at some point developers have to draw the line. Could Nvidia have tried to help CDPR get Pascal GPUs to work better? Yes. Maybe they even did, but the mix of shaders and graphics effects is simply too much for the older GPUs. The same thing happened with the GTX 900 series, the GTX 700 series, and so on.