To my knowledge, the only thing that changes when you switch DLSS quality presets is the pre-DLSS rendering resolution (feel free to correct me if I'm wrong). Assuming you pick your output resolution and DLSS preset so that the pre-DLSS rendering resolution is the same in both cases, the GPU still has to do more AI work to finalize the image when the target resolution it upscales to is higher. In theory, then, you might get a slightly higher frame rate by using a lower output resolution with DLSS set to Quality.

In practice, you'd need to test it to know for sure. It's quite possible that DLSS upscaling is pipelined so that it runs in parallel with other rendering, since it uses a different part of the GPU (the tensor cores), in which case the target resolution might not make any measurable difference. And even if it doesn't run in parallel, I can't imagine the performance difference would be large enough to justify the quality loss of upscaling to a non-native resolution.

I also feel like upscaling to a non-native resolution in a game sort of defeats the purpose of upscaling in the first place, so personally I'd choose native resolution with DLSS set to Performance any day of the week. But that's just my opinion.
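For context on the premise, here's a minimal sketch of how the presets map output resolution to internal render resolution, assuming the commonly cited DLSS 2 per-axis scale factors (actual values can vary by game and DLSS version, so treat the numbers as illustrative):

```python
# Commonly cited DLSS 2 preset scale factors (assumed for illustration;
# games and newer DLSS versions may use different values).
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7% of output resolution per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(output_w: int, output_h: int, preset: str) -> tuple[int, int]:
    """Pre-DLSS render resolution for a given output resolution and preset."""
    s = DLSS_SCALE[preset]
    return round(output_w * s), round(output_h * s)

# Native 4K output with Performance vs. 1440p output with Quality:
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(2560, 1440, "Quality"))      # (1707, 960)
```

As the example shows, combinations like 4K + Performance and 1440p + Quality render internally at roughly, but not exactly, the same resolution, which is why you'd have to match them up carefully before comparing frame rates.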