Most actual experts (by that I mean people with scientific, medical, or scholarly training in this area) generally agree that the ability to detect changes in smoothness disappears at around 90 Hz, with some especially sensitive individuals MAYBE able to detect changes in smoothness or flicker up to around 120 Hz/fps. Beyond that, it's mostly a question of how many fps you can actually sustain, whether you can sync to your target refresh rate, and so on, in order to avoid tearing.
The bottom line is that with a 144 Hz display running at 144 fps, there is no way you could detect the difference between that and 170 Hz/fps. I don't care what anybody says; scientifically it isn't possible unless something undesirable like tearing or stuttering is going on, and if everything else is operating properly, that shouldn't be happening anyway. If it is, it's likely not due to the change in monitor frequency but is hardware or settings related.
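For perspective on how small that gap actually is, here's a quick back-of-the-envelope frame-time calculation in Python (just illustrative arithmetic, nothing from any benchmark):

```python
# Per-frame display time at a given refresh rate, in milliseconds.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

t144 = frame_time_ms(144)  # ~6.94 ms per frame
t170 = frame_time_ms(170)  # ~5.88 ms per frame

print(f"144 Hz: {t144:.2f} ms/frame")
print(f"170 Hz: {t170:.2f} ms/frame")
print(f"Difference: {t144 - t170:.2f} ms")  # ~1.06 ms
```

In other words, going from 144 Hz to 170 Hz shaves only about a millisecond off each frame, which is the kind of difference the research above says nobody is perceiving as added smoothness.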
30 fps? 60 fps? If you've ever debated framerates, the cognitive researchers we spoke to have some complex answers for you.
www.pcgamer.com
The bigger concern, maybe, is that moving to a 4K panel means (if using the entire screen to game, i.e., non-windowed mode) you'll have an additional 921,600 pixels to drive compared to dual 1440p displays, and much more than that if you were only gaming on one of the 1440p monitors. That could mean a serious hit to performance, although Overwatch is obviously potato territory, so I'd think you still ought to do quite well even at 4K; you may just have trouble staying near 144 fps depending on what settings you generally run. And then again, you might not. I've never used an RTX 3080 for Overwatch, and I'm not seeing any reputable 4K Overwatch benchmarks with an RTX 3080, so I can't say for sure one way or the other.
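If you want to check the pixel math yourself, it's straightforward (standard 3840x2160 for 4K and 2560x1440 for 1440p are assumed here):

```python
# Pixel counts for the resolutions being compared.
uhd = 3840 * 2160  # 4K: 8,294,400 pixels
qhd = 2560 * 1440  # 1440p: 3,686,400 pixels

# 4K vs. two 1440p panels combined, and vs. a single 1440p panel.
print(f"4K vs dual 1440p:   {uhd - 2 * qhd:+,} pixels")  # +921,600
print(f"4K vs single 1440p: {uhd - qhd:+,} pixels")      # +4,608,000
```

So relative to a single 1440p monitor, 4K is roughly 2.25x the pixels per frame, which is where the potential fps hit comes from.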