Why would you test VRSS in situations where you are always hitting 100% of your target frame rate *without* VRSS enabled?
If it's being pitched as an optimization, then it shouldn't hurt frame rates. That's just baseline testing. Once that had been established, it would make sense to see how much it could help in cases where the baseline configuration was falling short.
However, the next test I'd like to see is on the titles Nvidia recommends trying it with. That should show roughly the maximum benefit it can provide. From there, the reader would know how much it can help, how much it can hurt, and that it should be used with care. Or, if it hurts even in the best case, then the takeaway is simply not to use it (unless/until an improved version is released).
Overhead tests seem to be of minimal value, because why would anyone use VRSS if they always hit 90 Hz?
Because it's VR, where frame rate is king. You want the best quality possible, but not at the expense of frame rate. The underlying concept actually makes a lot of sense.