Tested: Nvidia’s Variable Rate Supersampling Doesn’t Deliver on its Promises (Yet)

computerguy72

Distinguished
Sep 22, 2011
190
1
18,690
I don't understand what your article is intended to prove. Why would you test VRSS in situations where you are always hitting 100% of your target frame rate *without* VRSS enabled? Doesn't that completely defeat the purpose of the test?

It's as if you are attempting to measure the overhead of VRSS, not the potential benefit.

I think to test the benefit you would need to:
  1. Set the SS rate much higher OR
  2. Pick titles that have difficulty with a 90 Hz frame rate OR
  3. Use in-game quality settings that are high enough to significantly drop the baseline frame rate.

Overhead tests seem to be of minimal value, because why would anyone use VRSS if they always hit 90 Hz?

Also, your headline seems misleading since you did not test the primary purpose of VRSS. Perhaps change it to something like "VRSS overhead measured at 15%!" or something. But to say it underperforms when you didn't actually test the performance in the scenarios it's intended for does not seem like a good idea.
 
Spratlink

Jan 26, 2020
1
1
15
I think you may be mistaken about how the tech works. Nvidia has said that it applies a region of MSAA on top of the standard setting if you have the headroom. The amount it adds is based on the max setting of the game. E.g., if you turn MSAA to 2x, the whole scene will get 2x MSAA and a region at the centre will get the game's max (let's say 8x MSAA) added on top of that. The idea is you get 8x or more quality without the hit of applying it to the whole scene. So really you should compare the image quality and frame rate of 2x MSAA plus VRSS to 8x MSAA.
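Very roughly, the idea works out to something like the sketch below. This is just my own illustration of the concept with made-up names and numbers, not Nvidia's actual code:

```python
# Conceptual sketch of the layered idea: base MSAA everywhere, with the game's
# maximum rate applied only in a central region, and only when GPU headroom exists.
# All names and values are illustrative, not Nvidia's actual implementation.

def effective_sample_rate(pixel_xy, screen_center, foveal_radius,
                          base_msaa, game_max_msaa, gpu_has_headroom):
    """Return the supersampling factor applied to a given pixel."""
    dx = pixel_xy[0] - screen_center[0]
    dy = pixel_xy[1] - screen_center[1]
    in_foveal_region = (dx * dx + dy * dy) <= foveal_radius ** 2

    if gpu_has_headroom and in_foveal_region:
        # Central region gets bumped up to the game's max setting (e.g. 8x).
        return max(base_msaa, game_max_msaa)
    # Everything else stays at the user's base setting (e.g. 2x).
    return base_msaa

# Example: 2x MSAA everywhere, 8x in the centre when there's headroom.
print(effective_sample_rate((960, 540), (960, 540), 300, 2, 8, True))   # -> 8
print(effective_sample_rate((100, 100), (960, 540), 300, 2, 8, True))   # -> 2
print(effective_sample_rate((960, 540), (960, 540), 300, 2, 8, False))  # -> 2
```

The point being: most of the screen only ever pays the 2x cost, and the central region only pays the 8x cost when there's headroom to spare.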
 
Reactions: jakjawagon

Gillerer

Distinguished
Sep 23, 2013
361
81
18,940
You shouldn't lie with your graphs; setting a non-zero origin is bad. Even if you were to explicitly draw attention to the fact in the text, many people will just skim the article and look at the graphs.

The difference between 90 and 86 FPS will be hardly noticeable in real life, but your graph makes it look like the performance hit is around 60%!

For the "in-time frames" it's even worse - the graph makes it look like 2/3 of the frames are missed, when in fact it's 1.3%. If that value is considered unacceptable, either bar charts are unsuitable for visualization here, or you should use a "missed frames" graph instead.
 
Reactions: jakjawagon

bit_user

Polypheme
Ambassador
You shouldn't lie with your graphs; setting a non-zero origin is bad. Even if you were to explicitly draw attention to the fact in the text, many people will just skim the article and look at the graphs.
Good point. Only a few are like this, but I hadn't noticed.

The difference between 90 and 86 FPS will be hardly noticeable in real life,
Depends on whether the HMD is using VRR (which they don't, AFAIK) or compensating for late frames with (I think) techniques like ATW. Otherwise, I'd imagine late frames could manifest as mild stuttering and potentially contribute to VR sickness.
 

bit_user

Polypheme
Ambassador
Thanks for testing this. I'd certainly like to hear Nvidia's response.

Either @Spratlink is right and the article mis-characterizes VRSS as an optimization, when in fact it adds additional quality, or the article is right that it's not the win it's claimed to be.

Either way, worth following up.
 

bit_user

Polypheme
Ambassador
Why would you test VRSS in situations where you are always hitting 100% of your target frame rate *without* VRSS enabled?
If it's being pitched as an optimization, then it shouldn't hurt framerates. That's just baseline testing. If that had been established, then it would make sense to see how much it could help, in cases where the baseline configuration was falling short.

However, the next test I'd like to see is whatever Nvidia recommends trying it on. That should basically show the maximum benefit it can provide. From there, the reader would know how much it can help, how much it can hurt, and that you should use it with care. Or, if it even hurts in the best case, then the takeaway is simply not to use it (unless/until an improved version is released).

Overhead tests seem to be of minimal value, because why would anyone use VRSS if they always hit 90 Hz?
Because it's VR, where framerate is king. Therefore, you want the best quality possible, but not at the expense of framerate. The underlying concept actually makes a lot of sense.
 

computerguy72

Distinguished
Sep 22, 2011
190
1
18,690
Because it's VR, where framerate is king. Therefore, you want the best quality possible, but not at the expense of framerate. The underlying concept actually makes a lot of sense.
Once again, why would anyone who is hitting 100% of their target framerate run VRSS? The testing TH did is pointless, as obviously enabling things like VRSS will have some overhead, and TH has proven the obvious.

Show us what it can do when you can't hit your frame rate target; that is specifically the purpose of VRSS that needs testing and would be of some use. The only practical thing this article showed is don't turn on VRSS if you already hit 90 Hz. Duh!
 

d0x360

Distinguished
Dec 15, 2016
115
47
18,620
Good point. Only a few are like this, but I hadn't noticed.


Depends on whether the HMD is using VRR (which they don't, AFAIK) or compensating for late frames with (I think) techniques like ATW. Otherwise, I'd imagine late frames could manifest as mild stuttering and potentially contribute to VR sickness.

It is very obvious when there is even a mild drop in frame rate or frame pacing, even when that drop is from 90 to 85.

Everything in VR is magnified significantly when it comes to visuals and performance and any deviation is obvious and intrusive.

Now, whether the graphs are misleading or not doesn't change the fact that this technology should NEVER cause any performance issues. If it can't improve image quality in the target area without causing performance problems, even nearly imperceptible ones, then it shouldn't be in the driver for public use yet.
 
Reactions: bit_user

d0x360

Distinguished
Dec 15, 2016
115
47
18,620
Once again, why would anyone who is hitting 100% of their target framerate run VRSS? The testing TH did is pointless, as obviously enabling things like VRSS will have some overhead, and TH has proven the obvious.

Show us what it can do when you can't hit your frame rate target; that is specifically the purpose of VRSS that needs testing and would be of some use. The only practical thing this article showed is don't turn on VRSS if you already hit 90 Hz. Duh!

Wrong. The point of VRSS is to enhance image quality AND stay at the native refresh rate, and it clearly doesn't do that. It's completely pointless to get a higher level of anti-aliasing, which improves image quality, especially in VR, if it's going to cause issues, and it clearly does cause issues.
 
Reactions: bit_user

bit_user

Polypheme
Ambassador
Once again, why would anyone who is hitting 100% of their target framerate run VRSS? The testing TH did is pointless, as obviously enabling things like VRSS will have some overhead, and TH has proven the obvious.
If the overhead exceeds the benefit, then it's not an optimization. Since it was reportedly billed as an optimization, that would make it DoA.

Show us what it can do when you can't hit your frame rate target; that is specifically the purpose of VRSS
No, the alleged purpose is to utilize spare capacity of the GPU to improve quality, without compromising framerate.

The only practical thing this article showed is don't turn on VRSS if you already hit 90 Hz. Duh!
No, that's not all. It showed that if you're not hitting 90 Hz without VRSS, you're definitely not going to hit it with VRSS.