News Nvidia VSR Testing: AI Upscaling and Enhancement for Video

Feb 28, 2023
A 2.9 watt algorithm restricted to only the newest two generations of their cards. "Nice" job, Nvidia.

Now it's AMD's turn: go make a 0.5 watt screen-space filter that runs on every card since 2004, which no one will adopt.
 
So it only works in a browser?
Yes, as far as I understand it, at least for now, this is purely for browsers. I wouldn't be shocked to see Nvidia offer a final version (or at least an RTX 20-enabled version) for use in other utilities like VLC, but we'll have to wait and see. I'll check with Nvidia as well to see if that's officially in the works.
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022

Cool 240p image.

This might have been interesting in the 90s when people were watching 360p cat videos.

But video content is now 1080p+ and 4K. Video upscaling tech isn't relevant anymore.

And this tech also comes at a cost: it drains battery life. Simply not worth it.
 
Cool 240p image.

This might have been interesting in the 90s when people were watching 360p cat videos.

But video content is now 1080p+ and 4K. Video upscaling tech isn't relevant anymore.

And this tech also comes at a cost: it drains battery life. Simply not worth it.
As noted in the text, it's an exaggerated picture to illustrate the idea — no one uses nearest neighbor interpolation unless it's for effect these days. Also, it's 360p.

There's plenty of 720p content still out there; sports streams, for example, are regularly 1080p or lower. Also, lower bitrates can cause blocking, which the AI algorithm is also trained to at least mostly remove.

And battery life? I guess if you're only watching videos on unplugged laptops, yes, it might drop a bit. Based on the 3050 results and a 50Wh battery, you might go from four hours of video playback to 3.3 hours if you turn on VSR and watch 720p upscaled to 4K. Except then you're on a laptop with a 1080p display most likely, which changes the upscaling requirements — there's no point in upscaling 1080p to 1080p, though perhaps the deblocking and enhancement stuff would still be useful.
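If you want to sanity check the battery math, here's a quick sketch. It assumes the 50Wh battery and the ~12.5W baseline draw implied by four hours of playback; the ~2.6W VSR overhead is inferred from the 3.3-hour figure, not a measured number:

```python
# Rough battery-life estimate for laptop video playback.
# Assumes a 50 Wh battery and a ~12.5 W system draw during playback
# (50 Wh / 4 h), plus whatever extra the GPU pulls with VSR enabled.

def playback_hours(battery_wh, base_watts, extra_watts=0.0):
    """Hours of playback at the given total draw."""
    return battery_wh / (base_watts + extra_watts)

base = 50 / 4.0                                   # ~12.5 W gives the 4-hour baseline
print(round(playback_hours(50, base), 1))         # 4.0 hours, VSR off
print(round(playback_hours(50, base, 2.6), 1))    # 3.3 hours with ~2.6 W of VSR overhead
```

In other words, only a couple of watts of extra GPU draw is enough to knock roughly 40 minutes off playback time on a small battery.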

Is it a panacea? No. But is it potentially useful if you already have an RTX GPU? Sure.
 
Cool 240p image.

This might have been interesting in the 90s when people were watching 360p cat videos.

But video content is now 1080p+ and 4K. Video upscaling tech isn't relevant anymore.

And this tech also comes at a cost: it drains battery life. Simply not worth it.

Not all online video is 1080p or higher. And many videos top out at 1080p, where upscaling to 4K would be useful.

Additionally, if you have a bad internet connection but ample GPU power and no battery concerns (think watching something over slow public wifi, but plugged in or with plenty of battery to spare), being able to upscale a lower-bitrate stream into a higher-quality one almost for free would be worthwhile to most people.

There are also plenty of use cases out there with people wanting to watch older content that is only available at 480p.

If you really have no use case where you find any of this at all beneficial, then just don't use it. Nobody is forcing you. There are a lot of people out there though who have use cases in which this will be helpful.
 

garrett040

Distinguished
Jan 20, 2014
Will this become a feature I can simply use for video files on my computer? Why the browser requirement? I already bought the overpriced card.
 
Jan 10, 2023
While this is not the best AI upscaling I have seen, it is pretty good when considering that it is done with only an extra 4-8 W in real time. The browser requirement is pretty limiting, though...
 
  • Like
Reactions: bit_user
Will this become a feature I can simply use for video files on my computer? Why the browser requirement? I already bought the overpriced card.
As noted above, while the current implementation is specifically for streaming within a browser, I wouldn't be surprised to see support in other apps. I'm waiting for a response from Nvidia on whether this is in the works or not, though.
 
  • Like
Reactions: renz496

evdjj3j

Honorable
Aug 4, 2017
"You can still clearly see the differences between the normal upscaling (in Chrome) versus the VSR upscaling. "

I couldn't tell a lot of difference and it would have been really helpful if they were labelled.
 
Feb 28, 2023
How many watts is this thing supposed to use?

Watching Twitch on my 4080 with level 4 quality, at 4K with the chat and channel windows open (so not full 4K video), my GPU board power draw goes from 30W to 80W when enabling this feature, as reported by GPU-Z. I do have a three-monitor setup.

I see my clock hovering around 1,500MHz and boosting to 2,500MHz, but with this disabled, it settles down to 500MHz.
 
How many watts is this thing supposed to use?

Watching Twitch on my 4080 with level 4 quality, at 4K with the chat and channel windows open (so not full 4K video), my GPU board power draw goes from 30W to 80W when enabling this feature, as reported by GPU-Z. I do have a three-monitor setup.

I see my clock hovering around 1,500MHz and boosting to 2,500MHz, but with this disabled, it settles down to 500MHz.
Multiple monitors are known to generally increase power use. I don't know by how much, but it's entirely possible it's 50W with three monitors, at least when "stuff" is happening.

I also noticed that having just a video playing fullscreen was less power than if I had the video playing and alt+tabbed to my power monitoring utility — it jumped by 10-40 watts. So I started the power collection 10 seconds before the test sequence, switched to the fullscreen video, collected data for 80 seconds, then switched back and stopped the data collection. Then I cut out the first and last 10 seconds.
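For anyone who wants to replicate that, the trimming step is trivial to reproduce. Here's a minimal sketch (a hypothetical helper, assuming the logging utility emits one power sample per second):

```python
# Trim the warm-up and wind-down from a power log, then average.
# Assumes samples arrive once per second, so 10 samples = 10 seconds.

def trim_power_log(samples, trim_seconds=10):
    """Drop the first and last trim_seconds of samples and return the mean."""
    trimmed = samples[trim_seconds:-trim_seconds]
    return sum(trimmed) / len(trimmed)

# 100 seconds of data: 10 s of desktop spikes, 80 s of fullscreen video, 10 s of spikes.
log = [40] * 10 + [15] * 80 + [40] * 10
print(trim_power_log(log))  # 15.0 — average power during fullscreen playback only
```

The point of the trim is that the alt+tab spikes at the start and end never contaminate the playback average.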

I wouldn’t be surprised if VSR with multiple monitors had some unexpected power increases in some cases right now. Nvidia may or may not be able to fix those with future drivers.

Clocks alone aren’t super important, but I’d be curious if you can see what your GPU load is when VSR is enabled vs. disabled while playing video. Based on what I saw with a single display, it should be in the single digits.
 
Feb 28, 2023
As noted in the text, it's an exaggerated picture to illustrate the idea — no one uses nearest neighbor interpolation unless it's for effect these days. Also, it's 360p.

There's plenty of 720p content still out there; sports streams, for example, are regularly 1080p or lower. Also, lower bitrates can cause blocking, which the AI algorithm is also trained to at least mostly remove.

And battery life? I guess if you're only watching videos on unplugged laptops, yes, it might drop a bit. Based on the 3050 results and a 50Wh battery, you might go from four hours of video playback to 3.3 hours if you turn on VSR and watch 720p upscaled to 4K. Except then you're on a laptop with a 1080p display most likely, which changes the upscaling requirements — there's no point in upscaling 1080p to 1080p, though perhaps the deblocking and enhancement stuff would still be useful.

Is it a panacea? No. But is it potentially useful if you already have an RTX GPU? Sure.
It doesn't even work without being plugged in. My laptop 3070 draws 30 watts with VSR off and 140 (maxed) with VSR on, playing a 720p YouTube video. I don't know how you guys are getting single-digit wattage increases.
 

in_the_loop

Distinguished
Dec 15, 2007
Will this become a feature I can simply use for video files on my computer? Why the browser requirement? I already bought the overpriced card.

With video players like MPC-HC you already have very capable upscalers that are probably better than the AI one here, and they can use pixel shaders, i.e. the graphics card, to render the upscaling.
 
Apr 1, 2020
All those images are 4K JPG, with maximum quality — not lossless, but we can't exceed 10MB, so some slight compression was required.

Other sites use Google Drive or another service to get around size restrictions. Hopefully they will also refine it quickly if Google decides to push forward on limiting 1920x1080 to YouTube Premium only.
 
It doesn't even work without being plugged in. My laptop 3070 draws 30 watts with VSR off and 140 (maxed) with VSR on, playing a 720p YouTube video. I don't know how you guys are getting single-digit wattage increases.
That's with only the one screen? Are you in fullscreen mode on whatever video you're watching, or is it in a window (or in the background)? I would guess based on what I saw that having MSI Afterburner or GPU-Z running on top of the video will cause much higher power use. And what content are you watching — that probably also plays a role. Probably some video sources and video types are going to be more demanding, computationally. Like if you're on YouTube and you watch something that's AV1 encoded, then the GPU will definitely have to do more work just to decode. But trying to test lots of different video formats was definitely beyond the scope of what I wanted to try and do for this article.

But since you're on a laptop, it may mean you're not just measuring GPU power. I have a power testing setup that only measures power that goes to the graphics card — via PCIe slot plus up to three dual-8-pin cables. I am not checking CPU load or anything else. As noted above, "I also noticed that having just a video playing fullscreen was less power than if I had the video playing and alt+tabbed to my power monitoring utility — it jumped by 10-40 watts. So I started the power collection 10 seconds before the test sequence, switched to the fullscreen video, collected data for 80 seconds, then switched back and stopped the data collection. Then I cut out the first and last 10 seconds."

That's how I measured power use. The video I was playing was the Colorado Avalanche vs Winnipeg Jets from Feb 24. It's a 720p source and almost certainly an H.264 encoded file. I was playing the video in Chrome on a PC with an i9-9900K, so not a super fast CPU but fast enough. Let me quickly try this on a... 4070 Ti and just grab the numbers from MSI Afterburner for power use with … let's do this video (Ted Lasso Season 3 trailer, which is 1080p):
View: https://www.youtube.com/watch?v=IR9yjn7Lkdg


This is the "VSR Off" test. The spikes in power at the start/end are when the video isn't playing fullscreen.

View attachment 191

Okay, here's the same thing, only this time it's with VSR On and quality set to 4:

View attachment 192

There was a spike to 33W and the clocks both jumped partway into that test, and I don't know for sure what caused that — I have a lot of stuff open in different windows right now. Anyway, that seems pretty conclusive to me that YouTube, 1080p upscaled to 4K via VSR 4, doesn't need to use a ton of power.
 
  • Like
Reactions: bit_user
Other sites use Google Drive or another service to get around size restrictions. Hopefully too they will refine it quickly if Google decides to push forward on limiting 1920x1080 to Youtube Premium only.
Well, this is specifically for the uploaded images. Our CMS caps image size at 10MB. Most of the PNG files were under that, but not the Nvidia Campus screen captures. And honestly, you have to look really hard to find the JPG artifacts in a quality 12 JPG file. Like, even at 4X magnification, here's a shot of the non-VSR and VSR (I think you can click to get the original image rather than the 1024 pixel wide version):

View attachment 193

Edit: Nope, our forum apparently converts to 1920 pixels wide, AFAICT. Still, you can look at those and JPG artifacts are just as likely to be video compression artifacts.
 
  • Like
Reactions: bit_user

zx128k

Reputable
That's with only the one screen? Are you in fullscreen mode on whatever video you're watching, or is it in a window (or in the background)? I would guess based on what I saw that having MSI Afterburner or GPU-Z running on top of the video will cause much higher power use. And what content are you watching — that probably also plays a role. Probably some video sources and video types are going to be more demanding, computationally. Like if you're on YouTube and you watch something that's AV1 encoded, then the GPU will definitely have to do more work just to decode. But trying to test lots of different video formats was definitely beyond the scope of what I wanted to try and do for this article.

But since you're on a laptop, it may mean you're not just measuring GPU power. I have a power testing setup that only measures power that goes to the graphics card — via PCIe slot plus up to three dual-8-pin cables. I am not checking CPU load or anything else. As noted above, "I also noticed that having just a video playing fullscreen was less power than if I had the video playing and alt+tabbed to my power monitoring utility — it jumped by 10-40 watts. So I started the power collection 10 seconds before the test sequence, switched to the fullscreen video, collected data for 80 seconds, then switched back and stopped the data collection. Then I cut out the first and last 10 seconds."

That's how I measured power use. The video I was playing was the Colorado Avalanche vs Winnipeg Jets from Feb 24. It's a 720p source and almost certainly an H.264 encoded file. I was playing the video in Chrome on a PC with an i9-9900K, so not a super fast CPU but fast enough. Let me quickly try this on a... 4070 Ti and just grab the numbers from MSI Afterburner for power use with … let's do this video (Ted Lasso Season 3 trailer, which is 1080p):
View: https://www.youtube.com/watch?v=IR9yjn7Lkdg


This is the "VSR Off" test. The spikes in power at the start/end are when the video isn't playing fullscreen.

View attachment 191

Okay, here's the same thing, only this time it's with VSR On and quality set to 4:

View attachment 192

There was a spike to 33W and the clocks both jumped partway into that test, and I don't know for sure what caused that — I have a lot of stuff open in different windows right now. Anyway, that seems pretty conclusive to me that YouTube, 1080p upscaled to 4K via VSR 4, doesn't need to use a ton of power.

I was finding it hard to see any power draw difference on an RTX 3080 Ti in MS Edge. I had it set to VSR 4.
 
I was finding it hard to see any power draw difference on an RTX 3080 Ti in MS Edge. I had it set to VSR 4.
Yeah, it feels to me like either Jtwizzle doesn't understand how to collect power data, or they did something quite different from what I outlined in order to get a spike from "30 watts vsr off [to] 140 (maxed) vsr on." That, or laptops are behaving completely differently than desktops when it comes to VSR. MSI Afterburner showed almost zero difference in my YouTube video test, for example, while my in-line power capture at least measured a small change. We'll see if Jtwizzle can provide any further details or if they just wander off into the ether having thrown shade at VSR. 🤷‍♂️
 
  • Like
Reactions: bit_user and zx128k

toco19

Distinguished
Apr 11, 2009
Confirmed working on an RTX 3060 in Edge watching YouTube. It only upscaled from 1080p to 1440p, and the effect is not drastic but is noticeable on the footage I've watched. Things no longer seem slightly blurry. A nice improvement so far, though the sample size is small. Looks promising.