News: Nvidia Image Scaling Runs On Radeon GPUs Thanks to Lossless Scaling

I really, really don't think nvidia did this from the "goodness" of their hearts.... ;)

Thanks AMD for FSR, making it open for everyone. NIS is nvidia's response to that. Gamers win.

Now we need XeSS to join the party.
Sure it's good, but I don't see the point. NVIDIA has had an image sharpening feature in the driver set for years now. At some point (or maybe it was added at the same time), it also included a non-DLSS upscaler. If you have an AMD GPU, AMD has their own upscaler in the driver utilities.

So I'm not even sure why this app exists unless there's some restriction on the driver-based upscalers that I'm missing. Or why driver-based upscalers aren't just using FSR or whatever by default.

In the high-res image, the DLSS 4K scaling is cleaner than native 4K, which seems quite a bit suspicious?
DLSS in some cases can produce results that may look better than the native 4K resolution.
 

d0x360

Distinguished
Dec 15, 2016
115
47
18,620
In the high-res image, the DLSS 4K scaling is cleaner than native 4K, which seems quite a bit suspicious?

It's not that it's cleaner; it's more like the AI part is adding detail that wasn't there or making thin lines thicker. The satellite dish is a perfect example of this. The problem is that sometimes it adds detail when it shouldn't. A great example is Death Stranding. Backpack stuff gets damaged, but DLSS tends to remove the effect because it thinks the image is missing data.

Native 4K looks better. Nvidia's DLSS sharpening pass has the effect of making the entire image brighter when you move the camera. Check out RDR2 for the best example of this, because of the grass.

The problem with 4K is that TAA blurs things too much and sharpening doesn't bring that detail back, but TAA is the best we have until DLAA comes out. Hopefully that won't have the same sharpening pass as DLSS.
 
Apr 1, 2020
1,447
1,100
7,060
Hmm, pay $5 to play using lower image quality, or pay $0 and play using lower image quality settings... I think I'll pay $0 and play at a lower quality, especially if the example high-resolution pic is typical of the really poor quality of the results.
 

Integr8d

Distinguished
May 28, 2011
162
66
18,760
It's not that it's cleaner; it's more like the AI part is adding detail that wasn't there or making thin lines thicker. The satellite dish is a perfect example of this. The problem is that sometimes it adds detail when it shouldn't. A great example is Death Stranding. Backpack stuff gets damaged, but DLSS tends to remove the effect because it thinks the image is missing data.

Native 4K looks better. Nvidia's DLSS sharpening pass has the effect of making the entire image brighter when you move the camera. Check out RDR2 for the best example of this, because of the grass.

The problem with 4K is that TAA blurs things too much and sharpening doesn't bring that detail back, but TAA is the best we have until DLAA comes out. Hopefully that won't have the same sharpening pass as DLSS.

I think there's something up with this. The text in the native 4K is already kinda trash. But now I'm supposed to believe that DLSS took an even lower-res version of that picture, which squished the text down to even more poo, and then AI-magically upscaled it to have even cleaner text than the native image???
 

Blacksad999

Reputable
Jun 28, 2020
70
48
4,570
I really, really don't think nvidia did this from the "goodness" of their hearts.... ;)

Thanks AMD for FSR, making it open for everyone. NIS is nvidia's response to that. Gamers win.

Now we need XeSS to join the party.

Nvidia has had this available in the Control Panel for years now. It just didn't have a "one button click" option.
 

Alex/AT

Reputable
Aug 11, 2019
33
17
4,535
In the high-res image, the DLSS 4K scaling is cleaner than native 4K, which seems quite a bit suspicious?
It's not, actually. Don't look at the 'framed' letterbox; that's where the so-called 'AI' scaler may introduce its own distortion, making it look 'cleaner' in a static image.
Look at the circular button set below the 'letter screen' in the image. It's pure crap under any scaler, and only native 4K resolution looks good.
 

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
Nvidia has had this available in the Control Panel for years now. It just didn't have a "one button click" option.
Didn't they also improve it? That's because of competition from FSR... they did not need to focus on this free tool at all until FSR pushed them and made their DLSS look bad for being a premium black box.

I still think all of this renewed interest in and promotion of NIS (and its improvement) is because FSR exists and because of how it works: easy and free.
 
This is just marketing wars...

I still say DLSS, FSR, and the upcoming XeSS are not things to look forward to, or necessarily things you want. Upscaling is a short-lived solution to a computational problem we don't really want to have, right? The reason you all think you want DLSS is because Nvidia's marketing machine is powerful. Ask yourselves why you'd want to run your games at a lower resolution and scale them up with visual defects rather than run them natively at the same (or lowered) settings. I personally haven't used DLSS, but I haven't read many wonderful things about the games that do support it, and it's still bad. FSR is ok-ish, but you still can't use it everywhere (it doesn't work at all resolutions, and the scope is a very limited selection of games). And XeSS isn't even out yet.

Then there's NIS. It's been there for ages for people to use, but it hasn't been, because of what I just said. Just aim to run games at native resolution, or this will be a race to the bottom.

Regards.
 

mo_osk

Reputable
Nov 13, 2020
33
16
4,535
I think there's something up with this. The text in the native 4K is already kinda trash. But now I'm supposed to believe that DLSS took an even lower-res version of that picture, which squished the text down to even more poo, and then AI-magically upscaled it to have even cleaner text than the native image???

Nothing suspicious there; upscaling an image with text is like the dream textbook application for AI upscaling. If you have an Nvidia card and a game compatible with DLSS, it's very easy to verify that any texture with text on it will greatly benefit from AI upscaling.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
This is just marketing wars...

I still say DLSS, FSR, and the upcoming XeSS are not things to look forward to, or necessarily things you want. Upscaling is a short-lived solution to a computational problem we don't really want to have, right? The reason you all think you want DLSS is because Nvidia's marketing machine is powerful.

Nvidia didn't develop DLSS to be a universal upscaler. It was designed to be used in conjunction with ray tracing to give playable framerates. So, the actual purpose of DLSS was to enable rendering techniques and graphical effects that wouldn't be possible if run at today's common resolutions. Over time it's been marketed more and more as a universal upscaler, but that wasn't the original intent.

Then there's NIS. It's been there for ages for people to use, but it hasn't been, because of what I just said. Just aim to run games at native resolution, or this will be a race to the bottom.
The main use case for NIS is if you're playing old games, usually console games, on your PC that run natively at a really low resolution and you want a method to more cleanly upscale them to whatever your current resolution is, so you don't need to have them in a really tiny window in the middle of your screen to have a sharp image. It's a niche market which is why it's been around for years and most people have never heard of it. It's not a performance enhancer. You would not use NIS on any game that already runs at your native resolution.
 

mo_osk

Reputable
Nov 13, 2020
33
16
4,535
This is just marketing wars...

I still say DLSS, FSR, and the upcoming XeSS are not things to look forward to, or necessarily things you want. Upscaling is a short-lived solution to a computational problem

DLSS and XeSS aren't really upscaling. We use this terminology because it's easier to conceive of what they do that way, but it really undersells what these technologies do. They're a solution to a lot of problems when rendering an image; for a start, they're a way to almost freely eliminate all forms of aliasing. Being able to use an AI algorithm instead of a computational solution to preserve image quality, or even enhance it, means that a lot of resources become available for other things.
 
  • Like
Reactions: JarredWaltonGPU
In the high-res image, the DLSS 4K scaling is cleaner than native 4K, which seems quite a bit suspicious?
This is why DLSS can actually be better. Because it uses temporal data (i.e., previously rendered frames), it can pull in more data. Think of it this way:

  1. Native 4K rendering has 3840x2160 pixels of data (8,294,400 pixels)
  2. Spatial 1080p upscaling has 1920x1080 pixels of data (2,073,600 pixels)
  3. DLSS Performance has multiple frames of 1080p data (2,073,600 × number of frames)

Depending on the number of frames of input used -- and it could be five or six, Nvidia has never really said -- DLSS potentially has more pixels of information from which to create a pleasing image. Of course this works best when you're not running around and changing the frames a lot. But even in motion, DLSS -- especially the latest 2.3 variant -- does an impressive job.
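For what it's worth, here's a back-of-the-envelope sketch of that pixel-budget arithmetic in Python. The five-frame history count is only an assumed example for illustration; as noted above, Nvidia hasn't said how many frames DLSS actually accumulates.

# Rough pixel-budget comparison from the list above.
# The DLSS frame count is an assumption for illustration; Nvidia has not published it.

def pixel_budget(width: int, height: int, frames: int = 1) -> int:
    """Total source pixels available to the scaler."""
    return width * height * frames

native_4k = pixel_budget(3840, 2160)            # 8,294,400 pixels
spatial_1080p = pixel_budget(1920, 1080)        # 2,073,600 pixels
dlss_perf = pixel_budget(1920, 1080, frames=5)  # 10,368,000 pixels across 5 frames

print(f"Native 4K:             {native_4k:,}")
print(f"Spatial 1080p:         {spatial_1080p:,}")
print(f"DLSS Perf (5 frames):  {dlss_perf:,}")

Notice that at just four frames of history the accumulated 1080p samples already equal the native 4K pixel count (4 × 2,073,600 = 8,294,400), which is the crux of why a temporal upscaler can resolve detail that a single 1080p frame cannot.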
 
  • Like
Reactions: Integr8d
Nvidia didn't develop DLSS to be a universal upscaler. It was designed to be used in conjunction with ray tracing to give playable framerates. So, the actual purpose of DLSS was to enable rendering techniques and graphical effects that wouldn't be possible if run at today's common resolutions. Over time it's been marketed more and more as a universal upscaler, but that wasn't the original intent.
LOL, no.

DLSS is a failed AA algorithm "powered" by their tensor cores. It was just a "happy" mistake. Why would a company that aims for you to buy hardware want to give you a way to keep using your cards for longer? Especially Nvidia? But I guess they had to spin it that way, since the original incarnation was laughably bad; at least it helped make the penalty hit of ray tracing feel less bad.

The main use case for NIS is if you're playing old games, usually console games, on your PC that run natively at a really low resolution and you want a method to more cleanly upscale them to whatever your current resolution is, so you don't need to have them in a really tiny window in the middle of your screen to have a sharp image. It's a niche market which is why it's been around for years and most people have never heard of it. It's not a performance enhancer. You would not use NIS on any game that already runs at your native resolution.
Fair.

Regards.
 

russell_john

Honorable
Mar 25, 2018
115
81
10,660
I really, really don't think nvidia did this from the "goodness" of their hearts.... ;)

Thanks AMD for FSR, making it open for everyone. NIS is nvidia's response to that. Gamers win.

Now we need XeSS to join the party.

Do you think AMD made FSR out of the goodness of their hearts? Highly doubtful... They did it to make more money and to try to grab a portion of Nvidia's market share, which they desperately needed to do because they are losing ground (a year ago it was 80% Nvidia to 20% AMD; currently it's 83% Nvidia to 17% AMD). Nvidia sells 4 GPUs for every 1 AMD sells, despite the current IC shortages. Nvidia is also worth more than AMD and Intel combined. Like it or not, they are a juggernaut that's not going away anytime soon.

Nvidia has had this in their drivers for quite some time; all they did was make some minor improvements to their already existing technology. They probably rolled over some of the recent improvements from DLSS and added some of that to NIS.
 
  • Like
Reactions: JarredWaltonGPU
LOL, no.

DLSS is a failed AA algorithm "powered" by their tensor cores. It was just a "happy" mistake. Why would a company that aims for you to buy hardware want to give you a way to keep using your cards for longer? Especially Nvidia? But I guess they had to spin it that way, since the original incarnation was laughably bad; at least it helped make the penalty hit of ray tracing feel less bad.
Sorry, but that's a completely ignorant viewpoint. Tensor cores first appeared in Volta, which was data-center focused. Looking at where AI was going, Nvidia clearly wanted to put tensor cores into more products, including consumer products. There was already plenty of research in 2015 (and before) to show that machine learning was a fast-growing field and would continue to be so for a long time. Once the decision was made to put tensor cores into GeForce, of course Nvidia would start to look at what it could do with the tech. DLSS was the first announced productized version, as image upscaling and enhancement tools were already coming into existence. There was no "mistake" about any of this.

Now tensor cores are being used for cleaning up voice, background detection and blur/replacement, self-driving cars, medical, environmental, energy, and many more areas of research. Not surprisingly, Intel is taking the exact same approach with Arc. It will have crappy ray tracing support, but it looks like the tensor cores will be quite competitive. Do you think that's also a "happy mistake?"

And while DLSS sort of allows you to use a card for longer as a gaming solution, you can accomplish the same thing -- extending the usable life of a GPU -- by turning down graphics settings. The reality is Nvidia will always be working on something newer, faster, and better. Nvidia doesn't need to intentionally make old products obsolete. That's just a claim some anti-Nvidia / pro-AMD people make because it sounds plausible, but research has proven it's never really true. In fact, Nvidia tends to support old/obsolete GPUs longer than AMD.

Case in point: AMD has ended driver support for R9 Fury and earlier GPUs. Those were released in 2015, so six years of driver support. Nvidia just stopped supporting GTX 700 and GTX 600 series GPUs in September (472.12 drivers). Those cards were released in 2013 and 2012, so eight and nine years of support, respectively. (GTX 900-series still has active support, even though it's seven years old now.) And I can tell you from personal experience, trying to use a 600-series or R9 300-series GPU in modern games is not great.

Nvidia isn't perfect. Neither is AMD, Intel, or any other company. The push for ray tracing in games was arguably done too early, but at the same time, I remember the push for shaders, hardware transform and lighting, and other GPU technologies that no one would dream of eliminating now. Someone had to get the ball rolling, and Nvidia once again leveraged its dominant position to do so, at the same time jump starting ray tracing hardware in the professional sector. DLSS 1.0 was premature and represented a quick and dirty first effort while the research continued. DLSS 2.0 fixed a lot of the shortcomings, and now DLSS 2.3 takes things up yet another level. And guess what? They all run even on an RTX 2060, greatly extending its capabilities in games that support DLSS.
 
Sorry, but that's a completely ignorant viewpoint. Tensor cores first appeared in Volta, which was data-center focused. Looking at where AI was going, Nvidia clearly wanted to put tensor cores into more products, including consumer products. There was already plenty of research in 2015 (and before) to show that machine learning was a fast-growing field and would continue to be so for a long time. Once the decision was made to put tensor cores into GeForce, of course Nvidia would start to look at what it could do with the tech. DLSS was the first announced productized version, as image upscaling and enhancement tools were already coming into existence. There was no "mistake" about any of this.

Now tensor cores are being used for cleaning up voice, background detection and blur/replacement, self-driving cars, medical, environmental, energy, and many more areas of research. Not surprisingly, Intel is taking the exact same approach with Arc. It will have crappy ray tracing support, but it looks like the tensor cores will be quite competitive. Do you think that's also a "happy mistake?"

And while DLSS sort of allows you to use a card for longer as a gaming solution, you can accomplish the same thing -- extending the usable life of a GPU -- by turning down graphics settings. The reality is Nvidia will always be working on something newer, faster, and better. Nvidia doesn't need to intentionally make old products obsolete. That's just a claim anti-Nvidia / pro-AMD people make because it sounds plausible, but research has proven it's never really true. In fact, Nvidia tends to support old/obsolete GPUs longer than AMD.

Case in point: AMD has ended driver support for R9 Fury and earlier GPUs. Those were released in 2015, so six years of driver support. Nvidia just stopped supporting GTX 700 and GTX 600 series GPUs in September (472.12 drivers). Those cards were released in 2013 and 2012, so eight and nine years of support, respectively. (GTX 900-series still has active support, even though it's seven years old now.) And I can tell you from personal experience, trying to use a 600-series or R9 300-series GPU in modern games is not great.

Nvidia isn't perfect. Neither is AMD, Intel, or any other company. The push for ray tracing in games was arguably done too early, but at the same time, I remember the push for shaders, hardware transform and lighting, and other GPU technologies that no one would dream of eliminating now. Someone had to get the ball rolling, and Nvidia once again leveraged its dominant position to do so, at the same time jump starting ray tracing hardware in the professional sector. DLSS 1.0 was premature and represented a quick and dirty first effort while the research continued. DLSS 2.0 fixed a lot of the shortcomings, and now DLSS 2.3 takes things up yet another level. And guess what? They all run even on an RTX 2060, greatly extending its capabilities in games that support DLSS.
I'm talking exclusively from the DLSS point of view. It was a failed attempt at AI-assisted AA, like it or not.

As for everything else they use the tensor cores for: cool, I guess. I have no idea why anyone would want their GPU spending part of its die area and power on something that isn't helping render more frames, but I guess that's just me?

As for the rest of what you said: I don't disagree entirely. Sure, it is nice for people who can't upgrade and whatnot, but I will not believe for even a microsecond that Nvidia wanted this tech to work as an upscaler when they were pushing their DSR approach heavily. DLSS was meant to work in tandem with DSR, but it just turned out to be a happy mistake. Also, it's quite ironic that they locked DLSS behind tensor cores alone and didn't push NIS before FSR came around, when they could have if they wanted to "help those with older cards". Do not fool yourself there, come on.

Regards.
 

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
Do you think AMD made FSR out of the goodness of their hearts? Highly doubtful... They did it to make more money and to try to grab a portion of Nvidia's market share, which they desperately needed to do because they are losing ground (a year ago it was 80% Nvidia to 20% AMD; currently it's 83% Nvidia to 17% AMD). Nvidia sells 4 GPUs for every 1 AMD sells, despite the current IC shortages. Nvidia is also worth more than AMD and Intel combined. Like it or not, they are a juggernaut that's not going away anytime soon.

Nvidia has had this in their drivers for quite some time; all they did was make some minor improvements to their already existing technology. They probably rolled over some of the recent improvements from DLSS and added some of that to NIS.
1. You cannot take the good deed AMD did with FSR (of course it serves them too) and turn it into nothing just because you want to or say so. Intel did not do it, Nvidia did not do it either, but AMD did, and gave everyone an open-source solution for free with FSR. That's a fact.

2. NIS is now in focus for Nvidia to win back some good PR points after all these years when all they did was promote the premium version for their RTX lineup ("buy our RTX GPUs for DLSS and RT"), because until now GTX owners were dirt in their eyes. They either bought the premium or they could **** off, as far as Nvidia cared. FSR made them bring NIS into the foreground and make it better. Without FSR, NIS would have been left forgotten, and only DLSS would have been the focus, forever.

3. I know Nvidia is not going anywhere (soon, or ever), but I really have no love left for them after owning only their GPUs for more than 10 years, because of all the scummy things they have done over the years. Just because a company is big, or "too big to fail", does not mean I like it. The bigger it gets, the worse it usually is. AMD can run into that issue too, if it gets too big and starts acting like the bigger douche-bags. When/if they do that, I won't like them either.
 

Integr8d

Distinguished
May 28, 2011
162
66
18,760
This is why DLSS can actually be better. Because it uses temporal data (i.e., previously rendered frames), it can pull in more data. Think of it this way:

  1. Native 4K rendering has 3840x2160 pixels of data (8,294,400 pixels)
  2. Spatial 1080p upscaling has 1920x1080 pixels of data (2,073,600 pixels)
  3. DLSS Performance has multiple frames of 1080p data (2,073,600 × number of frames)
Depending on the number of frames of input used -- and it could be five or six, Nvidia has never really said -- DLSS potentially has more pixels of information from which to create a pleasing image. Of course this works best when you're not running around and changing the frames a lot. But even in motion, DLSS -- especially the latest 2.3 variant -- does an impressive job.
This I actually buy.

Edit to add: I remember Lucasfilm was working on a temporal scaling technology to do the same thing with VHS. The 200-or-so lines of resolution would be analyzed before and after the targeted frame to infer the missing information and add it back in. From what I can remember, it was super clean.