News: Nvidia Image Scaling Runs On Radeon GPUs Thanks to Lossless Scaling

I'm talking exclusively from the DLSS point of view. It was a failed attempt at AI-assisted AA, like it or not.

The initial talking point from Nvidia presented it as "4K DLSS": a way to keep performance at an acceptable level with a presentation similar in quality to 4K with AA. And for the first releases you could only activate it while running at a strict 4K resolution of 3840x2160.
 
The initial version of DLSS required specific training for every resolution of every game that supported it, so it made sense to prioritize 4K, though there were games that supported other resolutions early on.
 
FFXV is restricted to 4K while using DLSS.

Anyway, the point was that it was not initially conceived as an AA technology.
 
I'm talking exclusively from the DLSS point of view. It was a failed attempt at AI-assisted AA, like it or not.

As for everything else they use the Tensor cores for: cool, I guess. I have no idea why anyone would want part of their GPU's die area and power budget doing something other than helping render more frames, but I guess that's just me?

As for the rest of what you said, I don't disagree entirely. Sure, it is nice for people who can't upgrade and whatnot, but I will not believe for even a microsecond that nVidia wanted this tech to work as an upscaler when they were pushing their DSR approach so heavily. DLSS was meant to work in tandem with DSR, but it just turned out to be a happy mistake. Also, it's quite ironic that they locked DLSS behind Tensor cores alone and didn't push NIS before FSR came around, when they could have if they wanted to "help those with older cards". Do not fool yourself there, come on.

Regards.
DLSS won't run well without higher compute than (most) regular GPU cores can provide, and it is very different from NIS or FSR in a lot of ways. All you have to do is look at it in action. The only truly comparable approach announced is Intel's XeSS. We'll see how XeSS compares when it comes out, and how it runs on the various GPUs. Sadly, it will only use Intel's version of tensor cores in full performance mode, while non-Intel GPUs will use DP4a (integers) to try to accomplish the same thing. Will the end results look the same? I'm curious to see. Also curious how well XeSS will run in "compatible" DP4a mode.

But fundamentally, FSR is not DLSS. It's spatial upscaling via a Lanczos filter with extra edge detection and sharpening. We've had lots of upscalers for years, of varying quality and performance. None look as good as native, ever, but DLSS sometimes (not always) looks better.
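To make the "spatial upscaling plus sharpening" idea concrete, here's a minimal Python sketch using Pillow. It is not AMD's actual FSR code, just the same family of operation: a Lanczos resample of a single frame followed by a generic sharpening pass. Filenames and parameters are placeholders.

```python
# Minimal sketch of a purely spatial upscale (the family FSR belongs to):
# a Lanczos resample followed by a simple sharpening pass. This is NOT
# AMD's actual FSR implementation, just an illustration of the idea.
from PIL import Image, ImageFilter

def spatial_upscale(src_path: str, out_path: str, scale: float = 2.0) -> None:
    img = Image.open(src_path)
    target = (round(img.width * scale), round(img.height * scale))
    # Lanczos resampling reconstructs the larger image from the same
    # spatial information that was in the source frame -- no temporal data.
    upscaled = img.resize(target, resample=Image.LANCZOS)
    # A light unsharp mask stands in for the edge-enhancement/sharpening
    # stage; the real filter is far more selective about edges.
    sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=80))
    sharpened.save(out_path)

# Example with placeholder filenames:
# spatial_upscale("frame_1080p.png", "frame_4k_upscaled.png", scale=2.0)
```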

FSR can look better than native with TAA, but only because TAA is a crappy blur-fest. FSR doesn't look better than native with AMD's CAS, because CAS is basically what FSR does, only without the upscaling. I actually like CAS way more than FSR, and hopefully AMD can work more on that sort of algorithm. Actually, we just need TAA to go away and maybe get replaced by something better.
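For illustration only, here's a toy "contrast-adaptive" sharpen in Python (NumPy/SciPy/Pillow). It is not AMD's actual CAS kernel; it just shows the concept of sharpening at native resolution while backing off where local contrast is already high, to avoid ringing. All names and constants are my own choices.

```python
# Toy "contrast-adaptive" sharpen at native resolution. NOT AMD's real CAS
# math -- just an illustration: sharpen less where local contrast is already
# high, and keep the output at the same resolution as the input.
import numpy as np
from PIL import Image
from scipy.ndimage import maximum_filter, minimum_filter, uniform_filter

def toy_adaptive_sharpen(src_path: str, out_path: str, strength: float = 0.5) -> None:
    img = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.float32) / 255.0
    luma = img @ np.array([0.299, 0.587, 0.114], dtype=np.float32)
    # Local contrast from the 3x3 min/max of luma.
    contrast = maximum_filter(luma, size=3) - minimum_filter(luma, size=3)
    # Less sharpening where contrast is already high.
    amount = strength * np.clip(1.0 - contrast, 0.0, 1.0)
    blurred = np.stack([uniform_filter(img[..., c], size=3) for c in range(3)], axis=-1)
    sharpened = img + amount[..., None] * (img - blurred)
    out = (np.clip(sharpened, 0.0, 1.0) * 255.0).round().astype(np.uint8)
    Image.fromarray(out).save(out_path)

# toy_adaptive_sharpen("native_frame.png", "native_frame_sharpened.png")
```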

DLSS was never EVER discussed as having anything to do with DSR, though. Dynamic Super Resolution was for people who had GPU power to spare and wanted a way to get supersampling. DLSS was originally announced in two modes: the regular one we normally see, and DLSS 2x, which was just DLSS without the upscaling. DSR was always about running at higher than native resolution and then downsampling to get superior image quality. DLSS might be a network trained on a similar idea, I suppose, but it operates in reverse: take the base image and then try to determine what the original higher quality image would have looked like. In real time, at >60 fps.
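To show the direction DSR works in, here's a tiny Pillow sketch that takes an above-native render and filters it back down to native resolution. DSR itself does this in the driver with its own smoothness filter; this only illustrates the downsampling idea, with placeholder filenames.

```python
# The DSR direction, sketched: take a frame rendered above native resolution
# and filter it back down to native. (Placeholder filenames; Lanczos stands
# in for the driver's own downsampling filter.)
from PIL import Image

def downsample_to_native(hires_path: str, out_path: str, native=(1920, 1080)) -> None:
    hires = Image.open(hires_path)  # e.g. a 4K (3840x2160) render
    # Combining many rendered samples per output pixel is what gives the
    # supersampled look.
    hires.resize(native, resample=Image.LANCZOS).save(out_path)

# downsample_to_native("render_4k.png", "display_1080p.png")
```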

It's a difficult problem, which is why it needs tensor cores. At 100 fps, trying to "intelligently" upscale from 1080p to 4K just isn't going to happen without a ton of compute. But look at some of the AI image upscalers on the web that will take a source 640x360 image and turn it into a relatively nice looking 1920x1080 image. Sure, it's hard work and takes a server a minute or more, but that's probably because it's handling a bunch of images from lots of people. Anyway, I look forward to the day when that can all happen in real time in a PC game.
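Some back-of-the-envelope numbers make the compute problem obvious: 1080p to 4K at 100 fps means producing roughly 8.3 million output pixels inside a budget of about 10 ms per frame, and that budget is shared with rendering the frame in the first place.

```python
# Rough arithmetic for real-time 1080p -> 4K upscaling at 100 fps.
src_pixels = 1920 * 1080               # 2,073,600 rendered input pixels
dst_pixels = 3840 * 2160               # 8,294,400 output pixels (4x the input)
fps = 100
frame_budget_ms = 1000 / fps           # 10 ms per frame, shared with the game's rendering
pixels_per_second = dst_pixels * fps   # ~829 million upscaled pixels every second

print(f"{dst_pixels:,} pixels/frame, {frame_budget_ms:.1f} ms budget, "
      f"{pixels_per_second / 1e6:.0f}M pixels/s")
```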
 
Here's a fun example, though. I started with a 1080p source image from Battlefield 2042. (Sorry, these may be large, but let's see how it goes.)
Edit: Looks like the forum ditches the 4K images and replaces them with 1080p versions, processed on the server. Sigh. Oh well, you can still see interesting comparisons with the Deep Image vs. Photoshop 720p upscales.

[Image: BF2042 1080p source screenshot]


Then I upscaled that to 4K in Adobe Photoshop using Nearest Neighbor...

[Image: 1080p upscaled to 4K, Nearest Neighbor]

...as well as the "Preserve Details" option.

[Image: 1080p upscaled to 4K, Preserve Details]

That's "DLSS Performance" or "FSR Performance" mode equivalent, though it runs at about 3 seconds per image upscale. LOL. Next, I resized the image to 720p:

[Image: source resized down to 720p]

Now I'll upscale that to 4K with Photoshop "Preserve Details" -- the equivalent of DLSS Ultra Performance mode.

[Image: 720p upscaled to 4K, Preserve Details]

Hopefully it should be obvious (especially if you look at the full size images) that this has a severe drop in quality compared to the previous upscale -- there's not enough information. It's interesting to see what a good AI upscale can do, though. Here's 720p to 4K using Deep Image AI:

[Image: 720p upscaled to 4K, Deep Image AI]

That is decidedly not perfect, particularly on high contrast diagonals, and it took nearly a minute for the Deep Image server to generate that image. Still, you can see how with a bit of tuning it might look nearly as good as native 4K. That's the impetus behind DLSS, in essence, with the "tuning" involving the use of data from previous frames to help detect and anti-alias those high contrast diagonals. That's something FSR doesn't even attempt to do, which is very obvious when games are in motion.
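For anyone who wants to roughly reproduce the comparison above without Photoshop, here's a short Pillow script. The source filename matches the attached image; nearest-neighbor and Lanczos stand in for Photoshop's "Nearest Neighbor" and "Preserve Details" resamplers, and nothing here uses temporal data, so it's closer in spirit to FSR/NIS than to DLSS.

```python
# Rough, Photoshop-free version of the comparison above, using Pillow.
from PIL import Image

SRC = "BF2042-1080p.jpg"                  # 1920x1080 source frame (see attachment)
FOUR_K, HD_720 = (3840, 2160), (1280, 720)

src = Image.open(SRC)

# 1080p -> 4K with two spatial resamplers (the 2x-per-axis, "Performance"-style case).
src.resize(FOUR_K, resample=Image.NEAREST).save("bf2042-4k-nearest.png")
src.resize(FOUR_K, resample=Image.LANCZOS).save("bf2042-4k-lanczos.png")

# Downsample to 720p, then push that back up to 4K (the 3x-per-axis,
# "Ultra Performance"-style case) -- much less information to work with.
low = src.resize(HD_720, resample=Image.LANCZOS)
low.save("bf2042-720p.png")
low.resize(FOUR_K, resample=Image.LANCZOS).save("bf2042-720p-to-4k.png")
```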
 

Attachments

  • BF2042-1080p.jpg (382.3 KB)
Lossless Scaling recently got updated to support Nvidia's own spatial image upscaler, allowing anyone with any GPU to run Nvidia's newly updated upscaler in any game.

Nvidia Image Scaling Runs On Radeon GPUs Thanks to Lossless Scaling: Read more
Once again AMD set the trend... like Intel, which launches a new CPU every year trying to beat AMD and after the third attempt barely wins the race. FSR set the pace, now nVidia is jumping on and claiming all the fame, while AMD sits back, again, and just smiles at the politicians.
 
One question: can the "Lossless Scaling" app be used as a video or image upscaler outside of games, like Topaz's Gigapixel AI and Video Enhance AI?
 
Once again AMD set the trend... like Intel, which launches a new CPU every year trying to beat AMD and after the third attempt barely wins the race. FSR set the pace, now nVidia is jumping on and claiming all the fame, while AMD sits back, again, and just smiles at the politicians.

Nope, it was Nvidia. AMD would not have developed things like FidelityFX CAS or FSR if not for what Nvidia has done with DLSS. NIS? That's more of an Nvidia attempt to tell the public that FSR is not even competing with DLSS.
 
I'm currently Pro-AMD and that's not something I'd say... Less generalising please. :)
Not all of them make it, but the die-hards do. Like I said, and it was directed at the poster, claims about Nvidia intentionally making older GPUs run slower (through driver updates) have been debunked. Suggesting it happens, which is what he did, is FUD.