News: Nvidia Reveals DLSS 3.5: AI-Powered Ray Reconstruction

Status: Not open for further replies.

rluker5

Distinguished
Jun 23, 2014
Now that they've come out with it, it seems like an obvious application for AI.
I like that I'll be able to use it with my 3080 even though it will be slower than the 40 series.

I wonder what cumbersome effect will be next? AI fog?
 
Now that they've come out with it, it seems like an obvious application for AI.
I like that I'll be able to use it with my 3080 even though it will be slower than the 40 series.

I wonder what cumbersome effect will be next? AI fog?
It's surprising to me that this one took so long to materialize. Probably it was more about getting a good algorithm (in terms of quality and performance) in place, as well as the rest of the framework. Since this only benefits ray tracing games, that's a relatively small subset of the overall market. Traditional denoising algorithms have already been tackling this problem, but like so many other things, tying this to AI and deep learning is the new hotness I guess.
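For readers wondering what those traditional denoisers actually do, here's a minimal C++ sketch of the usual hand-tuned approach: a cross-bilateral filter over noisy per-pixel radiance, guided by G-buffer normals and depth (real implementations layer temporal accumulation and variance guidance on top). The struct layout and every weighting constant below are illustrative assumptions rather than any particular engine's code, which is sort of the point: each constant is a knob somebody had to tune by hand.

```cpp
// Minimal sketch of a hand-tuned, edge-aware denoiser: a cross-bilateral filter
// over noisy per-pixel radiance, guided by G-buffer normals and depth.
// Every constant below is a heuristic somebody picked by hand.
#include <cmath>
#include <vector>

struct Pixel {
    float r, g, b;        // noisy path-traced radiance
    float nx, ny, nz;     // surface normal from the G-buffer
    float depth;          // linear depth from the G-buffer
};

std::vector<Pixel> denoise(const std::vector<Pixel>& in, int w, int h, int radius = 2) {
    std::vector<Pixel> out = in;
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            const Pixel& c = in[y * w + x];
            float sumR = 0, sumG = 0, sumB = 0, sumW = 0;
            for (int dy = -radius; dy <= radius; ++dy) {
                for (int dx = -radius; dx <= radius; ++dx) {
                    int sx = x + dx, sy = y + dy;
                    if (sx < 0 || sx >= w || sy < 0 || sy >= h) continue;
                    const Pixel& s = in[sy * w + sx];
                    // Hand-tuned guide weights: keep edges where normal or depth changes.
                    float nDot   = c.nx * s.nx + c.ny * s.ny + c.nz * s.nz;
                    float wNorm  = std::pow(std::fmax(nDot, 0.0f), 32.0f);          // tuned exponent
                    float wDepth = std::exp(-std::fabs(c.depth - s.depth) * 8.0f);  // tuned falloff
                    float wDist  = std::exp(-float(dx * dx + dy * dy) / 4.0f);      // spatial Gaussian
                    float wgt = wNorm * wDepth * wDist;
                    sumR += s.r * wgt;  sumG += s.g * wgt;  sumB += s.b * wgt;  sumW += wgt;
                }
            }
            if (sumW > 0) {
                out[y * w + x].r = sumR / sumW;
                out[y * w + x].g = sumG / sumW;
                out[y * w + x].b = sumB / sumW;
            }
        }
    }
    return out;
}
```

Ray Reconstruction swaps this kind of filter for a network trained on reference renders, so the tuning effectively happens offline during training rather than per game.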
 

randyh121

Prominent
Jan 3, 2023
This is awesome. Really no reason to ever get any AMD product when their 'software' (if you can really even call it that) comes nowhere NEAR what NVIDIA is doing.
Absolutely amazing. Might finally upgrade my 2070S if they aren't greedy with pricing on the 5000 series!
 
Reactions: kiniku

user7007

Commendable
Mar 9, 2022
I'm not yet convinced ray tracing is a killer app / must-have feature... but DLSS 3.5 seems like a decent upgrade to me if you're using ray tracing. If it's not slower (it might even be faster) and looks a bit better, that seems like a win.
 

CharlesOCT

Prominent
Jan 4, 2023
Pretty awesome that this improvement now gives you directionally shaped headlights... if you're playing RT Overdrive in Cyberpunk on a $1600 4090 at 50 fps at 1440p. Thanks, Nvidia!
 

bit_user

Polypheme
Ambassador
@JarredWaltonGPU thanks for posting the entire slide deck, as usual. I often find it worthwhile to click through them all.

Traditional denoising algorithms have already been tackling this problem, but like so many other things, tying this to AI and deep learning is the new hotness I guess.
Now, in reference to the above, I'd point out they walk through several deficiencies of traditional denoising algorithms, in those very slides.

(Four attached slides from Nvidia's deck walking through the shortcomings of hand-tuned denoisers and the corresponding Ray Reconstruction comparisons.)
So, not only does DLSS 3.5 do a better job than hand-tuned denoisers, but it also saves developers the time & effort of trying to hand-tune their denoisers to work even as well as the examples we see above! For game devs, the only negative would seem to be the limited hardware support.

Basically, any time you're trying to solve a problem using heuristics, rather than a closed-form solution, deep learning is generally going to do a better job (provided an adequately sophisticated model & enough training data).
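To put that in engine terms, here's a hypothetical C++ sketch of the swap; the interface and class names are my own stand-ins, not Nvidia's actual NGX/Streamline API. The inputs (noisy radiance plus G-buffer guides and motion vectors) stay the same either way; what changes is whether the quality comes from hand-picked constants or from weights trained offline on reference renders.

```cpp
// Hypothetical engine-side abstraction, purely for illustration; DLSS 3.5's
// real integration goes through Nvidia's own SDK, which is not shown here.
#include <vector>

struct DenoiseInputs {                       // the guides are the same either way
    std::vector<float> noisyRadiance;        // low-sample-count path-traced color
    std::vector<float> normals, depth, motionVectors;
    int width = 0, height = 0;
};

class Denoiser {
public:
    virtual ~Denoiser() = default;
    virtual std::vector<float> resolve(const DenoiseInputs& in) = 0;
};

// Heuristic path: quality depends on hand-picked constants (normal-similarity
// exponent, depth falloff, history blend factor, ...) tuned per game and per effect.
class HandTunedDenoiser : public Denoiser {
public:
    std::vector<float> resolve(const DenoiseInputs& in) override {
        // ... cross-bilateral filter + temporal accumulation, as sketched earlier in the thread ...
        return in.noisyRadiance;             // stub: the real filter replaces this
    }
};

// Learned path: the "tuning" lives in weights trained offline against reference renders.
class LearnedDenoiser : public Denoiser {
public:
    std::vector<float> resolve(const DenoiseInputs& in) override {
        return runInference(in);             // stand-in for a call into a vendor SDK / ML runtime
    }
private:
    std::vector<float> runInference(const DenoiseInputs& in) { return in.noisyRadiance; } // stub
};
```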
 

bit_user

Polypheme
Ambassador
Yawn. The frames are still fake.
So is graphics.

I find it funny that there was virtually no outrage over techniques like VRS, even though it's also a smart interpolation technique. I know frame generation is more controversial, but if you're just talking about smart upsampling/reconstruction, the distinction seems fairly arbitrary.
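For anyone who hasn't looked at VRS closely, here's a rough C++ sketch of what 2x2 variable rate shading effectively does; the contrast threshold and the shade()/tileContrast() stubs are purely illustrative, not any real driver or hardware logic. In low-detail tiles, one real shading sample stands in for four output pixels.

```cpp
// Loose sketch of 2x2 variable rate shading: decide per tile whether full-rate
// shading is needed, and if not, shade once and reuse the result for all four pixels.
#include <cmath>
#include <vector>

// Placeholder for the per-pixel shading work (lighting, texturing, ...).
float shade(int x, int y) { return 0.5f + 0.5f * std::sin(0.1f * x) * std::cos(0.1f * y); }

// Cheap detail metric for a 2x2 tile, e.g. luminance contrast from a prior frame.
float tileContrast(const std::vector<float>& prevLuma, int w, int x, int y) {
    float a = prevLuma[y * w + x],       b = prevLuma[y * w + x + 1];
    float c = prevLuma[(y + 1) * w + x], d = prevLuma[(y + 1) * w + x + 1];
    float lo = std::fmin(std::fmin(a, b), std::fmin(c, d));
    float hi = std::fmax(std::fmax(a, b), std::fmax(c, d));
    return hi - lo;
}

void renderWithVrs(std::vector<float>& out, const std::vector<float>& prevLuma, int w, int h) {
    const float kThreshold = 0.05f;                  // hand-picked, like any heuristic
    for (int y = 0; y + 1 < h; y += 2) {
        for (int x = 0; x + 1 < w; x += 2) {
            if (tileContrast(prevLuma, w, x, y) > kThreshold) {
                // High-detail tile: shade every pixel (1x1 rate).
                for (int dy = 0; dy < 2; ++dy)
                    for (int dx = 0; dx < 2; ++dx)
                        out[(y + dy) * w + (x + dx)] = shade(x + dx, y + dy);
            } else {
                // Low-detail tile: one real sample covers four pixels (2x2 rate).
                float s = shade(x, y);
                for (int dy = 0; dy < 2; ++dy)
                    for (int dx = 0; dx < 2; ++dx)
                        out[(y + dy) * w + (x + dx)] = s;
            }
        }
    }
}
```

So a good chunk of on-screen pixels were already not individually "real" shaded samples long before DLSS showed up.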

I'd rather take ray tracing with DLSS 3.5 than traditional polygon-driven rendering. That's a lot more faithful to the underlying physics, even with DLSS.

Oh, and one more thing... your brain is also doing neural reconstruction and interpolation. So, there's that.
 

bit_user

Polypheme
Ambassador
How exactly does Nvidia coming up with a new feature or hardware keep anyone "locked in"?
As mentioned in the article, games must be written to use the technology, and most developers probably won't put in the effort to support DLSS plus the corresponding AMD and Intel counterparts.

In other words, it forces game developers to pick a side. After investing in one technology, its learning curve, and adapting your code to use it, now it's hard to switch. That's the "lock-in" effect. And once they lock in the developers, the gamers will follow.
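To illustrate the cost side of that choice, here's a hypothetical C++ sketch of the kind of abstraction layer a studio has to build and maintain if it wants to support all three vendors; the class and function names are mine, and none of the real DLSS, FSR, or XeSS APIs are reproduced here. Every backend brings its own init sequence, resource requirements, quality modes, and QA burden, which is exactly why many teams end up shipping only one.

```cpp
// Hypothetical multi-vendor upscaler wrapper; the backend classes would wrap
// the real DLSS, FSR, and XeSS SDKs, whose actual APIs are not shown here.
#include <memory>
#include <stdexcept>
#include <string>
#include <vector>

struct UpscaleInputs {
    std::vector<float> color, depth, motionVectors;
    int renderWidth = 0, renderHeight = 0;   // internal resolution
    int outputWidth = 0, outputHeight = 0;   // display resolution
};

class Upscaler {
public:
    virtual ~Upscaler() = default;
    virtual std::vector<float> upscale(const UpscaleInputs& in) = 0;
};

// One backend per vendor SDK: each is a separate integration to write, test, and maintain.
class DlssBackend : public Upscaler {
public: std::vector<float> upscale(const UpscaleInputs& in) override { return in.color; } // stub
};
class FsrBackend : public Upscaler {
public: std::vector<float> upscale(const UpscaleInputs& in) override { return in.color; } // stub
};
class XessBackend : public Upscaler {
public: std::vector<float> upscale(const UpscaleInputs& in) override { return in.color; } // stub
};

std::unique_ptr<Upscaler> makeUpscaler(const std::string& gpuVendor) {
    if (gpuVendor == "nvidia") return std::make_unique<DlssBackend>();
    if (gpuVendor == "amd")    return std::make_unique<FsrBackend>();
    if (gpuVendor == "intel")  return std::make_unique<XessBackend>();
    throw std::runtime_error("no upscaler backend for vendor: " + gpuVendor);
}
```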
 

bit_user

Polypheme
Ambassador
I'm not yet convinced ray tracing is a killer app / must-have feature...
What I expect will happen is that once it becomes common enough, game engines will focus most of their effects and quality enhancements on their ray tracing backend. You'll still be able to play on non-RT hardware, but the gap in quality will increase more and more.
 
Reactions: rluker5

rluker5

Distinguished
Jun 23, 2014
It's surprising to me that this one took so long to materialize. Probably it was more about getting a good algorithm (in terms of quality and performance) in place, as well as the rest of the framework. Since this only benefits ray tracing games, that's a relatively small subset of the overall market. Traditional denoising algorithms have already been tackling this problem, but like so many other things, tying this to AI and deep learning is the new hotness I guess.
An awful lot of traditional rasterization is just kind of hacked together and has both good and bad. I'm sure there will be limitations and flaws with using AI approximations as well, but there may also be some things that are better.

This new AI hotness, combined with Nvidia's current market dominance, could lead to a new wave of GameWorks-style effects.
Imagine smoky bullets in the wind with mixed lighting, using moving smoke for cover in stealth, highly interactive and destructible environments, cheap and good hair, etc. If these effects are built with the assistance of the AI hardware in the Nvidia cards that make up most of the market, and they look good, they'll be popular; the user base will be large enough for many devs to put them in, and they'll also be Nvidia exclusives.

I'm not saying that is entirely good or bad, but if I see new effects I like, I'd rather enjoy them than protest.
 
What I expect will happen is that once it becomes common enough, game engines will focus most of their effects and quality enhancements on their ray tracing backend. You'll still be able to play on non-RT hardware, but the gap in quality will increase more and more.
Don't expect that much; a few handpicked games will support it and that's it.
Consoles are still over 50% of the player base, and that's where the focus goes.
 
Reactions: LoordOfTauCeti

umeng2002_2

Commendable
Jan 10, 2022
It's a misleading description by nVidia. It doesn't use AI to make more rays. It's just a more finely tuned AI denoiser to look for RT artifacts.

This isn't an implementation of ReSTIR-style importance sampling.
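For anyone unfamiliar with the distinction being drawn: ReSTIR-style resampled importance sampling decides which light samples a pixel actually traces, i.e. it steers the rays, while a denoiser (learned or not) only cleans up the image after the rays come back. Here's a rough C++ sketch of the weighted-reservoir update at the core of ReSTIR, after Bitterli et al. 2020, with the candidate weighting heavily simplified and the lighting terms left as placeholder functions.

```cpp
// Rough sketch of the weighted-reservoir update used by ReSTIR-style
// resampled importance sampling (after Bitterli et al. 2020, simplified).
#include <random>

struct Reservoir {
    int   chosen    = -1;     // index of the candidate currently held
    float weightSum = 0.0f;   // running sum of candidate weights
    int   seen      = 0;      // number of candidates streamed through (M)

    void update(int candidate, float weight, std::mt19937& rng) {
        weightSum += weight;
        ++seen;
        std::uniform_real_distribution<float> u(0.0f, 1.0f);
        // Keep the new candidate with probability weight / weightSum.
        if (weightSum > 0.0f && u(rng) < weight / weightSum)
            chosen = candidate;
    }
};

// Usage: stream candidate light samples through the reservoir. Each candidate's
// weight is (target function) / (source pdf); both are placeholders here,
// standing in for the real lighting terms.
int pickLightSample(int numCandidates, std::mt19937& rng,
                    float (*targetFn)(int), float (*sourcePdf)(int)) {
    Reservoir r;
    for (int i = 0; i < numCandidates; ++i)
        r.update(i, targetFn(i) / sourcePdf(i), rng);
    return r.chosen;
}
```

In those terms, the post's point is that Ray Reconstruction lives at the denoising stage, not the sampling stage.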
 

PEnns

Reputable
Apr 25, 2020
DLSS is just another version of HP's ink cartridge scheme: a bone thrown to its customers.

And it's amazing how much AI-flavored fluff companies have thrown at us in such a short period of time!!
 
Reactions: NeoMorpheus
An awful lot of traditional rasterization is just kind of hacked together and has both good and bad. I'm sure there will be limitations and flaws with using AI approximations as well, but there may also be some things that are better.

This new AI hotness, combined with Nvidia's current market dominance, could lead to a new wave of GameWorks-style effects.
Imagine smoky bullets in the wind with mixed lighting, using moving smoke for cover in stealth, highly interactive and destructible environments, cheap and good hair, etc. If these effects are built with the assistance of the AI hardware in the Nvidia cards that make up most of the market, and they look good, they'll be popular; the user base will be large enough for many devs to put them in, and they'll also be Nvidia exclusives.

I'm not saying that is entirely good or bad, but if I see new effects I like, I'd rather enjoy them than protest.
The thing is, I was at the RTX 20-series launch at Gamescom 2018. I remember Nvidia showing slides at one point (maybe it was post-Gamescom) where they had Finding Nemo and some other Pixar stuff and were talking about AI-based denoising. So, this stuff isn't a new idea at all; Nvidia had mentioned it back in the 20-series days. (That's an article I helped write in 2019, but I'm pretty sure the Pixar stuff was shown at the 20-series reveal.)

In fact, I'll go a step further. I remember asking, "So are the tensor cores being used to do this denoising?" The answer was no, not yet at least. I've asked variants of that question many times over the past five years. "Why isn't the denoising being handled via tensor cores?" Apparently the answer was that it wasn't ready yet, and I figured as much, but it's still surprising to me that it took this long.
 
Reactions: KyaraM and rluker5

bit_user

Polypheme
Ambassador
The developers are not locked into anything against their will by Nvidia.
No one said there was. Vendor lock-in usually works by baiting the victim to adopt a proprietary technology. They can quit any time they like, but the idea is to make sure the immediate costs of doing so always outweigh the short-term benefits of breaking free. A bit like the Hotel California.

They simply offer their product, as does every other manufacturer and developer.
Some competing technologies from Intel and AMD aren't tied to their hardware (although they tend to be better optimized for it). So, it's not as if there's equivalence between all of them.
 

bit_user

Polypheme
Ambassador
In fact, I'll go a step further. I remember asking, "So are the tensor cores being used to do this denoising?" The answer was no, not yet at least. I've asked variants of that question many times over the past five years. "Why isn't the denoising being handled via tensor cores?" Apparently the answer was that it wasn't ready yet, and I figured as much, but it's still surprising to me that it took this long.
I'd guess the reason it took so long is that they probably prioritized the general case of improving DLSS across a wide variety of content before sinking a lot of time into optimizing it for ray tracing & global illumination. In fact, such early hardware (the RTX 20-series) is probably still on the slow end for global illumination, even with this new ray reconstruction.
 