Discussion NVIDIA announces DLDSR (Deep Learning Dynamic Super Resolution), an AI-powered tech!

It appears this announcement came out of nowhere, but this looks like a direct response to the recently announced AMD RSR tech, IMO.

NVIDIA has dropped some unexpected news on a new graphics-enhancing feature that will go live with the next Game Ready drivers, dubbed DLDSR (Deep Learning Dynamic Super Resolution). It's exactly what it sounds like: an AI-powered version of Dynamic Super Resolution (DSR), which has been available via the Control Panel for several years.

According to NVIDIA, DLDSR can be up to twice as efficient while maintaining similar quality. In NVIDIA's example image, we can see Prey running at nearly the same frame rate as native 1080p while actually rendering at 1620p for crisper definition. DLDSR will be available to GeForce RTX owners, as it is powered by the Tensor Cores, and should work in 'most' games.

Additionally, NVIDIA partnered with renowned ReShade modder Pascal Gilcher (also known as Marty McFly) to implement modified versions of his depth-based filters through GeForce Experience's Freestyle overlay. That includes the popular ray-traced global illumination shader.
  • SSRTGI (Screen Space Ray Traced Global Illumination), commonly known as the “Ray Tracing ReShade Filter” enhances lighting and shadows of your favorite titles to create a greater sense of depth and realism.
  • SSAO (Screen Space Ambient Occlusion) emphasizes the appearance of shadows near the intersections of 3D objects, especially within dimly lit/indoor environments.
  • Dynamic DOF (Depth of Field) applies bokeh-style blur based on the proximity of objects within the scene, giving your game a more cinematic, suspenseful feel.

Here's the press release statement, quoting NVIDIA:

"Advanced Freestyle Filters.

Our January 14th Game Ready Driver updates the NVIDIA DSR feature with AI. DLDSR (Deep Learning Dynamic Super Resolution) renders a game at higher, more detailed resolution before intelligently shrinking the result back down to the resolution of your monitor. This downsampling method improves image quality by enhancing detail, smoothing edges, and reducing shimmering.

DLDSR improves upon DSR by adding an AI network that requires fewer input pixels, making the image quality of DLDSR 2.25X comparable to that of DSR 4X, but with higher performance. DLDSR works in most games on GeForce RTX GPUs, thanks to their Tensor Cores."
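To put those factors in concrete terms, here's a quick back-of-the-envelope sketch (my own, not from NVIDIA's materials): DSR/DLDSR factors multiply the total pixel count, so the per-axis scale is the square root of the factor.

```python
import math

def dsr_render_resolution(native_w, native_h, dsr_factor):
    """Given a native resolution and a DSR factor (a total-pixel
    multiplier), return the internal render resolution. The per-axis
    scale is the square root of the factor, since the factor applies
    to pixel count, not to each axis."""
    scale = math.sqrt(dsr_factor)
    return round(native_w * scale), round(native_h * scale)

# DLDSR 2.25x on a 1080p display: 1.5x per axis -> 2880x1620 ("1620p",
# matching the resolution quoted for the Prey example)
print(dsr_render_resolution(1920, 1080, 2.25))  # (2880, 1620)

# Legacy DSR 4x: 2x per axis -> 3840x2160 (4K)
print(dsr_render_resolution(1920, 1080, 4.0))   # (3840, 2160)
```

So NVIDIA's claim is essentially that an AI-assisted downsample from 2880x1620 can look comparable to a plain downsample from 3840x2160, while pushing far fewer pixels.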
 
What boggles my mind is why this wasn't a thing before. If DLSS originally required 64x SSAA to train the AI on, why couldn't it be applied to enhance the quality of the image at 100% rendering scale?

I'll just quietly file this under the same category as "why did it take NVIDIA so long to support VESA adaptive sync?"
 

jasonf2

Honorable
Oct 11, 2015
What boggles my mind is why this wasn't a thing before. If DLSS originally required 64x SSAA to train the AI on, why couldn't it be applied to enhance the quality of the image at 100% rendering scale?

I'll just quietly file this under the same category as "why did it take NVIDIA so long to support VESA adaptive sync?"
My guess is that AMD's release of similar tech is just forcing NVIDIA's hand here. They are literally playing tit for tat without a hardware upgrade. This will represent a pretty significant performance boost for lower- and mid-tier cards with Tensor Core support. Without it, though, AMD RSR would have had a competitive edge in the midrange field.

On the VESA thing: why support a standard when you already have a proprietary tech that does the same thing, which people have to pay extra for?
 

renz496

Champion
My guess is that the AMD release of similar tech is just forcing Nvidia's hand here. They are literally playing tit for tat without hardware upgrade. This will represent a pretty significant performance boost for lower and mid tier cards with the tensor core support. Without it though AMD RSR would have left a competitive edge in the midrange field.

To the vesa thing. Why support a standard when you already have a proprietary tech that does the same thing that people have to pay extra for?
DSR does the opposite of RSR; NVIDIA just added machine learning to it. It's a feature you use when you already have extra performance to tap, and with deep learning they've simply made it less demanding. RSR/FSR is used when you don't have enough performance even at native res.
 
My guess is that the AMD release of similar tech is just forcing Nvidia's hand here. They are literally playing tit for tat without hardware upgrade. This will represent a pretty significant performance boost for lower and mid tier cards with the tensor core support. Without it though AMD RSR would have left a competitive edge in the midrange field.
I don't think NVIDIA's hand was forced. I'd have to see if it's a thing, but if DSR can be combined with the image sharpening/scaling feature, then it's basically RSR.

To the vesa thing. Why support a standard when you already have a proprietary tech that does the same thing that people have to pay extra for?
Because it allows NVIDIA to make the competition less appealing. Why buy AMD products when NVIDIA offers both FreeSync and G-Sync support? People like options, don't they? So why not buy the product that gives you more?
 

jasonf2

Honorable
Oct 11, 2015
Because it allows NVIDIA to make the competition less appealing. Why buy AMD products when NVIDIA offers both FreeSync support and G-Sync support? People like options don't they? So why not buy the product that gives you more?

Sounds good in theory, but in reality NVIDIA sells the chips that make G-Sync happen. When they have a performance advantage with their video card, that will sell the card regardless; only supporting G-Sync then seals the deal for the sale of the monitor that has G-Sync. Supporting an open standard only puts them in a spot where they have to pay additional licensing fees.
 

MasterMadBones

Distinguished
Dec 26, 2012
So if I understood this correctly, it's like turning on DSR and DLSS at the same time, but now somehow in every game? I suppose the reason it works is that the image quality can't possibly be worse than native when it's downsampled, but I still doubt it will be noticeably better than NIS + DSR. Probably the main thing it has going for it is the ease of setup, because NIS alone is a lot of work to get set up. I'm not familiar with it, but maybe it doesn't even support resolutions above native.

For those wondering, combining RSR and VSR does the same thing for AMD, although obviously the tech is different.
 
Sounds good in theory but in reality Nvidia is selling the chips that make G-sync happen. So when they have a product performance advantage with their video card that is going to sell the card regardless only supporting g-sync will seal the deal for the monitor sale that has g-sync. Supporting an open standard only puts them in a spot where they have to pay additional licensing fees.
And G-Sync is sold as a premium product; it always has been. The other thing with G-Sync is that it still offers objectively better results than most FreeSync monitors. For example, the Samsung Odyssey G7, one of the top-rated gaming monitors, only works down to 60 FPS with FreeSync, and this seems to be the lower limit for a lot of high-refresh-rate FreeSync monitors. ASUS's PG279Q, which was also a top-rated gaming monitor, has G-Sync and works down to 1 FPS.

And you might scoff at the idea of NVIDIA having to test FreeSync monitors so they can be labeled as G-Sync compatible, but from what I gathered from other places, cheap FreeSync monitors actually suck at doing their job.
 
So if I understood this correctly, it's like turning on DSR and DLSS at the same time, but now somehow in every game?
Yes, it should work in a similar manner, but I'm not sure whether we also have to enable DLSS quality settings separately in-game. NVIDIA exposes the DSR option via the Control Panel, so maybe DLDSR is enabled via the NVCP as well. I could be wrong though.
 
