[SOLVED] Question about DLSS please.

U6b36ef

I understand the basic principle of DLSS (and AMD FSR).

However, I can't work out one particular detail: what resolution of game files DLSS draws from.

E.g. if I am gaming on a 1440p monitor and I use DLSS Quality, DLSS will render the frames at approximately 1080p and scale them up to 1440p. (For now I am ignoring the exact resolutions and aspect ratios, in that 1440p is not a direct upscaling of 1080p, etc.)
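To put rough numbers on that "approximately", here is a quick Python sketch, assuming the commonly cited per-axis DLSS scale factors (Quality 2/3, Balanced 0.58, Performance 1/2, Ultra Performance 1/3; the exact values can vary by game and SDK version):

# Rough sketch of DLSS internal render resolutions at a 2560x1440 output.
# The scale factors below are the commonly cited per-axis values, not official
# constants pulled from any particular game.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(2560, 1440, mode)
    print(f"{mode:>17}: {w}x{h}")

# Quality at 1440p works out to roughly 1707x960, which is close to, but not
# exactly, 1920x1080.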

My question and confusion is the following.

In this loose example, the image is rendered by DLSS at 1080p (and later upscaled). Does that mean that the DLSS implementation uses the 1080p texture files? Or does it mean that DLSS uses the 1440p texture files and renders them at 1080p?


The reason I am asking is that some games, and maybe all, use more detailed texture files at higher resolutions. ... I keep thinking about this and getting confused.

If DLSS uses the 1080p texture files, then the image is rendered with 1080p textures at 1080p and upscaled. It would be better if DLSS rendered the 1440p texture files at 1080p and then upscaled. ... Otherwise you're effectively playing at 1080p, interpolated to 1440p, and missing the more detailed textures of 1440p.

Of course, someone using a 1440p monitor is likely using a large screen, maybe 27" or 32". So if DLSS draws using 1080p textures, the user gets 1080p textures on a large screen ... effectively leaving detail on the table.

I am probably making no sense though, so please forgive me if I am way off.

Please can someone put me out of my misery by explaining what happens? .... No matter how many times I googled this, I could not get the answer; the answers only explained the basic principle.
 
Solution
In this loose example, the image is rendered by DLSS at 1080p (and later upscaled). Does that mean that the DLSS implementation uses the 1080p texture files? Or does it mean that DLSS uses the 1440p texture files and renders them at 1080p?

The reason I am asking is that some games, and maybe all, use more detailed texture files at higher resolutions. ... I keep thinking about this and getting confused.
There are no "1080p texture files" or "1440p texture files." The game loads in whatever texture quality you told it to use (with some exceptions, like Gears of War 5), regardless of the actual rendering resolution.

If it looks better at a higher screen resolution, it's because, well, it's a higher resolution.

If DLSS uses the 1080p texture files, then the image is rendered with 1080p textures at 1080p and upscaled. It would be better if DLSS rendered the 1440p texture files at 1080p and then upscaled. ... Otherwise you're effectively playing at 1080p, interpolated to 1440p.

I am probably making no sense though.

Please can someone put me out of my misery by explaining what happens? .... No matter how many times I googled this, I could not get the answer; the answers only explained the basic principle.
DLSS is a post-processing effect. That is, it happens after the image has been rendered. So no, it doesn't affect the texture quality, because the rendering engine doesn't know (or care) what the final output resolution really is.
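To make that ordering concrete, here is a toy Python sketch. Every function name here is a stand-in invented for illustration, not any real engine or SDK API; it only tracks buffer sizes.

def render_scene(internal_res):
    # Stand-in for rasterising the scene at the *internal* resolution.
    # Texture and mipmap selection happen in this step, driven by the internal
    # resolution and object distance; the engine does not know or care what
    # the final display resolution will be.
    return {"kind": "color", "size": internal_res}

def upscale(color_buffer, output_res):
    # Stand-in for the upscaling pass: it only ever sees a finished
    # low-resolution image (plus motion vectors and depth in the real thing)
    # and produces a higher-resolution output from it.
    return {"kind": "color", "size": output_res, "rendered_at": color_buffer["size"]}

frame = render_scene((1707, 960))       # internal resolution (roughly 1440p Quality)
final = upscale(frame, (2560, 1440))    # display resolution
print(final)  # {'kind': 'color', 'size': (2560, 1440), 'rendered_at': (1707, 960)}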
 

Do you mean that the texture files are the same whatever the resolution? So the resolution and pixel density are responsible for the image looking better at higher resolutions?

I vaguely remember some screenshots that showed extra detail at higher resolutions. There were some minor details in objects that were simply not there at lower resolutions, as opposed to details just looking poorer at lower resolutions. It left me thinking that (some) games come with different textures for different resolutions. Or maybe I remembered it wrong, and (some) games use somewhat different object detail at different resolutions.

I thought there were just more textures at higher resolution settings, because higher resolutions mean more VRAM used for textures. That is probably why I think the way I do.
 
Yes, whatever texture quality you asked the game to load, it'll load at that quality regardless of rendering resolution.

The reason why something looks better at a higher resolution has to do with how the renderer has to translate what part of the texture you see onto the pixels of the final output. Since what the camera (from the monitor's perspective in game) sees is fixed regardless of resolution, information about the scene has to be blended to fit the pixels of the render. Or basically, take a high-resolution image with lots of fine, crisp detail, then resize it down to various resolution steps. You'll find those fine details get lost the lower in resolution you go.
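A toy numpy version of that "resize it down and the fine detail is gone" point, using simple average pooling as a crude stand-in for rendering at a lower resolution:

import numpy as np

# An 8x8 "image" with one-pixel-wide stripes, i.e. detail right at the pixel limit.
hi = np.zeros((8, 8))
hi[:, ::2] = 1.0

# Halve the resolution by averaging each 2x2 block.
lo = hi.reshape(4, 2, 4, 2).mean(axis=(1, 3))

print(hi[0])  # [1. 0. 1. 0. 1. 0. 1. 0.]  crisp stripes at the higher resolution
print(lo[0])  # [0.5 0.5 0.5 0.5]          the stripes blend into uniform grey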

However, rendering engines do use different resolutions of a texture, through a technique called mipmapping, but this mostly has to do with how far the object is from the camera. There's no point in using a high-resolution texture on an object that's so far away it'll look like a single pixel.
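For what it's worth, a toy sketch of that idea. Real GPUs derive the mip level from screen-space texture coordinate derivatives; this just approximates it with "texels spanned per pixel along one axis".

import math

def mip_level(texels_per_pixel):
    # Level 0 is the full-resolution texture; each higher level halves the
    # width and height of the previous one.
    return max(0.0, math.log2(max(texels_per_pixel, 1.0)))

print(mip_level(1))   # 0.0  object close to the camera: full-resolution texture
print(mip_level(16))  # 4.0  same object far away: a much smaller mip is enough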
 
So I was looking at Wikipedia's entry on DLSS, and it says that NVIDIA recommends adjusting the mipmap bias so that the rendering engine prefers higher-resolution mipmaps than it'd normally use. However, this doesn't mean higher-resolution textures than what you requested are used; it's just that the rendering engine won't start using a lower-resolution mipmap until the point is further away than it normally would be.
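The commonly quoted formula for that bias (hedging a little, since the exact recommendation may differ between DLSS SDK versions) is log2 of the render width over the display width. It comes out negative, which nudges sampling toward larger mips:

import math

def mip_lod_bias(render_width, display_width):
    # A negative bias means: sample higher-resolution mips than the internal
    # render resolution alone would pick, so texture detail better matches the
    # output resolution after upscaling.
    return math.log2(render_width / display_width)

print(round(mip_lod_bias(1707, 2560), 3))  # about -0.585 for DLSS Quality at 1440p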