News: AMD's FSR Uses Lanczos, Just Like Nvidia's 5-Year-Old Sharpening Filter

Here's the thing though, at least to me. Going by the Steam Hardware Survey, 67% of users run at 1920x1080, and GPUs have been capable of 1080p60 gaming for several years now; even the most popular card on the survey, the 5-year-old GTX 1060, can do it with little to no detail compromise in most games.

By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or the built-in subsampling feature some games have, you're making a detail compromise for the sake of FPS. When Nvidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including Tom's Hardware, at how underhanded it was. But now, in 2021, cutting quality in the name of FPS is a desired feature?

So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and iGPUs it's certainly a benefit given their weaker hardware, much akin to how this generation of consoles uses dynamic resolution to maintain FPS, but for discrete-card PC users...
 

vinay2070

Distinguished
Nov 27, 2011
Here's the thing though, at least to me. Going by the Steam Hardware Survey, 67% of users run at 1920x1080, and GPUs have been capable of 1080p60 gaming for several years now; even the most popular card on the survey, the 5-year-old GTX 1060, can do it with little to no detail compromise in most games.

By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or the built-in subsampling feature some games have, you're making a detail compromise for the sake of FPS. When Nvidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including Tom's Hardware, at how underhanded it was. But now, in 2021, cutting quality in the name of FPS is a desired feature?

So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and iGPUs it's certainly a benefit given their weaker hardware, much akin to how this generation of consoles uses dynamic resolution to maintain FPS, but for discrete-card PC users...
Two things.
  1. It is required because ray tracing got introduced.
  2. It makes sense because a few years back games were mostly played at 1080p 60Hz. Now 1440p high refresh rate and 4K are slowly becoming the norm. Upscaling 720p to 1080p might have looked a little rough back then, but upscaling 1080p or 1440p to 4K doesn't look bad (see the quick sketch below).
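
For concreteness, the arithmetic behind point 2, as a quick illustrative sketch: both jumps are 1.5x per axis, but the 1440p source hands the upscaler roughly four times as many pixels to work with.

```python
# Both upscales are 1.5x per axis, but the 4K case starts with ~4x the pixels.
def megapixels(w, h):
    return w * h / 1e6

for (sw, sh), (dw, dh) in [((1280, 720), (1920, 1080)),
                           ((2560, 1440), (3840, 2160))]:
    print(f"{sw}x{sh} -> {dw}x{dh}: {dw / sw:.1f}x per axis, "
          f"source {megapixels(sw, sh):.1f} MP")
# 1280x720 -> 1920x1080: 1.5x per axis, source 0.9 MP
# 2560x1440 -> 3840x2160: 1.5x per axis, source 3.7 MP
```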
 
Here's the thing though, at least to me. Going by the Steam Hardware Survey, 67% of users run at 1920x1080, and GPUs have been capable of 1080p60 gaming for several years now; even the most popular card on the survey, the 5-year-old GTX 1060, can do it with little to no detail compromise in most games.

By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or the built-in subsampling feature some games have, you're making a detail compromise for the sake of FPS. When Nvidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including Tom's Hardware, at how underhanded it was. But now, in 2021, cutting quality in the name of FPS is a desired feature?

So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and iGPUs it's certainly a benefit given their weaker hardware, much akin to how this generation of consoles uses dynamic resolution to maintain FPS, but for discrete-card PC users...

As games advance, a card that could run 1080p maxed out 4 years ago (say, an RX 580) can no longer do so. Details, shaders, and game engines get more complex all the time. I now consider an RX 580 bottom of the barrel for 1080p gaming: acceptable, but entry level.
 

watzupken

Reputable
Mar 16, 2020
In my opinion, Nvidia could have done the same, but of course they will always choose the proprietary route to differentiate themselves from the competition and to entice existing users to upgrade. I don't recall Nvidia releasing a new technology in recent years that is not proprietary. It is usually only after a few years, and under pressure, that they make some technology more open.
 

watzupken

Reputable
Mar 16, 2020
Here's the thing though, at least to me. Going by the Steam Hardware Survey, 67% of users run at 1920x1080, and GPUs have been capable of 1080p60 gaming for several years now; even the most popular card on the survey, the 5-year-old GTX 1060, can do it with little to no detail compromise in most games.

By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or the built-in subsampling feature some games have, you're making a detail compromise for the sake of FPS. When Nvidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including Tom's Hardware, at how underhanded it was. But now, in 2021, cutting quality in the name of FPS is a desired feature?

So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and iGPUs it's certainly a benefit given their weaker hardware, much akin to how this generation of consoles uses dynamic resolution to maintain FPS, but for discrete-card PC users...
The Steam survey is backward looking, and to be honest, recent game titles are getting increasingly taxing on the GPU. FSR basically opens up the possibility of running a game at an acceptable or preferred framerate. In my opinion, it makes sense to use FSR when one would otherwise be forced to drop from 1080p straight down to 720p. FSR at Ultra Quality settings doesn't go all the way down to 720p, which helps preserve some detail while smoothing out the jaggies that are very obvious at low resolution. So overall, I feel it is a win. Even comparing FSR upscaling 720p to 1080p against native 720p, I believe the former is still going to be the better alternative. So no complaints here. If you think that is cheating, then just don't use DLSS or FSR. The feature is a toggle you can turn on or off, and reviewers are transparent about whether DLSS or FSR was used in a review. So I don't see any cheating in this case.
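
To put numbers on "doesn't go all the way down to 720p": AMD's published FSR 1.0 per-axis scale factors make the internal render resolutions easy to compute. A minimal sketch (the mode factors match AMD's documentation; the helper function is just for illustration):

```python
# FSR 1.0 per-axis scale factors, as published by AMD.
FSR_MODES = {"Ultra Quality": 1.3, "Quality": 1.5,
             "Balanced": 1.7, "Performance": 2.0}

def fsr_render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output size and FSR mode."""
    s = FSR_MODES[mode]
    return round(out_w / s), round(out_h / s)

for mode in FSR_MODES:
    print(mode, fsr_render_resolution(1920, 1080, mode))
# Ultra Quality (1477, 831)  -> well above 720p, as noted above
# Quality       (1280, 720)  -> exactly 720p
# Balanced      (1129, 635)
# Performance   (960, 540)
```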
 
Two things.
  1. It is required because ray tracing got introduced.
  2. It makes sense because a few years back games were mostly played at 1080p 60Hz. Now 1440p high refresh rate and 4K are slowly becoming the norm. Upscaling 720p to 1080p might have looked a little rough back then, but upscaling 1080p or 1440p to 4K doesn't look bad.

1) Ray tracing is still in its infancy, and newer, far more efficient methods are being developed. Remember the ReSTIR algorithm and Nvidia's brilliant demonstration video last year, claiming 6-60x better efficiency? Today's ray tracing is still very much a brute-force application.

2) But Tom's Hardware's own review states that the quality loss becomes noticeable at the "Balanced" setting and only gets worse from there. To quote: "There's an often perceptible loss in image quality, especially if you go beyond the Ultra Quality mode," and, "FSR has no qualms about scaling to higher fps, and if you don't mind the loss of image quality, running in Performance mode often more than doubles performance. (So does running at 1080p instead of 4K.)"

Which goes back to my original argument: if you're already compromising on detail and IQ by using FSR, why not just run at a lower resolution, ideally an even fraction of the native resolution so the upscaling isn't fractional? Or better yet, why not cut some detail settings that may not amount to much visually, especially compared to artifacts or blurry textures? All you're really losing then is the meaningless ability to say you're playing at X resolution on Y weak card.
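
On the "even fraction" point: whether an upscale is fractional is a simple ratio check; a minimal illustrative sketch (the helper name is mine, not from any vendor API):

```python
from fractions import Fraction

def per_axis_ratio(native, render):
    """Exact per-axis ratio between native and render resolution.
    An integer ratio means each rendered pixel maps to a whole NxN block of
    screen pixels; a fractional ratio forces blurrier, uneven resampling."""
    return Fraction(native, render)

print(per_axis_ratio(3840, 1920))  # 2    -> clean integer 2x scaling
print(per_axis_ratio(1920, 1600))  # 6/5  -> fractional, needs filtering
```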
 

watzupken

Reputable
Mar 16, 2020
Which goes back to my original argument: if you're already compromising on detail and IQ by using FSR, why not just run at a lower resolution, ideally an even fraction of the native resolution so the upscaling isn't fractional? Or better yet, why not cut some detail settings that may not amount to much visually, especially compared to artifacts or blurry textures? All you're really losing then is the meaningless ability to say you're playing at X resolution on Y weak card.
The reason these upscaling technologies show you 1080p rather than, say, 720p is that the resulting picture quality is supposed to land somewhere in the middle, ideally closer to the higher-resolution IQ. As I mentioned, you have the option of a straight downgrade in resolution or graphical settings to boost performance, but the end result more often than not looks a lot worse than running DLSS or FSR at high quality settings. One immediate issue with dropping from 1080p to native 720p is the jaggies, which are not as bad when running FSR or DLSS.

Image quality loss at lower settings is not unexpected; the technology is basically trying to enhance whatever is on the screen, i.e. 720p or lower. Most reviews have concluded that one should not go lower than the Quality setting, or ideally should stick to Ultra Quality for FSR. In my opinion, at 1080p there is very little reason to use anything below Quality. As resolution decreases, the CPU bottleneck increases, so you are not going to see a meaningful improvement in FPS anyway, despite the image quality hit.
 
Reactions: VforV (Like)
The Steam survey is backward looking, and to be honest, recent game titles are getting increasingly taxing on the GPU. FSR basically opens up the possibility of running a game at an acceptable or preferred framerate. In my opinion, it makes sense to use FSR when one would otherwise be forced to drop from 1080p straight down to 720p. FSR at Ultra Quality settings doesn't go all the way down to 720p, which helps preserve some detail while smoothing out the jaggies that are very obvious at low resolution. So overall, I feel it is a win. Even comparing FSR upscaling 720p to 1080p against native 720p, I believe the former is still going to be the better alternative. So no complaints here. If you think that is cheating, then just don't use DLSS or FSR. The feature is a toggle you can turn on or off, and reviewers are transparent about whether DLSS or FSR was used in a review. So I don't see any cheating in this case.

Nvidia Points Finger at AMD's Image Quality Cheat | Tom's Hardware (tomshardware.com)

That is what I was referring to. There was an earlier incident as well, but that article has no doubt been removed due to age.

Personally I don't care what people do; they can run 4K Performance mode at minimal details on an RTX 3090 for over 9000 FPS if that's what they want. All I'm saying is that it doesn't make sense to me to play at a subsampled higher resolution with visual artifacts rather than at an aspect-proportional lower resolution, especially if your goal is just to say you're playing at 4K on a weak card, since you can do that anyway with AMD's VSR.

And I say this as someone who until last year had a Fury Nano (which AMD unofficially abandoned long ago, so I abandoned them) on a brilliant non-gaming LG 4K display, and 1920x1080 on a 4K display with nothing but the monitor's native scaler looks very nice.
 
The reason these upscaling technologies show you 1080p rather than, say, 720p is that the resulting picture quality is supposed to land somewhere in the middle, ideally closer to the higher-resolution IQ. As I mentioned, you have the option of a straight downgrade in resolution or graphical settings to boost performance, but the end result more often than not looks a lot worse than running DLSS or FSR at high quality settings. One immediate issue with dropping from 1080p to native 720p is the jaggies, which are not as bad when running FSR or DLSS.

But you wouldn't drop from 1920x1080 straight to 1280x720; there are 1600×900, 1680×1050, and 1440×900 in between, and you'd start there, using your monitor's or driver's aspect-ratio scaling option so you don't get stretched garbage.
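
As a side note, only one of those intermediate modes actually shares 1080p's 16:9 shape, which is exactly why the aspect-ratio scaling option matters; a quick illustrative check:

```python
from math import gcd

def aspect(w, h):
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

for w, h in [(1920, 1080), (1600, 900), (1680, 1050), (1440, 900)]:
    print((w, h), aspect(w, h))
# (1920, 1080) 16:9
# (1600, 900)  16:9  -> fills a 16:9 panel with no letterboxing
# (1680, 1050) 8:5   -> 16:10, so aspect-ratio scaling letterboxes it
# (1440, 900)  8:5
```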
 
Nvidia Points Finger at AMD's Image Quality Cheat | Tom's Hardware (tomshardware.com)

That is what I was referring to. There was an earlier incident as well, but that article has no doubt been removed due to age.
It's one thing when a company does less work without disclosing what's going on. If AMD or Nvidia did FSR-like upscaling but didn't tell anyone the GPU was really only rendering at 80% of the resolution or whatever, that's cheating. If it's a setting in a game that you can choose to use -- or not use -- that's completely different. A lot of games have resolution scaling now, but they tend to use temporal scaling rather than something like FSR. Epic claims it has a scaling algorithm that's "at least as good as FSR" or some such, I think, but it also supports FSR in Unreal Engine if devs want that. As long as a game has settings that can be turned on or off -- like variable rate shading -- I'm all for optional features that can improve performance with a negligible loss in image quality.
 

setx

Distinguished
Dec 10, 2014
Disgustingly unprofessional article.

First of all, there is no "Lanczos resampling algorithm": it's just the classical resampling algorithm with a Lanczos kernel. (Personally I like the BC-spline kernel more, but there is nothing wrong with using Lanczos.) And shaming someone for using such a classic algorithm as a base is like shaming them for using addition, because that's even more of an "old technology".
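
For reference, the classic algorithm in question works like this: the Lanczos kernel is L(x) = sinc(x) * sinc(x/a) for |x| < a and 0 otherwise, and resampling is a normalized weighted sum of nearby source samples. A minimal 1-D sketch (illustrative only; real image resamplers apply the same filter separably to rows and then columns):

```python
import numpy as np

def lanczos_kernel(x, a=3):
    """Lanczos kernel: sinc(x) * sinc(x / a) for |x| < a, else 0.
    np.sinc is the normalized sinc, sin(pi*x) / (pi*x)."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

def lanczos_resample_1d(samples, new_len, a=3):
    """Resample a 1-D signal to new_len points with a Lanczos filter."""
    samples = np.asarray(samples, dtype=float)
    old_len = len(samples)
    scale = old_len / new_len
    out = np.empty(new_len)
    for i in range(new_len):
        x = (i + 0.5) * scale - 0.5                   # source position of output i
        base = int(np.floor(x))
        idx = np.arange(base - a + 1, base + a + 1)   # the 2a nearest taps
        w = lanczos_kernel(x - idx, a)
        idx = np.clip(idx, 0, old_len - 1)            # replicate samples at the edges
        out[i] = np.dot(w, samples[idx]) / w.sum()    # normalized weighted sum
    return out

# Upscale a tiny step edge 4x; note the overshoot ("halo") near the edge.
print(np.round(lanczos_resample_1d([0, 0, 0, 1, 1, 1], 24), 3))
```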

Second, if they invented effective halo removal for it (pretty much the main visual problem of Lanczos resampling), they absolutely deserve to claim it as their algorithm.
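
AMD hasn't published its deringing as exactly the following, but a common anti-ringing trick, roughly in the spirit of the clamp FSR's EASU stage applies against its nearest 2x2 quad, is to limit the filtered result to the range of the nearest source samples: halos are just overshoot outside that range. A sketch building on the 1-D resampler above (my construction, not AMD's code):

```python
def lanczos_resample_1d_antiring(samples, new_len, a=3):
    """Lanczos resample, then clamp each output to the [min, max] of its two
    nearest source samples. This removes the bright/dark halos (overshoot)
    around edges, at the cost of slightly flattening true extremes."""
    samples = np.asarray(samples, dtype=float)
    out = lanczos_resample_1d(samples, new_len, a)
    old_len = len(samples)
    scale = old_len / new_len
    for i in range(new_len):
        x = (i + 0.5) * scale - 0.5
        j0 = int(min(max(np.floor(x), 0), old_len - 1))
        j1 = min(j0 + 1, old_len - 1)
        lo, hi = sorted((samples[j0], samples[j1]))
        out[i] = min(max(out[i], lo), hi)
    return out

# Same step edge as before: the overshoot beyond [0, 1] is now gone.
print(np.round(lanczos_resample_1d_antiring([0, 0, 0, 1, 1, 1], 24), 3))
```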

"while Nvidia put its engineering efforts to work creating DLSS, an AI-driven upscaling and enhancement algorithm" – Nvidia put much more effort in its marketing than development. First version had exactly zero innovation (using NN for image upscaling is not a new idea at all, probably at least a decade-old algorithm).
 
Reactions: thisisaname (Like)

spongiemaster

Honorable
Dec 12, 2019
Two things.
  1. It is required because ray tracing got introduced.
  2. It makes sense because a few years back games were mostly played at 1080p 60Hz. Now 1440p high refresh rate and 4K are slowly becoming the norm. Upscaling 720p to 1080p might have looked a little rough back then, but upscaling 1080p or 1440p to 4K doesn't look bad.
You're kind of missing what Alvar was saying. There's this constant droning that FSR is available to everyone, so everyone gets the benefits. But Alvar is pointing out that two-thirds of Steam users are still at 1080p. Taken a step further, about 85% of Steam users have a primary resolution of 1080p or lower. All of the most popular cards on Steam are perfectly capable of gaming at 1080p or lower, so there would be no reason to use any upscaling technology. If you're trying to game on an iGPU, then it would help at any resolution, but you're also not really a serious gamer if you're doing that.

1. The main reason Nvidia developed DLSS was to boost framerates for ray tracing. FSR is not going to make a card without hardware ray tracing acceleration capable of using ray tracing. So you're right back to requiring a current-generation or 20-series RTX card, and you've eliminated everyone else.

2. The two main 1440p resolutions (2560x1440 and 3440x1440) combine for 10% market share on Steam, and high-refresh-rate versions are likely a small minority of that. 1440p is nowhere near becoming the norm, and 4K is at 2.26%. As mentioned above, only 15% of gamers on Steam are above 1080p. If you've shelled out for a nice high-end monitor, you shouldn't be driving it with a GTX 1060. The main beneficiaries of DLSS and FSR are going to be 4K users and ray tracing gamers, and for either of those groups you're still going to need an RTX/RDNA 2 GPU.
 

vinay2070

Distinguished
Nov 27, 2011
You're kind of missing what Alvar was saying. There's this constant droning that FSR is available to everyone, so everyone gets the benefits. But Alvar is pointing out that two-thirds of Steam users are still at 1080p. Taken a step further, about 85% of Steam users have a primary resolution of 1080p or lower. All of the most popular cards on Steam are perfectly capable of gaming at 1080p or lower, so there would be no reason to use any upscaling technology. If you're trying to game on an iGPU, then it would help at any resolution, but you're also not really a serious gamer if you're doing that.

1. The main reason Nvidia developed DLSS was to boost framerates for ray tracing. FSR is not going to make a card without hardware ray tracing acceleration capable of using ray tracing. So you're right back to requiring a current-generation or 20-series RTX card, and you've eliminated everyone else.

2. The two main 1440p resolutions (2560x1440 and 3440x1440) combine for 10% market share on Steam, and high-refresh-rate versions are likely a small minority of that. 1440p is nowhere near becoming the norm, and 4K is at 2.26%. As mentioned above, only 15% of gamers on Steam are above 1080p. If you've shelled out for a nice high-end monitor, you shouldn't be driving it with a GTX 1060. The main beneficiaries of DLSS and FSR are going to be 4K users and ray tracing gamers, and for either of those groups you're still going to need an RTX/RDNA 2 GPU.

1) So you are saying that if you have an AMD 6000-series GPU and enable RT in games and use FSR, you can't see a framerate boost?

2) The main complaint when RT was released was that games were not playable even at 1080p or 1440p with RT enabled on high-end cards, forget 4K (however small that market is). However, just because the Steam survey shows most are on 1080p does not mean manufacturers should aim mostly at that segment. Going by the leaks, upcoming GPUs will probably be perfectly capable of 4K 100 FPS even in the most demanding titles, until you turn on RT. And that's the reason DLSS/FSR will be needed.

Also, the Steam survey does not mean much if people have just installed Steam without playing any games, or installed it to play free indie games on their iGPUs and laptops. The percentages don't matter in that case; what matters is how many mid-to-high-end GPUs Nvidia and AMD are able to sell. To boost those mid-to-high-end sales, they need these optimizations.
 
1) So you are saying that if you have an AMD 6000-series GPU and enable RT in games and use FSR, you can't see a framerate boost?

2) The main complaint when RT was released was that games were not playable even at 1080p or 1440p with RT enabled on high-end cards, forget 4K (however small that market is). However, just because the Steam survey shows most are on 1080p does not mean manufacturers should aim mostly at that segment. Going by the leaks, upcoming GPUs will probably be perfectly capable of 4K 100 FPS even in the most demanding titles, until you turn on RT. And that's the reason DLSS/FSR will be needed.

Also, the Steam survey does not mean much if people have just installed Steam without playing any games, or installed it to play free indie games on their iGPUs and laptops. The percentages don't matter in that case; what matters is how many mid-to-high-end GPUs Nvidia and AMD are able to sell. To boost those mid-to-high-end sales, they need these optimizations.
I strongly doubt most people buying an RTX 2070 or above, or an RX 6700 XT or above, are still stuck on a 1080p monitor. Oh, sure, some will be, but if you're buying a $500+ graphics card, there's a good chance you have at least a 1440p display.

Of course, the total market of people with RTX or RX 6000 GPUs, according to the Steam HW Survey, is about the same size as the percentage of people with 1440p or higher resolution displays. By my count, about 18% of all Steam HW respondents in July have a ray tracing capable GPU (12.66% RTX 20-series, 4.92% RTX 30-series, and 0.35% RX 6000-series). 13.24% of respondents have >1920x1080 resolution (not counting "other" which is 1.95%).
 
Reactions: vinay2070 (Like)

Alex/AT

Reputable
Aug 11, 2019
Imo, if you are going to get, say, a 4K display to play on, but can't play normally at 4K without some sort of upscaling, it's all rather simple: you just don't need 4K.
 

spongiemaster

Honorable
Dec 12, 2019
1) So you are saying that if you have an AMD 6000-series GPU and enable RT in games and use FSR, you can't see a framerate boost?

2) The main complaint when RT was released was that games were not playable even at 1080p or 1440p with RT enabled on high-end cards, forget 4K (however small that market is). However, just because the Steam survey shows most are on 1080p does not mean manufacturers should aim mostly at that segment. Going by the leaks, upcoming GPUs will probably be perfectly capable of 4K 100 FPS even in the most demanding titles, until you turn on RT. And that's the reason DLSS/FSR will be needed.
No, I'm saying that if you have an RX 580 or a Pascal card, turning on FSR is not going to allow you to use ray tracing. You're right on #2, which goes back to my point: people are knocking DLSS because it requires an RTX-enabled card, while FSR (technically) works on everything. However, for DLSS or FSR to boost ray-traced games to playable framerates, you still need a card with hardware DXR acceleration, so you're stuck needing an RTX card for DLSS or an RX 6000 for FSR anyway. FSR's ability to work on any card is largely fool's gold when the main beneficiaries of the technology will still need a mid-to-high-end current-gen card.
 

freedomnow

Reputable
Jan 22, 2019
Here's the thing though, at least to me. Going by the Steam Hardware Survey, 67% of users run at 1920x1080, and GPUs have been capable of 1080p60 gaming for several years now; even the most popular card on the survey, the 5-year-old GTX 1060, can do it with little to no detail compromise in most games.

By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or the built-in subsampling feature some games have, you're making a detail compromise for the sake of FPS. When Nvidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including Tom's Hardware, at how underhanded it was. But now, in 2021, cutting quality in the name of FPS is a desired feature?

So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and iGPUs it's certainly a benefit given their weaker hardware, much akin to how this generation of consoles uses dynamic resolution to maintain FPS, but for discrete-card PC users...

I agree FSR and VRS serve a purpose in that you selectively lose detail to improve performance. This is not new to graphics; LODs have been used from the beginning. I would like to differentiate DLSS from FSR, as it actually creates better-than-native images in my opinion. I have seen cases, specifically in the background, where details that do not exist at native resolution magically appear in DLSS high quality mode. Every gamer is different: some like pretty pictures, some like performance. I'm on the side that likes the pretty pictures, and what's cool about DLSS is that it is better than native in many instances. My hope is that as the networks improve over time it will continue to get better.

One thing that I feel is lacking in PC graphics is the "smoothness" factor. Why do consoles feel smoother even though they run at a fraction of the framerate? I know PC games cannot target specific hardware and be tuned for it the way console games can, but I really wish the framerate were constant. Adaptive sync was supposed to solve this, but mine still fluctuates depending on what's going on.
 

jonathan1683

Distinguished
Jul 15, 2009
I strongly doubt most people buying an RTX 2070 or above, or an RX 6700 XT or above, are still stuck on a 1080p monitor. Oh, sure, some will be, but if you're buying a $500+ graphics card, there's a good chance you have at least a 1440p display.

Of course, the total market of people with RTX or RX 6000 GPUs, according to the Steam HW Survey, is about the same size as the percentage of people with 1440p or higher resolution displays. By my count, about 18% of all Steam HW respondents in July have a ray tracing capable GPU (12.66% RTX 20-series, 4.92% RTX 30-series, and 0.35% RX 6000-series). 13.24% of respondents have >1920x1080 resolution (not counting "other" which is 1.95%).


I have a 3070 and just purchased a 1080p monitor :) Games like Cyberpunk are very intensive; even at 1080p it still struggles.