News AMD FidelityFX Super Resolution Image Quality Investigated

waltc3

Reputable
Aug 4, 2019
423
226
5,060
Why would you "think" it wouldn't look good?...;) Anyway, the really big deal about it is that it is far simpler and easier for developers to use in their games than DLSS2, and as the article mentions, it isn't proprietary. Since it covers a broad spectrum of GPUs and the quality is excellent, there doesn't seem much in the way of a decision for developers--sort of a no-brainer, imo, to use FSR.
 
Honestly I'm just wondering something: How come none of this is driver based yet?

That would be the killer feature.
You mean like Nvidia's sharpening feature? Yeah, that already exists. So does Radeon Image Sharpening, which I think can be used via drivers. But this is supposed to do a bit more than simply upscale and sharpen (maybe).
Why would you "think" it wouldn't look good?...;) Anyway, the really big deal about it is that it is far simpler and easier for developers to use in their games than DLSS2, and as the article mentions, it isn't proprietary. Since it covers a broad spectrum of GPUs and the quality is excellent, there doesn't seem much in the way of a decision for developers--sort of a no-brainer, imo, to use FSR.
Try not to spread bogus information. DLSS 2.0 integration shouldn't be any easier or more difficult than FSR integration. It's just linking in a third-party library, plus some UI work to enable selecting the various modes. Both should be a couple of days of effort for any competent developer.

But yes, DLSS is proprietary and FSR is 'open.' DLSS also tends to look better if you compare DLSS Quality to FSR Quality (Ultra Quality might be 'equivalent'), or DLSS Balanced to FSR Balanced, or Performance to Performance modes. The loss in image fidelity is very noticeable at the Balanced and Performance settings for FSR, less so for DLSS. Until we have a game that implements both, however, we can't say for certain how much faster FSR runs, or how much better DLSS looks. And unfortunately, given the nature of the business, I'd be pretty surprised to see many games implement both FSR and DLSS.
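For anyone trying to picture what those modes mean in practice, here's a minimal sketch of the render resolutions FSR starts from, based on the per-axis scale factors AMD published for FSR 1.0 (the rounding here is approximate; real implementations may differ by a pixel or two):

```python
# Per-axis scale factors AMD published for FSR 1.0.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(out_w, out_h, mode):
    """Internal resolution FSR upscales from, for a given output resolution."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

for mode in FSR_MODES:
    print(mode, fsr_render_resolution(3840, 2160, mode))
# e.g. Quality at 4K upscales from 2560x1440, Performance from 1920x1080.
```

DLSS Quality renders at roughly the same 1.5x-per-axis factor, for what it's worth, which is part of why I treat Quality-vs-Quality as the closest to an apples-to-apples comparison.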
 
  • Like
Reactions: Memnarchon
You mean like Nvidia's sharpening feature? Yeah, that already exists. So does Radeon Image Sharpening, which I think can be used via drivers. But this is supposed to do a bit more than simply upscale and sharpen (maybe).

Try not to spread bogus information. DLSS 2.0 integration shouldn't be any easier or more difficult than FSR integration. It's just linking in a third-party library, plus some UI work to enable selecting the various modes. Both should be a couple of days of effort for any competent developer.

But yes, DLSS is proprietary and FSR is 'open.' DLSS also tends to look better if you compare DLSS Quality to FSR Quality (Ultra Quality might be 'equivalent'), or DLSS Balanced to FSR Balanced, or Performance to Performance modes. The loss in image fidelity is very noticeable at the Balanced and Performance settings for FSR, less so for DLSS. Until we have a game that implements both, however, we can't say for certain how much faster FSR runs, or how much better DLSS looks. And unfortunately, given the nature of the business, I'd be pretty surprised to see many games implement both FSR and DLSS.

Yeah, I highly doubt AMD or Nvidia is going to allow direct comparisons, especially if they know it'll make their technology look bad. Overall I think FSR is a good start for AMD: they can't leverage dedicated hardware the way Nvidia does, yet they produce results that are broadly similar, even if they fall behind DLSS 2.0 overall. The one thing that remains to be seen is whether AMD takes ages to add supported games like Nvidia did, or whether they actually get a decent number of games on board quickly, considering their current list of supported games is quite lackluster imo.
 

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
This is impressive for FSR 1.0, and that's a big win.

It's a much better effort from AMD with FSR 1.0 than Nvidia's was with DLSS 1.0.

Also, let's be real: the majority will use Ultra Quality, because it gives enough extra fps with very little or no image quality loss, depending on the game / scene / motion, etc.

Working on every modern GPU/APU (AMD on PC plus the XSX and PS5, and Intel's DG2 if they adopt it) will make FSR much more popular and useful for an exponentially bigger number of people than DLSS is or will be.

I'm confident that, unless Nvidia aggressively throws money at the problem from now on, within a year more games will support FSR than DLSS.
 
Why would you "think" it wouldn't look good?...;) Anyway, the really big deal about it is that it is far simpler and easier for developers to use in their games than DLSS2, and as the article mentions, it isn't proprietary. Since it covers a broad spectrum of GPUs and the quality is excellent, there doesn't seem much in the way of a decision for developers--sort of a no-brainer, imo, to use FSR.
3 main reasons:
1.- I'm not a fanboi.
2.- Upscaling images is really hard to do with just the image information at hand. nVidia knows this; that is why they made DLSS so complex and why it requires neural-network training, so they can ship that information through the drivers and compatible games can be upscaled with that extra bit of info. So AMD's algorithm is mighty impressive. You can try this yourself by upscaling images, since it's more or less an equivalent test, and see how you fare (a rough sketch of that kind of test is below). That is why my expectations were low for a gen 1 solution with no pre-cooked information.
3.- I'm not a fanboi.
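If anyone wants to try that kind of test themselves, here's a rough sketch using Pillow. It only approximates the problem (it downscales a finished screenshot instead of rendering at a lower resolution, and Lanczos plus unsharp masking is cruder than FSR's EASU/RCAS passes), and the file names are just placeholders:

```python
from PIL import Image, ImageFilter

# Placeholder file name: use any native-resolution screenshot you have.
native = Image.open("screenshot_4k.png")
w, h = native.size

# Crudely simulate a "Performance"-style 2x-per-axis internal render.
low = native.resize((w // 2, h // 2), Image.LANCZOS)

# Naive spatial-only reconstruction: plain upscale plus unsharp masking.
upscaled = low.resize((w, h), Image.LANCZOS)
sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=120))
sharpened.save("naive_upscale.png")

# Compare naive_upscale.png against the original: whatever detail is missing
# is detail a purely spatial method has to reconstruct from nothing.
```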

Regards,

EDIT: Typos.
 
Last edited:
  • Like
Reactions: TCA_ChinChin

aalkjsdflkj

Honorable
Jun 30, 2018
45
33
10,560
Maybe my eyesight is getting bad in my old age, but even the Performance settings look like they'd be perfectly adequate when there's movement and action happening. It's easy to spot differences between that level and Native/Ultra-Quality in screenshots, but I'm not really sure how much I'd care during gameplay. The big thing for me is when these technologies allow me to play games that I might not otherwise be able to play on my current hardware (1070) at reasonable framerates. If the difference is struggling to play at 40-50 FPS natively vs. actually playing with slightly fuzzy or soft graphics at 60-70 FPS, I'll definitely take 60-70 FPS. Since DLSS isn't available on my hardware I'm incredibly happy that FSR works on the hardware I actually own. Now it just needs to be supported by games I want to play...
 
Maybe my eyesight is getting bad in my old age, but even the Performance settings look like they'd be perfectly adequate when there's movement and action happening. It's easy to spot differences between that level and Native/Ultra-Quality in screenshots, but I'm not really sure how much I'd care during gameplay. The big thing for me is when these technologies allow me to play games that I might not otherwise be able to play on my current hardware (1070) at reasonable framerates. If the difference is struggling to play at 40-50 FPS natively vs. actually playing with slightly fuzzy or soft graphics at 60-70 FPS, I'll definitely take 60-70 FPS. Since DLSS isn't available on my hardware I'm incredibly happy that FSR works on the hardware I actually own. Now it just needs to be supported by games I want to play...
Here you go:

https://explore.amd.com/en/technologies/radeon-software-fidelityfx-super-resolution/survey

AMD seems to want to prioritize which devs to approach, so why not give them a hand?

Regards,
 
You mean like Nvidia's sharpening feature? Yeah, that already exists. So does Radeon Image Sharpening, which I think can be used via drivers. But this is supposed to do a bit more than simply upscale and sharpen (maybe).
I meant more like: when will either DLSS or FFXSR be integrated into the drivers? And from what I can gather, both DLSS and FFXSR work on the rendered frame, with the only change on the input side being the rendering resolution. It just feels kind of annoying when these features are touted as game-changing, yet the developer has to actually support them.

It's like when NVIDIA saw potential in DX11's deferred contexts and eventually decided to just add it as a driver-wide thing that happens automagically.
 
I meant more like: when will either DLSS or FFXSR be integrated into the drivers? And from what I can gather, both DLSS and FFXSR work on the rendered frame, with the only change on the input side being the rendering resolution. It just feels kind of annoying when these features are touted as game-changing, yet the developer has to actually support them.

It's like when NVIDIA saw potential in DX11's deferred contexts and eventually decided to just add it as a driver-wide thing that happens automagically.
But that's the same as saying MSAA is (was?) not a game changer because Devs have to implement it (or their engines support it).

There are a few qualifiers, but FSR is more in line with what TXAA or MSAA do than with what DLSS does, for sure. That's definitely a good thing and has to be taken in a positive light. I believe that's what AMD was going for. Or at least, I'd like to think they did.

Also, since it happens outside the driver, theoretically you can add it much like you could alter shaders and system calls for games via modding in case the developer doesn't officially support it, no?

Regards,
 
But that's the same as saying MSAA is (was?) not a game changer because Devs have to implement it (or their engines support it).
I'm not saying that DLSS or FFXSR are not game-changing because they require developers to actively support them. I'm saying it's annoying that they're touted as game-changing, yet AMD and NVIDIA don't seem to be doing anything to remove the need for developers to actively support them.

EDIT: I should point out I only feel this way because DLSS and FFXSR don't affect the core rendering pipeline. So I don't feel the same way about, say, ray tracing, because that is a change in the core rendering pipeline.

Also, since it happens outside the driver, theoretically you can add it much like you could alter shaders and system calls for games via modding in case the developer doesn't officially support it, no?
Whether or not the feature lives in the driver doesn't change whether something like ReShade can do it. However, I would rather have it built into the drivers than have yet another thing to install.
 
I'm not saying that DLSS or FFXSR are not game-changing because they require developers to actively support them. I'm saying it's annoying that they're touted as game-changing, yet AMD and NVIDIA don't seem to be doing anything to remove the need for developers to actively support them.

EDIT: I should point out I only feel this way because DLSS and FFXSR don't affect the core rendering pipeline. So I don't feel the same way about, say, ray tracing, because that is a change in the core rendering pipeline.
That's a strange way to look at it, as both DLSS and FSR do affect the rendering pipeline: they need to be inserted into it. If you run at native resolution, you don't touch either, unless I'm missing something here. Sure, it is different from implementing Ray Tracing or Tessellation or any other visual effect like lighting or shadows, but they're not comparable IMO. I'm not saying you can't feel conflicted, but I do believe you're mixing two things that shouldn't be mixed.

Or to put it differently: FSR (at least, as described) is low overhead for developers to implement, in exchange for big gains in performance. While not technically an improvement in "graphical eye candy", it does bring something really important to the table, which is simply "performance". To repeat what I said above to another poster: I'm actually impressed with whatever algorithm AMD went with here, because it's actually quite a good upsampling technique. You can go check Hardware Unboxed's and Gamers Nexus' in-depth analyses (sorry Toms!) for more points of view, but I have to say the images do look good for a simple "upscaling" technique that delivers tangible gains in FPS. That's the angle you should look at it from, IMO.

Regards,
 

alithegreat

Distinguished
Aug 15, 2013
72
1
18,640
I am really surprised that everyone seems to forget what these technologies do in the end: compromise image quality to get performance...

If we cannot discern the image quality while moving, in action, etc., then why is it there in the first place? Were all those shiny graphics drawn in vain?
These technologies should be a search for better image quality with less performance impact, not a sacrifice of image quality for playable performance.
If we cheer too much for this, I fear we may push AMD and Nvidia in the wrong direction. Imagine that with some DLSS 3 or FSR Pro you always get 200 fps... by reducing quality, removing some stuff from the scene? Will we be happy?
These shouldn't be the leading factors for GPUs.
 
So if you're playing a game that comes in a bit short of 60 fps, DLSS and FSR can both get you into the fully smooth 60+ fps range. But if you're playing a game at 120 fps and you have a 240 Hz display, our experience is that DLSS won't generally scale that high—it becomes the limiting factor. On the other hand, FSR has no qualms about scaling to higher fps, and if you don't mind the loss of image quality, running in Performance mode often more than doubles performance. (So does running at 1080p instead of 4K.)
That's an interesting point. While the current iteration of DLSS may have an edge at more moderate frame rates, it's possible that FSR could provide an edge for high frame rate gaming, due to the lower overhead. The best way to compare these two techniques would be to eventually test games that support both, and try to roughly match frame rates at a given native resolution using various levels of DLSS and FSR, then compare image quality at whatever levels those end up being for a given game. Performing some testing on more mid-range, GPU limited graphics hardware might make sense too, especially for lower resolutions. Something like an RTX 2060, for example.
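As a back-of-the-envelope illustration of why a fixed per-frame upscaling cost matters more at high frame rates, here's a quick sketch; the millisecond costs are made-up placeholders, not measured numbers for either technique:

```python
def fps_with_upscaler(base_fps, upscale_cost_ms):
    """Effective fps after adding a fixed per-frame upscaling pass."""
    frame_time_ms = 1000.0 / base_fps           # time to render at the lower resolution
    return 1000.0 / (frame_time_ms + upscale_cost_ms)

# Hypothetical costs purely for illustration: a cheap spatial pass vs. a heavier one.
for base_fps in (72, 240):
    print(base_fps,
          round(fps_with_upscaler(base_fps, 0.5), 1),
          round(fps_with_upscaler(base_fps, 2.0), 1))
# At 72 fps the 2 ms pass only costs ~9 fps; at 240 fps it costs ~78 fps.
```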

And unfortunately, given the nature of the business, I'd be pretty surprised to see many games implement both FSR and DLSS.
I'm not so sure. All it takes is a major game engine integrating FSR support to make it even easier for developers to include in their games. There's no real reason why a game couldn't support multiple solutions, particularly since each has its own strengths and weaknesses depending on the hardware and settings being used. Aside from Nvidia probably encouraging them not to, to help push their newer cards, of course. DLSS isn't going to do much good for the majority of gamers who currently don't have compatible hardware to run it, though. Going by the Steam Hardware Survey, the number of people with RTX cards currently only amounts to around 17% of their userbase, so that remaining 83% is a pretty large market that could benefit from a better hardware-agnostic upscaling solution, especially since a lot of that will be lower-end hardware struggling to run newer games well at native resolution. Games often include the option for some form of upscaling, but it's often not particularly good, so having a decent standard to work off of could be helpful.

I believe the new Unreal Engine 5 actually includes its own upscaling feature that makes use of temporal data though, so they might just stick with that as their upscaling technique if it happens to work better than FSR. UE5 also has official support for DLSS as well though, so it might also be worth comparing those against one another.

Honestly I'm just wondering something: How come none of this is driver based yet?

That would be the killer feature.
FSR and DLSS should be performing their upscaling before things like interface elements are drawn to the screen, allowing things like text to be rendered clearer and without artifacts at native resolution, as the developers have control over what it gets applied to. A universal, driver-based method would be applying the upscaling and sharpening after the final image is rendered, meaning those elements would be getting upscaled and sharpened too, which is less than ideal. And that's really something that's already largely covered by using the existing Radeon Image Sharpening or Nvidia's similar sharpening option combined with other forms of upscaling. It's possible that AMD will update Radeon Image Sharpening to behave more in line with the developer-integrated options though, even if it wouldn't work quite as well. Though unlike the built-in FSR feature, that also wouldn't be of much use to those without AMD hardware.
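To illustrate the ordering being described, here's a toy sketch of a frame; the function names are placeholders, not any real engine or FSR API:

```python
# Toy frame loop: placeholder names, purely to show where the upscale happens.
def render_scene(render_res):
    print(f"render 3D scene at {render_res}")               # expensive pass at lower res
    return "scene_color"

def fsr_upscale(image, display_res):
    print(f"upscale + sharpen {image} to {display_res}")    # EASU + RCAS style pass
    return "upscaled_color"

def draw_hud(image, display_res):
    print(f"draw HUD/text over {image} at {display_res}")   # UI stays crisp at native res
    return "final_frame"

frame = draw_hud(fsr_upscale(render_scene((2560, 1440)), (3840, 2160)), (3840, 2160))

# A driver-level filter would only ever see the finished frame, HUD included,
# so the UI would get upscaled and sharpened along with everything else.
```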

The one thing that remains to be seen is whether AMD takes ages to add supported games like Nvidia did, or whether they actually get a decent number of games on board quickly, considering their current list of supported games is quite lackluster imo.
In Nvidia's case, the original implementation of DLSS required AI training to be performed on a per-game basis, which likely required more time and effort to implement. The original DLSS was arguably worse than some existing forms of upscaling as well, and relatively few people had hardware capable of utilizing it, so there wasn't much incentive for developers to get on-board, aside from in Nvidia-sponsored titles. DLSS 2.0 has largely addressed most of those issues, though the number of systems capable of utilizing it is still in the minority. FSR has an advantage in that all graphics cards should be able to use it. Of course, if a game engine includes its own capable upscaling technique, like with the aforementioned UE5, games using it won't necessarily need to utilize FSR specifically to provide something similar or potentially better.

I am really surprised that everyone seems to forget what these technologies do in the end: compromise image quality to get performance...

If we cannot discern the image quality while moving, in action, etc., then why is it there in the first place? Were all those shiny graphics drawn in vain?
These technologies should be a search for better image quality with less performance impact, not a sacrifice of image quality for playable performance.
If we cheer too much for this, I fear we may push AMD and Nvidia in the wrong direction. Imagine that with some DLSS 3 or FSR Pro you always get 200 fps... by reducing quality, removing some stuff from the scene? Will we be happy?
These shouldn't be the leading factors for GPUs.
The thing is, most people barely notice much difference between games rendered at 4K compared to 1440p at common screen sizes, at least when not carefully analyzing still-frames. But 4K requires significantly more hardware resources for what ultimately only amounts to a slightly sharper image. So if you can render a game at around 1440p, but upscale it to 4K, while applying algorithms to make it look near-indistinguishable from a scene natively rendered at 4K, that opens up more hardware resources for making the game look better in other ways. For example, with raytraced lighting effects, or more detailed environments, things that are likely to provide more noticeable improvements to visuals than just a slight increase in sharpness. So it's not so much "compromising image quality", but rather shifting image quality from areas that matter less to those that matter more, allowing games to look better on a given level of hardware, not worse.
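Just to put rough numbers on that trade-off (raw pixel counts only, ignoring everything else that scales with resolution):

```python
# Raw pixel counts as a rough proxy for per-frame shading work.
pixels_4k = 3840 * 2160      # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400

print(pixels_4k / pixels_1440p)  # 2.25
# Native 4K shades 2.25x as many pixels as 1440p; rendering internally at
# 1440p and upscaling frees that budget for ray tracing, detail, etc.
```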
 

salgado18

Distinguished
Feb 12, 2007
931
375
19,370
I think the images are not in ascending order of quality, at least the first ones. I opened them in order, but had to reorder the tabs to get them right. Could you please verify that?
 

deesider

Honorable
Jun 15, 2017
298
135
10,890
I think the images are not in ascending order of quality, at least the first ones. I opened them in order, but had to reorder the tabs to get them right. Could you please verify that?
The order is deliberately mixed up to see if you can spot the differences and identify yourself which quality setting was used for each. The key to the image galleries is at the end of the article in the 'Initial Thoughts' section.
 

deesider

Honorable
Jun 15, 2017
298
135
10,890
The thing is, most people barely notice much difference between games rendered at 4K compared to 1440p at common screen sizes, at least when not carefully analyzing still-frames. But 4K requires significantly more hardware resources for what ultimately only amounts to a slightly sharper image. So if you can render a game at around 1440p, but upscale it to 4K, while applying algorithms to make it look near-indistinguishable from a scene natively rendered at 4K, that opens up more hardware resources for making the game look better in other ways. For example, with raytraced lighting effects, or more detailed environments, things that are likely to provide more noticeable improvements to visuals than just a slight increase in sharpness. So it's not so much "compromising image quality", but rather shifting image quality from areas that matter less to those that matter more, allowing games to look better on a given level of hardware, not worse.

The Epic Developer webcast the other week had some interesting info about UE5, where the Devs explained that for their new lighting solution (for global illumination), software raytracing is generally much faster than hardware raytracing. Maybe for reflections or other specific uses, hardware RT is better, but it seems the hardware tech is a long way off from being a real game changer.
 

escksu

Reputable
BANNED
Aug 8, 2019
878
354
5,260
I am really surprised that everyone seems to forget what these technologies do in the end: compromise image quality to get performance...

If we cannot discern the image quality while moving, in action, etc., then why is it there in the first place? Were all those shiny graphics drawn in vain?
These technologies should be a search for better image quality with less performance impact, not a sacrifice of image quality for playable performance.
If we cheer too much for this, I fear we may push AMD and Nvidia in the wrong direction. Imagine that with some DLSS 3 or FSR Pro you always get 200 fps... by reducing quality, removing some stuff from the scene? Will we be happy?
These shouldn't be the leading factors for GPUs.

Of course we will be happy... You need to remember that not everyone has the money to buy even cards like 3060ti/3070, let alone 3080/3090.

Reducing quality? How many people today have to drop their image quality settings to medium or low because their GPU is not fast enough?
 

deesider

Honorable
Jun 15, 2017
298
135
10,890
Of course we will be happy... You need to remember that not everyone has the money to buy even cards like 3060ti/3070, let alone 3080/3090.

Reducing quality? How many people today have to drop their image quality settings to medium or low because their GPU is not fast enough?
I totally agree. The OP has a weird way of looking at it.

Is this tech really going to be used to get 500 fps in Fortnite, or is it so people with an older PC can play something like RDR2 or Cyberpunk at a decent framerate without dropping to potato quality settings...
 

Skrybe

Prominent
Jun 22, 2021
35
4
545
I'd love to see a couple more comparisons if there is an in-depth review. Taking 4K as an example, the Performance option upscales from 1920x1080. I'd like to see high-quality photos (not screenshots) of games comparing 1920x1080 on a 4K monitor to native 4K and to FSR Performance at 4K. And of course frame rates for comparison.

That way we could see what the performance and quality differences are between just letting the monitor's scaler do the work and letting FSR scale it. I'm wondering whether running at 1920x1080 and maxing out other quality settings would be better than FSR scaling to 4K but running lower quality settings.

I have an RX6900XT that I could test myself but I don't have any games that support FSR yet.
 

BeedooX

Reputable
Apr 27, 2020
70
51
4,620
Honestly I'm just wondering something: How come none of this is driver based yet?

That would be the killer feature.
I think with AMD's implementation it's not just driver based, because the game renders and upscales the 'content', then the HUD interface is rendered on top at native resolution. A multi-step process.
 

Joseph_138

Distinguished
Honestly I'm just wondering something: How come none of this is driver based yet?

That would be the killer feature.

Because if it were driver based, nVidia would have to support it in their driver updates too, and that's not likely to happen. Having it game-based means everyone can enjoy it regardless of their video card. Even Intel Iris can use it, theoretically.