News: AMD Debuts FidelityFX Super Resolution to Take on DLSS at Computex

This is the best news this year, so far! GJ AMD!

As a GTX 1080 user (still stuck there, forced by stock and prices), I'm very excited about this. I also can't wait to get a new AMD GPU when prices come back to more sane levels.

Nvidia has disappointed me enough times. With this tech working on GTX cards I can hold on till I can buy new. This is so exciting :)
 
Reactions: salgado18
According to AnandTech, their method works on just the current frame, so it's about the same as DLSS 1.0. When they integrate it with temporal anti-aliasing, then it might have a shot at DLSS 2.0.
That's the only thing I still have to see in (high-res) benchmarks / tests: how good or bad those presets look. But AMD said there is not a big difference on Ultra, and frankly, even Ultra gives enough performance for me.

Also, all the leaks so far said it's like a DLSS 1.5, not 1.0. So yeah, I'll wait for those confirmations, but even if they are behind nvidia at this point, the rate of adoption and advancement this tech will have, since it's open source, will surpass DLSS 2.1 fast. I think nvidia will soon have to just embrace this tech and forget about DLSS (just my opinion).

edit: AMD uploaded the 4K video of FSR: https://www.youtube.com/watch?v=eHPmkJzwOFc


Looks like it's indeed good enough and better than DLSS 1.0.
 
....even if they are behind nvidia at this point, the rate of adoption and advancement this tech will have, since it's open source, will surpass DLSS 2.1 fast. I think nvidia will soon have to just embrace this tech and forget about DLSS (just my opinion).

AMD's solution is FAR inferior:
  • AMD uses only the current frame
  • AMD uses no advanced information like motion vectors or depth masks
  • AMD occupies the normal rasterization hardware (so you have LESS raw rasterization power when it is turned on)

With this knowledge it should be clear that AMD's solution will look much worse than current DLSS implementations. And because it occupies the normal rendering hardware instead of dedicated hardware, it will most probably run worse than DLSS at the same internal rendering resolution.

And don't get me wrong: I would love for this to be a good alternative. But AMD has always been really bad with software - and knowing how their implementation seems to work ... I have little hope that it is anything more than a simple image upscaler.
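
To make "simple image upscaler" concrete, here is a minimal numpy sketch of what a purely spatial, single-frame approach amounts to: resample the one frame you have, then sharpen it. This is only an illustration of the concept, not AMD's actual FSR algorithm (which had not been published at this point); note that every step below uses only the current frame, with no motion vectors, depth, or history involved.

```python
import numpy as np

def spatial_upscale(frame, scale=2, sharpen=0.5):
    """Toy single-frame (spatial) upscaler: bilinear resample + unsharp mask.
    Illustrative only -- NOT AMD's actual FSR algorithm.
    `frame` is a 2D array of luma values in [0, 1]."""
    h, w = frame.shape
    # Bilinear resample to the target resolution. The only input is this
    # one frame: no motion vectors, no depth mask, no previous frames.
    ys = np.linspace(0, h - 1, h * scale)
    xs = np.linspace(0, w - 1, w * scale)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    fy = (ys - y0)[:, None]
    fx = (xs - x0)[None, :]
    up = (frame[np.ix_(y0, x0)] * (1 - fy) * (1 - fx)
          + frame[np.ix_(y0, x1)] * (1 - fy) * fx
          + frame[np.ix_(y1, x0)] * fy * (1 - fx)
          + frame[np.ix_(y1, x1)] * fy * fx)
    # Unsharp mask to restore apparent edge contrast (the RIS/CAS-style idea).
    # np.roll wraps at the borders, which is fine for a sketch.
    blur = (np.roll(up, 1, 0) + np.roll(up, -1, 0)
            + np.roll(up, 1, 1) + np.roll(up, -1, 1)) / 4
    return np.clip(up + sharpen * (up - blur), 0.0, 1.0)
```

However good or bad that looks, the cost of such a pass is paid on the same units doing the rendering, which is the performance point being made above.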
 
AMD's solution is FAR inferior: .... I have little hope that it is anything more than a simple image upscaler.
Did you look at the 4K video? Only the lower resolutions look worse, and those look worse on DLSS too; no exception or difference there.

Also, this is just the beginning: the tech will progress much faster and its adoption will be much wider than DLSS. Third-party reviews will confirm that it's good enough for what it brings, what it does, and how widely it works, even now at its inception.
The future for gaming looks a little better now thanks to FSR. nvidia needs to figure out what to do next: invest even more money in their black-box DLSS, or adopt FSR like they did with FreeSync... history repeats.

Tom is right, it's a chess move by AMD and nvidia is in check:
https://www.youtube.com/watch?v=rMzjs85pUh4
 
AMD's solution is FAR inferior: .... I have little hope that it is anything more than a simple image upscaler.

I will wait and see what this is like in reality, thank you, and come to my own conclusions on my 5700 XT at the various different settings.

As for "how good this is compared to the nVidia solution", I will wait for reviews as I don't have a nVidia card to do direct comparisons on myself. Speculation at this point is largely moot as sometimes (not always) "inferior tech" can result in good quality and performance even though it is technically inferior in theory.

Remember: In theory there is no difference between theory and practice, in practice there is.
 
AMD's solution is FAR inferior: .... I have little hope that it is anything more than a simple image upscaler.
Um, some of what you say is true, but...
Outright claims that it is "inferior" (compared to DLSS 1.0 in particular) can't be made until we have the tech in-hand for people to do independent tests.
Even if it "occupies the normal rasterization hardware", it clearly boosts the framerate, so you can't really claim that the performance hit is important. What matters is the quality hit.

In the case of both FSR and DLSS, obviously rendering at native resolution would be ideal. These products exist to boost performance in cases where you can't achieve your performance goal at native resolution without "tricks". Both involve a tradeoff (higher performance at lower quality). We don't yet have a good understanding of what kind of quality hit FSR imposes. We will in a few weeks when this launches for real. It may well turn out that FSR is "good enough" for many use cases even though it uses a different approach than DLSS. And the fact that it runs on NVIDIA GPUs is pretty cool: NVIDIA users will be able to take advantage of this in games that support FSR but not DLSS.
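
As a back-of-the-envelope illustration of that tradeoff, with made-up numbers (real frame times vary per game, and GPU cost doesn't scale perfectly with pixel count):

```python
# Hypothetical numbers only: assume GPU cost scales roughly with pixel count.
native_4k_ms = 16.0                           # assumed 4K frame time (~60 fps)
pixel_ratio  = (2560 * 1440) / (3840 * 2160)  # 1440p has ~44% of 4K's pixels
upscale_ms   = 1.5                            # assumed cost of the upscale pass
with_fsr_ms  = native_4k_ms * pixel_ratio + upscale_ms
print(f"native 4K: {native_4k_ms:.1f} ms, 1440p + upscale: {with_fsr_ms:.1f} ms")
# -> ~8.6 ms vs 16.0 ms: even paying for the pass on the shader cores, the
#    render-low-then-upscale route wins on speed; the open question is quality.
```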
 
....I think nvidia will soon have to just embrace this tech and forget about DLSS (just my opinion).

Some tech outlets said something similar about DLSS before, when AMD came up with RIS. And when nvidia highlighted their Freestyle and added the option to use a sharpening filter directly from their own control panel, some even said nvidia was finally giving up on DLSS. Then nvidia came out with DLSS 2. nvidia is not the type to abandon their tech immediately, not even after years of bad reception (just look how long nvidia kept TXAA around even when the view of it was unfavorable from the very beginning). nvidia will most likely keep improving DLSS and working with major game engines to integrate their easy-to-use plugin.
 
....It may well turn out that FSR is "good enough" for many use cases even though it uses a different approach than DLSS. And the fact that it runs on NVIDIA GPUs is pretty cool: NVIDIA users will be able to take advantage of this in games that support FSR but not DLSS.

Great point... I'm not going to look the gift horse in the mouth without even seeing the gift. Let's hope AMD bats it out of the park with this one and everyone benefits.
 
Reactions: Soaptrail and VforV
Some tech outlets said something similar about DLSS before, when AMD came up with RIS. .... nvidia will most likely keep improving DLSS and working with major game engines to integrate their easy-to-use plugin.
Ok, I can agree on that. Maybe they won't abandon DLSS, and that's not a problem; it's maybe even better for us customers to have more options. But for me one thing is sure: either nvidia is forced to abandon DLSS (at some point in the future), or they make it even better and accelerate its progress even more than if they were alone with this tech. So the competition just got more fierce. Win-win for gamers.

P.S. What if, because of FSR, nvidia decides to make DLSS open too? That's a crazy, hilarious thought. I don't think it will happen at all, but funny nonetheless.
 
According to AnandTech, their method works on just the current frame, so it's about the same as DLSS 1.0. When they integrate it with temporal anti-aliasing, then it might have a shot at DLSS 2.0.

Temporal has a tendency to make things blurrier, and it causes blooming effects/post-glow when sudden light sources appear/disappear.

Temporal is actually a very old technique dating back to the '80s. Back then it was called 3D comb filtering, used on CRT TVs to reduce static.
 
P.S. What if, because of FSR, nvidia decides to make DLSS open too? That's a crazy, hilarious thought. I don't think it will happen at all, but funny nonetheless.
You don't understand how DLSS works if you think any company would open source it. It doesn't make sense from a financial perspective, since Nvidia uses its supercomputers to continually train and improve the baseline algorithms, which obviously isn't free; and it doesn't make sense from a practical perspective, because neither AMD nor Intel has separate tensor-core-equivalent hardware, so the performance wouldn't be comparable.
 
Temporal has a tendency to make things blurrier, and it causes blooming effects/post-glow when sudden light sources appear/disappear.
The temporal component, I would argue, is necessary because previous frames can be used to reconstruct details of the next frame: https://en.wikipedia.org/wiki/Super-resolution_imaging
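
Here is a toy numpy demo of that idea (my own illustration, assuming a perfectly static scene, which real TAA-style upscalers can't assume): if the sample grid is jittered by a sub-pixel offset each frame, four quarter-resolution frames together contain every sample of the full-resolution image, so detail genuinely comes back from history.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((8, 8))   # "ground truth" image the renderer is sampling
factor = 2                   # render at 1/2 resolution on each axis

# Four consecutive frames, each rendered at quarter resolution with a
# different sub-pixel jitter of the sample grid.
frames = {(dy, dx): scene[dy::factor, dx::factor]
          for dy in range(factor) for dx in range(factor)}

# Temporal reconstruction: slot each frame's samples back into place.
recon = np.zeros_like(scene)
for (dy, dx), lowres in frames.items():
    recon[dy::factor, dx::factor] = lowres

print(np.allclose(recon, scene))  # True: 4 jittered low-res frames = full detail
# A purely spatial upscaler only ever sees frames[(0, 0)]: 1/4 of the samples.
```

Real temporal upscalers blend that history exponentially and use motion vectors and depth to reject stale samples when things move, which is exactly where the blur and the post-glow on sudden light changes mentioned above come from: a brand-new bright pixel has no valid history, so it takes a few frames to converge.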

....What if, because of FSR, nvidia decides to make DLSS open too?
NVIDIA won't make their technologies open. If anything, because FSR is open, AMD is indirectly helping NVIDIA out and it gives them two checkboxes to put on their marketing materials. Plus NVIDIA may make FSR work even better on their GPUs if they can find a way to make it run on the tensor cores rather than the shader cores.
 
Folks, rather than accusing others of being shills (no one is being paid by Nvidia, AMD, Intel, etc., by the way), let us stick to the facts.

Use sources to substantiate claims. Attack ideas, not people.

Thank you.
 
Proprietary components, like this G-Sync monitor, are already biting me. If FSR actually delivers, I'll be looking to RDNA3 next.
I've given up on Navi and Ampere - unless my 1080Ti happens to keel over before the next gen of cards comes around...
lol...wasn't your 1080Ti 'coughing' a bit, recently? 😀

It seems it's just been bad news piled on bad news for those still running several-generations-old GPUs.
Unfortunately, I suspect this 'price shift' (due to crypto mining and fab shortages) will become permanent, if for no other reason than corporate greed.
 
You don't understand how DLSS works if you think any company would open source it. ....
I actually agree; that's why I said "I don't think it will happen at all" - without going into details.

NVIDIA won't make their technologies open. ....
I agree, read above.
Also, RDNA3 will be the first with hardware support for FSR, before Ampere's successor or whatever GPU nvidia will have...
 
Reactions: hotaru.hino
lol...wasn't your 1080Ti 'coughing' a bit, recently? 😀 .... I suspect this 'price shift' (due to crypto mining and fab shortages) will become permanent.
That thing's been fine ever since I changed the PSU. I even recently replaced the Celsius S36 with a Celsius+ S28. Still not a peep from it.
No more 120mm fans and coolers for me. Once the adaptive voltage testing is done, I'm putting the D15S back on the CPU.

No lie, the volume of bad news is depressing.
I can only hope the pricing for the next gen will be OK.

Greed is one of the most destructive vices we as a sentient species possess. It ultimately harms everyone if a leash isn't put on it.
All things in moderation. Regulation/control is necessary, whether we like it or not, because some of us don't know when to stop until the damage is already done, or past the point of no return, and there's no choice but to adjust from that.
Can't expect our governments to bail us out of all our TARFUs.
 
Reactions: alceryes
I agree, read above.
Also, RDNA3 will be the first with hardware support for FSR, before Ampere's successor or whatever GPU nvidia will have...
If anything, to answer another part of your original question: I don't think NVIDIA will abandon DLSS work. I mean, there's still the holy grail of making it a driver-wide option. Considering they did have a driver-wide temporal solution in MFAA, they could probably roll DLSS 2.0 out driver-wide. Though I heard MFAA has an issue working with DX12.
 
Reactions: VforV
We have a sample size of one right now in Godfall, and we really don't know much else about how FSR works, so let's wait and see. If I were a gambling man, I would wager that both solutions are pretty darn close, close enough that your average gamer isn't going to care. It's the classic story of an open-source solution that is arguably worse vs. a proprietary solution that is arguably better: in practice they're about the same, and the open-source version gets adopted more widely because it's free.
 
Reactions: VforV