News AMD Debuts FidelityFX Super Resolution to Take on DLSS at Computex

I would argue the temporal component is necessary because previous frames can be used to reconstruct details in the next frame: https://en.wikipedia.org/wiki/Super-resolution_imaging
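For anyone curious what "previous frames reconstructing detail" looks like mechanically, here's a minimal Python/NumPy sketch of temporal accumulation with motion-vector reprojection. All the names are mine and it deliberately skips the hard parts (disocclusion, history clamping), so treat it as an illustration of the general idea, not of how DLSS or FSR actually work.

```python
import numpy as np

def temporal_accumulate(current, history, motion_vectors, alpha=0.1):
    """Blend the current frame with a motion-reprojected history buffer.

    current        -- (H, W, 3) float array, this frame's colour
    history        -- (H, W, 3) float array, detail accumulated from past frames
    motion_vectors -- (H, W, 2) integer pixel offsets since the previous frame
    alpha          -- weight of the new frame (higher = forget history faster)
    """
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Reproject: look up where each pixel came from in the previous frame.
    prev_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    prev_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    reprojected = history[prev_y, prev_x]

    # Exponential blend: sub-pixel detail builds up over several frames.
    return alpha * current + (1.0 - alpha) * reprojected
```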


NVIDIA won't make their technologies open. If anything, because FSR is open, AMD is indirectly helping NVIDIA out and it gives them two checkboxes to put on their marketing materials. Plus NVIDIA may make FSR work even better on their GPUs if they can find a way to make it run on the tensor cores rather than the shader cores.

While I'll agree with you, it's not anti-aliasing as originally quoted. It's inter-frame comparison with vector analysis for movement.
 
While I'll agree with you, it's not anti-aliasing as originally quoted. It's inter-frame comparison with vector analysis for movement.
The reason temporal anti-aliasing gets mentioned is that, and I may be pulling this out of my butt, DLSS 2.0 requires the game engine to support temporal anti-aliasing. If the engine already has the mechanisms for recalling frame history, it's easier to integrate something that uses that history for reconstruction.

Besides that, a lot of TAA implementations are kind of a hot mess anyway.
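For what it's worth, the plumbing a TAA-capable engine already has is mostly motion vectors plus a per-frame sub-pixel camera jitter, which is also what any history-based reconstruction wants. Below is a sketch of one common jitter sequence (Halton, bases 2 and 3); the function names are mine and nothing here is taken from any vendor SDK.

```python
def halton(index, base):
    """Low-discrepancy Halton sample in [0, 1) for the given index and base."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def camera_jitter(frame_index, width, height, phase_count=8):
    """Sub-pixel offset (in clip space) to add to the projection matrix this
    frame, cycling through `phase_count` jitter positions so successive frames
    sample different sub-pixel locations."""
    i = (frame_index % phase_count) + 1
    jitter_x = halton(i, 2) - 0.5          # pixels, range [-0.5, 0.5)
    jitter_y = halton(i, 3) - 0.5
    return 2.0 * jitter_x / width, 2.0 * jitter_y / height
```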
 
It's just a classic open source solution that is arguably worse vs a proprietary solution which is arguably better. Then in practice they're about the same and the open source version gets adopted more widely because it's free.
Except that both of these options are free for developers and end users. If anything, DLSS will be "more free" for developers than FSR, because Nvidia is known to help developers implement its features, including DLSS, while AMD doesn't have nearly the resources to offer similar assistance on the scale that Nvidia does. So not only will DLSS be easier to adopt with help from Nvidia, it could be cheaper thanks to Nvidia's donated developer assistance.

Nvidia is all in on DLSS. They're pushing it into VR right now and have a full range of RTX-capable mobile products. If not for the screwed-up dGPU market, we'd probably have a top-to-bottom desktop RTX stack right now. Even without that, Nvidia appears to hold well above 80% of the dGPU market, a market AMD has, by their own admission, more or less abandoned for greener pastures. If you're a developer, what is the benefit of doing all the work yourself implementing FSR just to end up with inferior results?
 
Ok, I can agree on that. Maybe they won't abandon DLSS, and that's not a problem; it may even be better for us customers to have more options. But for me one thing is sure: either Nvidia is forced to abandon DLSS (at some point in the future), or they make it even better and accelerate its progress even more than if they were alone with this tech. So the competition just got more fierce. Win-win for gamers.

P.S. What if, because of FSR, Nvidia decides to make DLSS open too? That's a crazy, hilarious thought. I don't think it will happen at all, but funny nonetheless.

They can open up the base tech, but creating the AI algorithm best suited to your hardware will be the biggest hurdle. That's why, from the very beginning, AMD has tried to compete with Nvidia's DLSS without using any kind of AI.
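Purely for illustration, since AMD hadn't published FSR's internals when this thread was written: a "no AI" upscaler boils down to a resample plus a sharpening pass. A rough Python sketch under that assumption (nothing here is AMD's actual filter chain):

```python
import numpy as np
from scipy.ndimage import zoom, convolve

def spatial_upscale(image, scale=1.5, sharpen=0.25):
    """Non-AI upscale: cubic resample followed by a simple sharpening pass."""
    image = np.asarray(image, dtype=float)

    # Resample each colour channel to the target resolution (cubic spline).
    upscaled = zoom(image, (scale, scale, 1), order=3)

    # Unsharp-mask style kernel to restore some apparent edge contrast.
    kernel = np.array([[ 0, -1,  0],
                       [-1,  5, -1],
                       [ 0, -1,  0]], dtype=float)
    sharpened = np.stack(
        [convolve(upscaled[..., c], kernel, mode="nearest") for c in range(3)],
        axis=-1,
    )
    return (1.0 - sharpen) * upscaled + sharpen * sharpened
```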
 
We have a sample size of one right now in Godfall, and we really don't know much else about how FSR works, so let's wait and see. If I were a gambling man, I would wager that both solutions are probably pretty darn close, close enough that your average gamer isn't going to care. It's just a classic open source solution that is arguably worse vs a proprietary solution which is arguably better. Then in practice they're about the same and the open source version gets adopted more widely because it's free.

Just because it's open source doesn't mean it will get adopted more. Just look at what happened with Bullet Physics vs Nvidia PhysX. The tech might be free to use, but the resources and effort needed to implement said tech in games are not.
 
We have a sample size of one right now in Godfall, and we really don't know much else about how FSR works, so let's wait and see. If I were a gambling man, I would wager that both solutions are probably pretty darn close, close enough that your average gamer isn't going to care. It's just a classic open source solution that is arguably worse vs a proprietary solution which is arguably better. Then in practice they're about the same and the open source version gets adopted more widely because it's free.

If it's anywhere close to as good as DLSS 2.0, it'll probably be similar to the whole G-sync vs Freesync debate, in the sense that the differences will be glossed over (even unintentionally) and there'll be a bunch of people who are convinced that DLSS 2.0 is hopelessly complex and exclusive and needs to go.

Example: When using a monitor with an actual G-sync chip (not just a 'G-sync compatible' monitor) there is a distinct difference between the two technologies in that the G-sync window goes all the way down to 30 fps, whereas with Freesync if you drop below 48 fps, you are outside the Freesync refresh rate window. The thing is, most people interested in a VRR monitor don't understand this difference and buy a Freesync monitor because it's 'G-sync compatible', it's cheaper, and everyone says that the two technologies do basically the same thing.

I personally don't mind paying a bit more to get the best overall experience, so I enjoy my G-Sync Ultimate monitor and love it when DLSS 2.0 gets implemented in a game (Red Dead Redemption 2 just announced DLSS 2.0 support! Hopefully they'll put it in MSFS 2020 one of these days.) But if history is anything to go by, so long as AMD's super resolution is somewhat competitive with DLSS 2.0, it'll probably receive widespread adoption and be considered the best bet by most people because it doesn't require an RTX GPU to do its thing. People are more likely to get behind something easily attainable that works good than something more difficult to attain that's great.
 
[Attached image: Radeon_page-0019-scaled.jpg]


This is quality mode, not ultra quality mode. Still, this is blurrier than DLSS 1.0. Anyone who trashed DLSS 1.0 for being a blurry mess, which it was, had better say the same about this result. If these are typical results, FSR will be just as unusable below ultra quality mode as people claimed DLSS 1.0 was.
 
[Attached image: Radeon_page-0019-scaled.jpg]


This is quality mode, not ultra quality mode. Still, this is blurrier than DLSS 1.0. Anyone who trashed DLSS 1.0 for being a blurry mess, which it was, had better say the same about this result. If these are typical results, FSR will be just as unusable below ultra quality mode as people claimed DLSS 1.0 was.
So you already know how Ultra looks? OK, how's 22 June, is the weather fine? Did Bitcoin crash to 20k? Since you've been in the future...

It's so easy to have opinions that are not based on facts; here, I can have one too: Ultra will look as good as DLSS 2.0. Ok?

I did the same thing you did...
 
Example: When using a monitor with an actual G-sync chip (not just a 'G-sync compatible' monitor) there is a distinct difference between the two technologies in that the G-sync window goes all the way down to 30 fps, whereas with Freesync if you drop below 48 fps, you are outside the Freesync refresh rate window. The thing is, most people interested in a VRR monitor don't understand this difference and buy a Freesync monitor because it's 'G-sync compatible', it's cheaper, and everyone says that the two technologies do basically the same thing.

So, you're unfamiliar with LFC (Low Framerate Compensation), then?

The example you cite about 48fps would only be the case for monitors whose maximum refresh is under 96Hz. And yes, while I've seen a bunch of monitors advertising a refresh rate of, say, 48-75, the more gaming-oriented monitors typically have higher maximum refresh rates.

TL;DR - if a FreeSync monitor has a max refresh rate of 2x or more of the minimum, then when the frame rate dips below the minimum, LFC can be used, and the monitor will handle frame rates as low as 1/2 the minimum just fine. So, in the case of a monitor with, say, a 48-100Hz FreeSync range, you can go down to 24 fps just fine.

LFC works by, when the frame rate drops below the minimum of the FreeSync range, setting the refresh rate to 2x the fps and displaying each frame twice.
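To put numbers on that, here's a tiny sketch of the frame-doubling case described above; the function and its defaults are just illustrative, and real drivers can also use higher multiples than 2x.

```python
def effective_refresh(fps, vrr_min=48, vrr_max=100):
    """Refresh rate the panel would run at, or None if VRR + simple 2x LFC
    can't cover this frame rate."""
    if vrr_min <= fps <= vrr_max:
        return fps                 # inside the native FreeSync window
    if fps < vrr_min and vrr_min <= 2 * fps <= vrr_max:
        return 2 * fps             # LFC: show each frame twice
    return None                    # outside what this simple model covers

# The example from the post: a 48-100Hz FreeSync range at 24 fps.
print(effective_refresh(24))       # -> 48 (each frame displayed twice)
print(effective_refresh(70))       # -> 70 (native VRR, no LFC needed)
```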
 
[Attached image: Radeon_page-0019-scaled.jpg]


This is quality mode, not ultra quality mode. Still, this is blurrier than DLSS 1.0. Anyone who trashed DLSS 1.0 for being a blurry mess, which it was, had better say the same about this result. If these are typical results, FSR will be just as unusable below ultra quality mode as people claimed DLSS 1.0 was.

Shame. I was worried this would happen.

If it were me and I were the coder, I would have rendered a key frame (full res) at least twice a second, and then motion-interpolated the data from the lower-detail frames, similar to how MPEGs work.
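Just to make that suggestion concrete, here's a trivial sketch of the key-frame schedule being described (entirely hypothetical; no shipping upscaler is known to work this way):

```python
def frame_plan(frame_index, fps=60, keyframes_per_second=2):
    """Decide whether a frame is rendered at full resolution (a 'key frame')
    or at reduced resolution, to be detail-filled from the last key frame
    using motion vectors, MPEG-style."""
    interval = max(1, fps // keyframes_per_second)   # frames between key frames
    if frame_index % interval == 0:
        return "full_resolution"
    return "reduced_resolution"

# At 60 fps with 2 key frames per second, frames 0 and 30 are full resolution.
print([frame_plan(i) for i in range(0, 60, 10)])
```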
 
Just because it's open source doesn't mean it will get adopted more. Just look at what happened with Bullet Physics vs Nvidia PhysX. The tech might be free to use, but the resources and effort needed to implement said tech in games are not.
I'm under the impression that anyone who seems to blindly believe that open is automagically better than proprietary has never actually worked with FOSS outside of using the executable it runs.