News: AMD's FSR 2.0 Delivers Again in God of War

It might be time for NVIDIA to consider leaving out tensor cores on future consumer GPUs then. There doesn't seem to be any real consumer need for them outside of DLSS.

Although I'm sure there are a lot of ML hobbyists who'd put up torches and pitchforks if this happened.
 
  • Like
Reactions: salgado18
It might be time for NVIDIA to consider leaving out tensor cores on future consumer GPUs then. There doesn't seem to be any real consumer need for them outside of DLSS.

Although I'm sure there are a lot of ML hobbyists who'd put up torches and pitchforks if this happened.

That's quite true, especially with core counts growing so quickly. The more cores you have, the greater the performance gains FSR 2.0 should bring, in theory.

But for now, I believe DLSS still holds a performance advantage over FSR 2.0 on some RTX GPUs and in some games.
 

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
It might be time for NVIDIA to consider leaving out tensor cores on future consumer GPUs then. There doesn't seem to be any real consumer need for them outside of DLSS.

Although I'm sure there are a lot of ML hobbyists who'd put up torches and pitchforks if this happened.
I would agree if not for the creator features that also use tensor cores; RT denoising uses them as well, so they're quite needed, not just for DLSS.
 
  • Like
Reactions: KyaraM
I would agree if not for the creator features that also use tensor cores; RT denoising uses them as well, so they're quite needed, not just for DLSS.
Denoising is a subset of the broader operation of "filling in the blanks." If we have a system that does this well enough that tensor cores don't provide a significant performance benefit, then there's less of a need for them.

AMD provided performance metrics with ray tracing enabled, so FSR 2.0 is clearly helping in denoising.
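For what it's worth, the overlap between the two jobs is real: temporal upscalers and real-time RT denoisers both lean on accumulating samples across frames. Here's a deliberately toy sketch of that shared core in Python (real denoisers add motion-vector reprojection, neighborhood clamping, and spatial filtering on top; none of this is specific to FSR or DLSS):

```python
import numpy as np

def temporal_accumulate(history: np.ndarray, current: np.ndarray,
                        alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average across frames: the shared core of
    TAA-style upscalers and many real-time RT denoisers. Real
    implementations also reproject `history` with motion vectors and
    clamp it against the current frame's neighborhood; both are
    omitted here for brevity."""
    return (1.0 - alpha) * history + alpha * current

# Noisy 1-sample-per-pixel "renders" of a constant 0.5 signal.
rng = np.random.default_rng(0)
accum = rng.random((4, 4))  # garbage history to start
for _ in range(60):
    noisy = 0.5 + 0.2 * rng.standard_normal((4, 4))
    accum = temporal_accumulate(accum, noisy)
print(f"mean abs error after 60 frames: {np.abs(accum - 0.5).mean():.3f}")
```

Nothing here needs matrix hardware; the argument in this thread is only about whether dedicated units run such passes enough faster to matter.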
 

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
Denoising is a subset of the broader operation of "filling in the blanks." If we have a system that does this well enough that tensor cores don't provide a significant performance benefit, then there's less of a need for them.

AMD provided performance metrics with ray tracing enabled, so FSR 2.0 is clearly helping in denoising.
Denoising of RT on Radeon cards is done via shaders; it has nothing to do with FSR or any other software.
 
Denoising of RT on Radeon cards is done via shaders; it has nothing to do with FSR or any other software.
Well, either way, at the end of the day, unless someone has done a complete frame-time profile of the hardware, we'll never figure out how much of an influence the tensor cores actually provide. And as far as I can tell, it's not clear there's still much of an advantage.

The only thing I can find is from NVIDIA themselves, and they simply labeled the time spent on the tensor cores as DLSS work.
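One way to frame the question even without a full profile: an upscaling pass only pays off if the time saved by rendering fewer pixels exceeds the fixed cost of the pass itself. A back-of-the-envelope sketch in Python; the millisecond figures are made-up placeholders, not measured numbers:

```python
def net_gain_ms(native_ms: float, scale: float, upscale_cost_ms: float) -> float:
    """Rough net frame-time gain from rendering at (1/scale)^2 of the
    pixels and then upscaling to native. Assumes render time scales
    linearly with pixel count, which is only approximately true."""
    lowres_ms = native_ms / (scale * scale)
    return native_ms - (lowres_ms + upscale_cost_ms)

# Hypothetical: a 16.7 ms native 4K frame, 1.5x (Quality-mode) scaling,
# and an assumed 1.0 ms upscale pass.
print(f"{net_gain_ms(16.7, 1.5, 1.0):.1f} ms saved per frame")  # ~8.3 ms
```

The tensor-core question lives entirely in that upscale_cost_ms term: if dedicated hardware only shaves a fraction of a millisecond off the pass versus running it on shaders, the end-to-end difference would be small, which would be consistent with how close FSR 2.0 and DLSS tend to land in published benchmarks.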
 

spongiemaster

Admirable
Dec 12, 2019
2,213
1,234
7,560
It might be time for NVIDIA to consider leaving out tensor cores on future consumer GPUs then. There doesn't seem to be any real consumer need for them outside of DLSS.

Although I'm sure there are a lot of ML hobbyists who'd put up torches and pitchforks if this happened.
The professional cards, formerly known as Quadros, use the same dies as the RTX gaming cards. Nvidia isn't going to add a third architecture every generation. Tensor cores aren't going anywhere as long as there's demand for them from professionals willing to spend a whole lot more money on a GPU than gamers do.
 
  • Like
Reactions: KyaraM

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
Well, either way, at the end of the day, unless someone has done a complete frame-time profile of the hardware, we'll never figure out how much of an influence the tensor cores actually provide. And as far as I can tell, it's not clear there's still much of an advantage.

The only thing I can find is from NVIDIA themselves, and they simply labeled the time spent on the tensor cores as DLSS work.
As long as DLSS needs tensor cores and they're useful for work, they'll stay in the gaming architecture. That would only change if Nvidia reworked DLSS fundamentally and introduced a tensor-less gaming architecture at the same time, which is unlikely since it's not really needed.
 
  • Like
Reactions: KyaraM

hotaru251

Distinguished
FSR 2.0 still suffers one issue: to get "native"-like quality with a performance boost, you need a high-end GPU running at 4K.

FSR 2.0 is "meh" at 1080p (usually not worth it, as quality takes a big hit).

1440p is usually game-dependent.

4K is about the only resolution that actually benefits without a quality loss,

as it needs a boatload of pixels to work with... which 1080p lacks (hence why it's so bad in comparison).


DLSS doesn't really have that issue: you can run weak hardware at low res and usually get better performance without losing much quality, even at 1080p.


Also, tensor cores and the like could be utilized by other parts of the PC in the future (virus scans are one example).
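The pixel-count argument is easy to quantify. FSR 2.0's quality modes use fixed per-axis scale factors (1.5x Quality, 1.7x Balanced, 2.0x Performance, 3.0x Ultra Performance, per AMD's published documentation), so the internal render resolution the algorithm actually has to work with shrinks fast at lower output resolutions:

```python
# Internal render resolution for FSR 2.0's quality modes
# (per-axis scale factors from AMD's FSR 2.0 documentation).
SCALE_FACTORS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

for out_w, out_h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"\nOutput {out_w}x{out_h}:")
    for mode, s in SCALE_FACTORS.items():
        in_w, in_h = round(out_w / s), round(out_h / s)
        print(f"  {mode:17s} -> {in_w}x{in_h} "
              f"({in_w * in_h / 1e6:.2f} MP of real samples)")
```

At 4K Quality mode the upscaler starts from roughly 3.7 MP of real samples per frame; at 1080p Quality it gets only about 0.9 MP, which is a big part of why artifacts are so much easier to spot at 1080p.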
 
  • Like
Reactions: KyaraM

Sleepy_Hollowed

Honorable
Jan 1, 2017
409
155
10,870
FSR 2.0 still suffers one issue: to get "native"-like quality with a performance boost, you need a high-end GPU running at 4K.

FSR 2.0 is "meh" at 1080p (usually not worth it, as quality takes a big hit).

1440p is usually game-dependent.

4K is about the only resolution that actually benefits without a quality loss,

as it needs a boatload of pixels to work with... which 1080p lacks (hence why it's so bad in comparison).


DLSS doesn't really have that issue: you can run weak hardware at low res and usually get better performance without losing much quality, even at 1080p.


Also, tensor cores and the like could be utilized by other parts of the PC in the future (virus scans are one example).

You mean, to quickly encrypt devices?
 

mo_osk

Commendable
Nov 13, 2020
32
16
1,535
Well, either way, at the end of the day, unless someone has done a complete frame-time profile of the hardware, we'll never figure out how much of an influence the tensor cores actually provide. And as far as I can tell, it's not clear there's still much of an advantage.

The only thing I can find is from NVIDIA themselves, and they simply labeled the time spent on the tensor cores as DLSS work.


https://youtu.be/b4S9KBqYYVQ?t=593


Enabling RT on Radeon hardware is much heavier than on GeForce RTX.

Radeon does have specialized units for RT, but obviously they're not on par with Nvidia's RT cores working together with the tensor cores. The tensor cores also allow using DLSS for another slight performance boost while preserving image quality. So while it's not clear how much they're doing, I think the tensor cores are still worth having.
 
It might be time for NVIDIA to consider leaving out tensor cores on future consumer GPUs then. There doesn't seem to be any real consumer need for them outside of DLSS.

Although I'm sure there are a lot of ML hobbyists who'd put up torches and pitchforks if this happened.

Things like tensor cores have become an added feature of Nvidia GPUs. There are reasons why Nvidia has retained most of its compute features since Turing: not everyone has the money to buy a professional-grade GPU, and sometimes the professional cards' certified drivers aren't needed by the semi-pro segment. Nvidia acknowledges this, hence things like the Studio Driver being pushed for GeForce cards.
 

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
FSR 2.0 still suffers one issue: to get "native"-like quality with a performance boost, you need a high-end GPU running at 4K.

FSR 2.0 is "meh" at 1080p (usually not worth it, as quality takes a big hit).

1440p is usually game-dependent.

4K is about the only resolution that actually benefits without a quality loss,

as it needs a boatload of pixels to work with... which 1080p lacks (hence why it's so bad in comparison).


DLSS doesn't really have that issue: you can run weak hardware at low res and usually get better performance without losing much quality, even at 1080p.


Also, tensor cores and the like could be utilized by other parts of the PC in the future (virus scans are one example).
Not really; maybe you should read this website, yes, Tom's, instead of just arguing for your Nvidia agenda. FSR 2.0 is very comparable to the latest DLSS.
I agree with the latter point, though; you've finally accepted that tensor cores are useful.
 

hotaru251

Distinguished
you've finally accepted that tensor cores are useful.
I think you have me mixed up with the other hotaru ._.

Not really; maybe you should read this website, yes, Tom's, instead of just arguing for your Nvidia agenda. FSR 2.0 is very comparable to the latest DLSS.
I read/watch multiple sources. The consensus is that for FSR 2.0, 4K is about the only res where you gain without losing, and 1080p just lacks the pixels, so you gain little but lose a lot of quality (some games fare better or worse).

That's why AMD focuses so much on 4K in their testing.

DLSS 2.0 "learned" from a lot more samples, and that's why it doesn't suffer the same low-res issue as FSR 2.0.

There ARE limits to what FSR can do without a ton of pixels to work with.

Both are fine at 4K, but DLSS 2.0 DOES do better at 1080p and 1440p on average.
 

KananX

Prominent
BANNED
Apr 11, 2022
615
139
590
I think you have me mixed up with the other hotaru ._.


I read/watch multiple sources. The consensus is that for FSR 2.0, 4K is about the only res where you gain without losing, and 1080p just lacks the pixels, so you gain little but lose a lot of quality (some games fare better or worse).

That's why AMD focuses so much on 4K in their testing.

DLSS 2.0 "learned" from a lot more samples, and that's why it doesn't suffer the same low-res issue as FSR 2.0.

There ARE limits to what FSR can do without a ton of pixels to work with.

Both are fine at 4K, but DLSS 2.0 DOES do better at 1080p and 1440p on average.
Not true. FSR 2.0 works similarly to DLSS 2.3 at any resolution; the differences are minimal. I don't think you read any sources, or your "sources" aren't good. It rather seems you're going for an "Nvidia is better" agenda. We're not talking about FSR 1.0 here; what you say doesn't make any sense.