News AMD Claims Starfield Devs Have the Power to Add DLSS Support

FSR 2.2 is on par with DLSS 1.0, sure. But no one uses 1.0 anymore.
Both are very blurry and exhibit ghosting.

DLSS 2.0+ blows FSR out of the water, with visuals that are better than or on par with native (no DLSS/FSR).

Anyone who has seen FSR in action and cares about visuals would rather have no upscaling at all than FSR.

If you don't care about ghosting and a blurry mess, sure, FSR working across all hardware is an advantage. But I don't want my games to look horrible; I want a refined experience, which is why I will only ever play with DLSS.

Actually, no: in comparison to DLSS 2.0+, FSR 2.2 is pretty much on par.
Comparison videos show little to no difference between the two. Mostly, the only time you will find differences is if you stop playing the game and intently stare at an image zoomed in several times.
Don't try to oversell DLSS.

Yes, DLSS 'technically' produces better results than FSR, but it's mostly academic. Most people won't really notice the difference and won't really care.

Blurry mess? DLSS 2.0+ produces that too... along with the ghosting and other problems FSR is frequently accused of (FSR 2.2 addresses most of those issues anyway, hence why the differences are academic).

Also, DLSS 3.0 is technically 'better'... but as a lot of people have mentioned, it makes 'fake frames', and it comes with its own caveats in terms of image quality, artifacting, etc. Take what you will from that... but do not downplay the fact that it comes with its own problems.

I think DLSS die-hards need to tone it down.
You don't really lose anything fundamental by using FSR... or at least, nothing you will notice when actually playing games.
And if it bothers you THAT much... just play at native resolution without FSR or DLSS (which your hardware should be able to do anyway).
 
At the time when people accused AMD of paying the Starfield devs not to incorporate DLSS, those were basically assumptions/rumors.

Why exactly should AMD waste its time addressing rumors and baseless assumptions from the general public?
If some well-regarded media publisher came up with strong evidence that you were licking apples in the grocery store and then putting them back, and the majority of the relevant news media agreed that you were, and 3/4 of the polled public believed you were, and you weren't, and you had a PR department that was looking for such things, wouldn't you make it clear that you weren't licking apples at the grocery store and putting them back?
NV jumped at the chance to say 'we don't do it' because it benefits their PR... but in reality, they've done much worse in the past.
Some other corporation doing something bad for the consumer doesn't excuse all bad corporate behavior.
AMD has a mostly clean record in this regard because they usually support open-source features that work across all HW.
Intel's XeSS DP4a path also works across all hardware, and it also looks worse than XeSS with proper hardware acceleration. FSR 2 is better than DP4a, but worse than native XeSS and worse than DLSS.
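(Side note on what DP4a actually means, since it gets thrown around: it's a GPU instruction that computes a dot product of four packed 8-bit integers with a 32-bit accumulate, which is how XeSS runs its network on cards without Intel's XMX units; Intel has said that path uses a lighter model, hence the quality gap. A toy Python model of the operation, just to illustrate the idea:)

```python
def dp4a(a, b, acc):
    """Toy model of the DP4a GPU instruction: a dot product of four packed
    8-bit integers accumulated into a 32-bit value. XMX/Tensor-core paths
    instead process whole matrix tiles per instruction, which is (roughly)
    why the DP4a fallback is slower and paired with a lighter network."""
    assert len(a) == len(b) == 4
    assert all(-128 <= v <= 127 for v in a + b)  # int8 operand range
    return acc + sum(x * y for x, y in zip(a, b))

# One 4-wide step of an int8 convolution / matrix multiply:
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8], acc=10))  # 10 + 5 - 12 - 21 + 32 = 14
```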

But you know what else is open source? Temporal upscaling. Any game that uses FSR 2 needs temporal upscaling to work, and any game that has temporal upscaling supports DLSS and XeSS by definition. They just have to be implemented. How hard is that? I bet you could just make a phone call and either Nvidia or Intel would get you the help you needed if you were making a popular game.
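To put that in concrete terms: all three temporal upscalers consume essentially the same per-frame inputs, so a renderer that already feeds FSR 2 already produces what DLSS and XeSS need. A rough sketch of that common data (the field names are made up for illustration, not any vendor's actual SDK types):

```python
from dataclasses import dataclass

Texture = object  # stand-in for a GPU texture handle; purely illustrative


@dataclass
class UpscalerFrameInputs:
    """Per-frame data that FSR 2, DLSS, and XeSS all consume."""
    color: Texture                # low-resolution rendered frame
    depth: Texture                # matching depth buffer
    motion_vectors: Texture       # per-pixel motion relative to the previous frame
    jitter: tuple[float, float]   # sub-pixel camera jitter applied this frame
    render_size: tuple[int, int]  # internal render resolution
    output_size: tuple[int, int]  # target display resolution

# Once the engine produces these for one temporal upscaler, adding another
# is mostly glue code around that vendor's SDK entry points.
```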

AMD has not been clean since the time they released their first CPU.
I don't make a habit of addressing every assumption people make about me. It's time-consuming and exhausting.
All that happened was tech publishers getting ahead of themselves and making the stupid assumption that AMD was blocking the implementation of DLSS, when AMD as such has no power to do anything like that.
When a paid contract is signed, it can have requirements. There was a sponsorship contract. If it was in the contract, then AMD does have that power.

The easiest way for AMD to prove that they didn't make Starfield a worse-performing game for everyone using RTX and Arc for anticompetitive reasons is to publish the original sponsorship contract text.
If AMD does, and the restriction against the use of DLSS and XeSS is not in there, then all of those claiming it is are wrong, aren't they? That would be proof that the AMD defenders are right on this.

But they won't do that because they have been licking apples and putting them back.
 
If some well-regarded media publisher came up with strong evidence that you were licking apples in the grocery store and then putting them back, and the majority of the relevant news media agreed that you were, and 3/4 of the polled public believed you were, and you weren't, and you had a PR department that was looking for such things, wouldn't you make it clear that you weren't licking apples at the grocery store and putting them back?

Some other corporation doing something bad for the consumer doesn't excuse all bad corporate behavior.

And I never said that one bad thing from a competitor excuses another doing the same.
The point of the matter is, AMD hasn't done that.

Intel's XeSS DP4a path also works across all hardware, and it also looks worse than XeSS with proper hardware acceleration. FSR 2 is better than DP4a, but worse than native XeSS and worse than DLSS.

The definition of 'worse' is highly subjective here, because the implementation of each feature matters.
All three upscalers come with their own advantages and drawbacks (bear in mind that ghosting, artifacting, shimmering, etc. DO appear with XeSS and DLSS).
AMD did address most of these issues in FSR 2.2 (so it's puzzling why Bethesda didn't use it and instead went with 2.0).

At any rate, to really notice the differences, one would have to stop playing the game, zoom in, and look for discrepancies in the details.

Yes, technically speaking, FSR is 'worse', but the margin has been reduced to an academic debate for the most part, one that most people won't even notice while they are actually playing.

Plus, one of the likely reasons why XeSS wasn't implemented (despite being open source) is that Intel's presence in the gaming sector is minuscule to virtually non-existent. Their entrance into this particular segment is fairly recent to start with, and adoption of XeSS will vary (plus, it definitely has its own drawbacks, as we've seen in comparison videos).

But you know what else is open source? Temporal upscaling. Any game that uses FSR 2 needs temporal upscaling to work, and any game that has temporal upscaling supports DLSS and XeSS by definition. They just have to be implemented. How hard is that? I bet you could just make a phone call and either Nvidia or Intel would get you the help you needed if you were making a popular game.

I wouldn't bet on anything.
Usually, features from a given company get incorporated when that company works with the game devs, or pays the game devs to include its features in the game.
Ultimately, the decision to use any company's features is up to the game devs... and whether they see it as worth their time, money, and effort.

AMD has not been clean since the time they released their first CPU.

When a paid contract is signed, it can have requirements. There was a sponsorship contract. If it was in the contract, then AMD does have that power.
I said 'mostly', didn't I?

As for sponsorships... as I explained before, sponsorship only gives the company priority in terms of feature implementation and optimizations.
Whether or not competing companies' features are used is entirely up to the devs.
Prohibiting the use of a competing company's features is, quite frankly, a waste of resources and money.
The gaming market is a pittance compared to AI and data centers... quite honestly, it doesn't make sense for ANY hardware company to bother with restrictions like that.
If they sponsor a given game, they merely pay for priority in feature implementation and optimizations over competing companies... that's it.
Anything else would be considered anti-competitive behavior and would open said company up to potential lawsuits.

In this day and age, keeping something like that secret would be next to impossible, and with AMD being the 'underdog' with far fewer resources than others, it doesn't make sense for them to do that (they're better off focusing their efforts on AI and data centers as far as adoption and optimizations are concerned).

The easiest way for AMD to prove that they didn't make Starfield a worse-performing game for everyone using RTX and Arc for anticompetitive reasons is to publish the original sponsorship contract text.
If AMD does, and the restriction against the use of DLSS and XeSS is not in there, then all of those claiming it is are wrong, aren't they? That would be proof that the AMD defenders are right on this.

But they won't do that because they have been licking apples and putting them back.
Ideally, perhaps, but this is likely not going to happen.
In that sense, you'd have to request the same of NV-sponsored games too... and I don't see you advocating for that to happen.
 
AMD did address most of these issues in FSR 2.2 (so it's puzzling why Bethesda didn't use it and instead went with 2.0).
Lol no. FSR 2.2 is a blurry mess. Here is a good comparison screenshot showcasing just how HORRIBLE AMD is
 
Lol no. FSR 2.2 is a blurry mess. Here is a good comparison screenshot showcasing just how HORRIBLE AMD is

You forgot one crucial thing: that review is outdated, and Jedi Survivor is/was an extremely poor example, since it's a badly optimized game to start with.

You realize, of course, that the implementation and optimization of FSR (or DLSS) matters, right?
You can't just slap FSR, XeSS, or DLSS into a game and expect them to work flawlessly.

Devs need to spend time optimizing the implementation of the upscalers to solve bugs that arise with ANY upscaler (and don't kid yourself, because the issues that can appear with FSR also appear with DLSS and XeSS).

Also, since that outdated review, numerous patches were released, as well as some mods which address the blurriness.
FSR by itself works fine when it's properly implemented and optimized in the game... but hey, don't take my word for it if you don't want to... all I'm saying is that this is just how software works (and cherry-picking to paint DLSS as an inherently/vastly superior feature to FSR would be overselling it: it's better, yes, but when both are properly implemented, the differences are mostly academic and not something people will notice).
 
DLSS and XeSS implementations are features just like RT, AA, AO.

Since AMD is the one sponsoring Starfield, their features and hardware optimizations get priority over the competition.

Whether DLSS and XeSS get integrated into Starfield, though, is up to Bethesda and other factors.

For XeSS, I wouldn't hold my breath, because Intel's presence in the gaming sector is minuscule to non-existent, really, and we don't know how much time or effort they would spend collaborating with Bethesda on XeSS integration (and XeSS did demonstrate issues compared to FSR and DLSS).

DLSS has a better chance of being integrated due to how widespread it is, but again, it will be up to the game devs to decide whether it's worth their time and effort to use it.
Plus, they would still have to collaborate with Nvidia to optimize the DLSS integration in the game and reduce the occurrence of shimmering, artifacting, and ghosting (all of which do appear with DLSS and XeSS).

As I said, one cannot just slap a feature into a game and expect it to work without issues. You need to optimize it to avoid problems. So how a feature is implemented matters... and that takes time, money, and effort.
 
Pretty sure the DLSS crybabies all rock a 4090 and don't need it anyway, because they are happy throwing 1 kW at a game. Again, they come across as petty, claiming outrage because their favorite soccer team didn't get the treatment it "deserves" that rare time, once in a blue moon.

Personally, I see it as progress: even if the supposedly "superior" tech is not in, the one that is used benefits _everyone_. What's more, the Windows desktop is moving to the cloud, and soon Nvidia GPUs will be useless. The people who still want a discrete PC will use Linux, and that is growing at a rapid pace now, like it or not. AMD upstreams their drivers, and Nvidia will be left behind with their anti-consumer, anti-competition, anti-everything behaviour.

Seen how much profit they make on AI chips? Yeah, you're also getting screwed if you buy their GPUs.

Nvidia fanboys are short-sighted and don't have a global overview of the market and community dynamics. It's like cheering for royalty-encumbered, closed codecs while the whole industry and its enthusiasts are moving towards open ones.

Don't be an Adobe Flash cultist.
 
Bethesda of course saves money by using FSR, because it works on Nvidia, AMD, and Intel GPUs...
It seems like... "Nvidia, if you want DLSS too... pay us for it."

And I somewhat understand if that is the reason. A lot of people still have 1060, 1070, 1080, and 1080 Ti GPUs! So if a game company wants to support those users, the most sensible way is to support FSR.
Aka, their priority is to support as many customers as they can with as little effort as possible.
Bethesda most likely did not care that FSR works on more hardware. FSR and DLSS can be modded into the game by users themselves. FSR vs. DLSS is an IHV rivalry issue, not theirs. Big publishers in general are not going to support tech from an IHV based on which one gives them more benefit.
 
At the time when people accused AMD of paying the Starfield devs not to incorporate DLSS, those were basically assumptions/rumors.

Why exactly should AMD waste its time addressing rumors and baseless assumptions from the general public?

NV jumped at the chance to say 'we don't do it' because it benefits their PR... but in reality, they've done much worse in the past.
AMD has a mostly clean record in this regard because they usually support open-source features that work across all HW.

I don't make a habit of addressing every assumption people make about me. It's time-consuming and exhausting.
All that happened was tech publishers getting ahead of themselves and making the stupid assumption that AMD was blocking the implementation of DLSS, when AMD as such has no power to do anything like that.
Supporting more open-source stuff does not mean they are clean. They have also done bad things, like Nvidia has in the past. The difference is that Nvidia sees such things as just how business works in reality, while AMD goes public about how such things are done to hurt competition, yet engages in them at the same time.
 
I'm going to play Starfield on an RX 6800 and use FSR, because that is what I have set up in my office and I really want ample desk space to use a joystick. And I'm really looking forward to it. Preloaded and everything.

But I still think it is crap that AMD seems to be locking out DLSS on my 3080 and XeSS on my A750 in a lot of their sponsored titles. They both look better than FSR 2.0 IMO, and the A750 needs upscaling the most.

I like a lot of AMD products, but I'm still going to call them out for being anti-consumer.
 
DLSS has a better chance of being integrated due to how widespread it is, but again, it will be up to the game devs to decide whether it's worth their time and effort to use it.
The game developer is usually not the one who decides things like this; it's the publisher. And for the most part, Nvidia has the money to get publishers to accept the inclusion of their tech, even in titles sponsored by a competitor.
 
Lol no. FSR 2.2 is a blurry mess. Here is a good comparison screenshot showcasing just how HORRIBLE AMD is
But in Jedi Survivor, even if you set FSR to Quality, it was still rendering at half resolution.
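For reference, these are the standard FSR 2 per-axis scale factors (DLSS uses roughly the same ones), and what Quality mode should actually render at; a quick back-of-the-envelope check:

```python
# Per-axis downscale divisors: Quality 1.5x, Balanced 1.7x,
# Performance 2.0x, Ultra Performance 3.0x (per AMD's FSR 2 presets).
MODES = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def render_resolution(out_w, out_h, mode):
    d = MODES[mode]
    return round(out_w / d), round(out_h / d)

for mode in MODES:
    print(mode, render_resolution(2560, 1440, mode))
# Quality -> (1707, 960). So a 'Quality' preset that renders at half
# resolution (1280, 720) is really behaving like Performance mode.
```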
 
Personally, I see it as progress: even if the supposedly "superior" tech is not in, the one that is used benefits _everyone_.
A benefit would be an upscaler/reconstruction technique that looks close to native while being faster. The only one that does that atm is DLSS. FSR is more of a fair trade-off than a benefit: a big performance gain for a big quality hit. DLSS can look close to, the same as, or better than native more often than not.
[screenshot: HUB chart comparing DLSS and FSR 2 against native TAA]


If you've got this coming from HUB, the most AMD-biased channel on YouTube, then FSR 2 is not even close to being added value for anyone (mind you, this is all done with the default DLL files, not updated to 2.5.1 via a swapper). Not even one is similar. My experience with a 6800 and a 3080 is exactly the same: DLSS blows FSR 2 to pieces in real-time image quality. Now, with the new RR denoiser that's coming with 3.5, you'll see how AMD's software solution is limited vs. Nvidia's hardware-based one.
AMD knows that; they even put ML-specific hardware acceleration on RDNA 3, but it will be left unused because their R&D team never thought of putting it on RDNA 2.

[screenshot: HUB DLSS vs. FSR 2 comparison chart]

Now go cry about how HUB is Nvidia-biased all of a sudden....

That was fixed long ago and it's still blurry. Same with Far Cry 6.
Spatial upscalers will always be blurry. Go use Lanczos on a high-quality 1080p picture and try to upscale it to 4K; it'll always look blurry, since it's not doing any actual detail reconstruction.
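Easy to check yourself; a minimal Pillow sketch (the file names are placeholders):

```python
from PIL import Image

# Pure spatial upscale: Lanczos resampling from 1080p to 4K. There is no
# temporal data and no detail reconstruction, only interpolation of the
# pixels already present, so the result looks soft next to native 4K.
src = Image.open("frame_1080p.png")  # placeholder input
upscaled = src.resize((3840, 2160), Image.LANCZOS)
upscaled.save("frame_lanczos_4k.png")
```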
 
DLSS is really only truly dominant when you drop down to the lower quality modes, where FSR has no chance. At Quality settings, DLSS is going to be a bit better than or the same as FSR in the majority of cases (FSR also generally does better when there are more pixels available). Having watched the two most recent big roundup videos from HUB, it seems like there's a correlation between how competently a game is done (graphically) and the quality of its FSR implementation.

The biggest problem facing FSR is that the GPU-agnostic version will never be able to be better than DLSS. That means even if you have a good FSR implementation, it'll be impossible to convince a chunk of people that it's fine.

If you're going to use HUB, you should pick the appropriate video rather than the one highlighting how bad most modern games' TAA is:
[screenshot: HUB chart comparing FSR 2 and DLSS head to head across games]


As for DLSS 3.5, we'll see how many games implement the denoiser, as they have to be coded for it. I'd be surprised if anything multiplatform did so. It does seem like a significant leap forward for RT image quality, though, and I doubt AMD can implement anything like it for hardware-agnostic FSR.
 
DLSS is really only truly dominant when you drop down to the lower quality modes, where FSR has no chance. At Quality settings, DLSS is going to be a bit better than or the same as FSR in the majority of cases (FSR also generally does better when there are more pixels available). Having watched the two most recent big roundup videos from HUB, it seems like there's a correlation between how competently a game is done (graphically) and the quality of its FSR implementation.

The biggest problem facing FSR is that the GPU-agnostic version will never be able to be better than DLSS. That means even if you have a good FSR implementation, it'll be impossible to convince a chunk of people that it's fine.

If you're going to use HUB, you should pick the appropriate video rather than the one highlighting how bad most modern games' TAA is:
[screenshot: HUB chart comparing FSR 2 and DLSS head to head across games]


As for DLSS 3.5, we'll see how many games implement the denoiser, as they have to be coded for it. I'd be surprised if anything multiplatform did so. It does seem like a significant leap forward for RT image quality, though, and I doubt AMD can implement anything like it for hardware-agnostic FSR.
It's enough to see how it does at 1440p Quality, the most common resolution these days. Most of us enthusiasts use our cards for higher-refresh gaming at 1440p rather than 4K/60, even at the 4080/7900 XTX tier. I've seen as many 4090 owners use it for 1440p/240 Hz as for 4K/144 Hz. Plus, at 4K, FSR only ties in 3 out of 24 cases anyway, and only without swapping the DLSS version to 2.5.1. Dunno why HUB left the DLSS version unchanged; there is even a tool for lazy people like me to swap them with a click. So as much as this looks like a landslide for DLSS, it could have been even better.

My screenshot also included a DLSS vs. FSR 2 chart, btw, so your accusations are not really valid.
 
It's enough to see how it does at 1440p Quality, the most common resolution these days. Most of us enthusiasts use our cards for higher-refresh gaming at 1440p rather than 4K/60, even at the 4080/7900 XTX tier. I've seen as many 4090 owners use it for 1440p/240 Hz as for 4K/144 Hz. Plus, at 4K, FSR only ties in 3 out of 24 cases anyway, and only without swapping the DLSS version to 2.5.1. Dunno why HUB left the DLSS version unchanged; there is even a tool for lazy people like me to swap them with a click. So as much as this looks like a landslide for DLSS, it could have been even better.

My screenshot also included a DLSS vs. FSR 2 chart, btw, so your accusations are not really valid.
Cherry-picking the data which best supports your stance is eyeroll-inducing at best when more data is readily available.
 
No, singling out a resolution and comparing to native when a direct comparison between the two is available is what's cherry-picking.
Please go over your comments again and see how stupid they are.

1. I did include DLSS vs. FSR.
2. DLSS crushes FSR in the chart you posted too.
3. If you have more doubts, HUB is not the only tech site that says DLSS is miles better; TPU says the same in the DLSS vs. FSR reviews they do for every game that has both.

The resolution I singled out is:
1. the most common one for enthusiasts;
2. my own resolution; others don't interest me. If you are only interested in 4K, fine, but then I can make the same point about cherry-picking as you did for the charts I pasted. If 1080p were in these charts, DLSS would look even better in comparison to FSR than it does at 1440p, but I wouldn't include 1080p, since I (and most of us here) do not play at 1080p.

See what both HUB and TPU think about DLSS vs. FSR comparisons and you'll know why some things are free and others are not. FSR isn't even the best software upscaler; TSR is miles ahead already. If Starfield had TSR, I would not say a word about the lack of DLSS. FSR 2 is a shimmer festival, and that's why I'd rather have DLSS or XeSS in a 60 EUR game.
 
50 fps at 1080p with FSR is bad enough on a 6700 XT. Another crappy port; congratulations to those who prepaid 60 EUR for this, you're all making gaming better for us.
 