News Starfield Perf Disparity Between AMD and NVIDIA GPUs Analyzed

Status
Not open for further replies.

NeoMorpheus

Reputable
Jun 8, 2021
As it stands now, we found the RX 7900 XTX matched and often exceeded the performance of Nvidia's RTX 4090

And of course, that cannot go unpunished!

I have to say, really interesting how much digital ink and hate has been spilled over this just because AMD sponsored this game.

But when it's the other way around (The Ascent not having FSR or XeSS support, for example, and that's just one of way too many), nothing much is said.

The outlet found that Starfield's pixel and compute shaders currently make better utilization of several aspects of AMD's RDNA 3 GPUs. These include RDNA 3's larger vector register files, superior thread tracking, and the use of an L0, L1, L2, and L3 cache design. Nvidia's RTX 40-series GPUs only have L1 and L2 caches, by comparison, and things are even worse on Nvidia's RTX 30- and 20-series GPUs that have significantly smaller L2 cache sizes.
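
For anyone curious why the deeper cache hierarchy could matter at all, a rough average memory access time (AMAT) calculation illustrates the idea. This is only a sketch: the latencies and miss rates below are made-up illustrative numbers, not measurements of any real GPU.

```cpp
#include <cstdio>

// AMAT folds each cache level's hit time and miss rate into one expected latency:
// AMAT = hit_time(L1) + miss_rate(L1) * (hit_time(L2) + miss_rate(L2) * (... + dram_latency))
double amat(const double hitTime[], const double missRate[], int levels, double dramLatency) {
    double penalty = dramLatency;
    for (int i = levels - 1; i >= 0; --i) {       // fold from the last level back toward L1
        penalty = hitTime[i] + missRate[i] * penalty;
    }
    return penalty;
}

int main() {
    const double dram = 400.0;                    // hypothetical DRAM latency in cycles

    // Hypothetical two-level hierarchy ("L1 + L2 only").
    const double ht2[] = {30.0, 150.0};
    const double mr2[] = {0.30, 0.40};

    // Hypothetical four-level hierarchy ("L0/L1/L2/L3"): extra levels catch more of the misses.
    const double ht4[] = {20.0, 40.0, 120.0, 200.0};
    const double mr4[] = {0.40, 0.50, 0.50, 0.40};

    std::printf("2-level AMAT: %.0f cycles\n", amat(ht2, mr2, 2, dram));  // ~123 with these numbers
    std::printf("4-level AMAT: %.0f cycles\n", amat(ht4, mr4, 4, dram));  // ~96 with these numbers
}
```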

Now that makes me think that all other games are simply not optimized at all for AMD, so let's rage about that!

We are in some really strange times when we are now demanding support for proprietary tech that limits our options.

Oh well.
 
Apr 5, 2023
And of course, that cannot go unpunished!

I have to say, really interesting how much digital ink and hate has been spilled over this just because AMD sponsored this game.

But when it's the other way around (The Ascent not having FSR or XeSS support, for example, and that's just one of way too many), nothing much is said.



Now that makes me think that all other games are simply not optimized at all for AMD, so let's rage about that!

We are in some really strange times when we are now demanding support for proprietary tech that limits our options.

Oh well.
Yes, that obviously has nothing to do with nobody caring or knowing about The Ascent, as it's surely as popular as Starfield from the indie developer Bethesda. The media is trying to hide something!!! The media even keeps trying to claim that FSR is a worthless technology, just because it is! The gall! The narrative!

Regards,
 
  • Like
Reactions: JarredWaltonGPU

Deleted member 2950210

Guest
I have to say, really interesting how much digital ink and hate has been spilled over this just because AMD sponsored this game.

And what exactly did that sponsorship get us?

A poorly optimised game that has display issues even on AMD GPUs.

When the most powerful card of this generation comes in second place to the 7900 XTX, you KNOW there's something fundamentally wrong with the game.
 

Dr3ams

Respectable
Sep 29, 2021
The media even keeps trying to claim that FSR is a worthless technology, just because it is!

Regards,
FSR is not worthless if it works...and it does work. Other upscalers may perform better, but that usually happens when a game has been optimized for a sponsoring hardware manufacturer.
 
  • Like
Reactions: prtskg
FSR is not worthless if it works...and it does work. Other upscalers may perform better, but that usually happens when a game has been optimized for a sponsoring hardware manufacturer.
FSR 1 isn't that great. It's basically a well-known upscaling algorithm (Lanczos), tweaked for edge detection and with a sharpening filter. It's super lightweight, but definitely doesn't provide "near native" quality by any stretch. FSR 2 significantly altered the algorithm and looks better, but with a higher performance impact. It still can't match DLSS 2 upscaling in overall quality, though at times it's at least close, particularly in quality mode when targeting 4K.
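
For reference, here is roughly what the Lanczos-style resampling at the heart of FSR 1 looks like, as a minimal single-channel sketch. It is not AMD's actual implementation and it leaves out FSR 1's edge-detection tweaks and the sharpening pass.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

constexpr double kPi = 3.14159265358979323846;

// Lanczos kernel with window size a (FSR 1 is usually described as Lanczos-2-like).
double lanczos(double x, int a = 2) {
    if (x == 0.0) return 1.0;
    if (std::fabs(x) >= a) return 0.0;
    const double px = kPi * x;
    return a * std::sin(px) * std::sin(px / a) / (px * px);
}

// Resample a 1-D signal to a new length by weighting nearby source samples with the kernel.
std::vector<double> resample(const std::vector<double>& src, std::size_t dstLen, int a = 2) {
    std::vector<double> dst(dstLen);
    const double scale = static_cast<double>(src.size()) / dstLen;
    for (std::size_t i = 0; i < dstLen; ++i) {
        const double x = (i + 0.5) * scale - 0.5;           // output sample in source coordinates
        const long center = static_cast<long>(std::floor(x));
        double sum = 0.0, weightSum = 0.0;
        for (long k = center - a + 1; k <= center + a; ++k) {
            const long j = std::min<long>(std::max<long>(k, 0),
                                          static_cast<long>(src.size()) - 1);  // clamp at edges
            const double w = lanczos(x - k, a);
            sum += w * src[j];
            weightSum += w;
        }
        dst[i] = weightSum != 0.0 ? sum / weightSum : 0.0;  // normalize the kernel weights
    }
    return dst;
}

int main() {
    // Upscale an 8-sample "scanline" to 16 samples; a real upscaler does this in 2-D per channel.
    const std::vector<double> line = {0, 0, 0, 1, 1, 0, 0, 0};
    for (double v : resample(line, 16)) std::printf("%.3f ", v);
    std::printf("\n");
}
```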

I believe Unreal Engine is working on its own internal temporal upscaling solution. It would basically be like FSR 2 in that it would work on any GPU, leveraging the GPU compute shaders to do the computations. I'd be super impressed if Epic took the extra step of having the algorithm leverage tensor / matrix hardware if available, but I seriously doubt that will happen. Whether it will look as good as FSR 2, or better, remains an open question.
 

evdjj3j

Honorable
Aug 4, 2017
FSR is not worthless if it works...and it does work. Other upscalers may perform better, but that usually happens when a game has been optimized for a sponsoring hardware manufacturer.
I spent about 15 hours playing with FSR before downloading the DLSS mod, and even unoptimized DLSS is leaps and bounds better looking than FSR. I just bought a new GPU, and I was strongly considering a 7800 XT until I played the game with the DLSS mod.
 

HyperMatrix

Distinguished
May 23, 2015
And of course, that cannot go unpunished!

I have to say, really interesting how much digital ink and hate has been spilled over this just because AMD sponsored this game.
Is this "hate" any different than the negative coverage Nvidia (rightfully) got for years for having massive levels of tessellation in sponsored games that resulted in unnecessary performance hits to AMD cards?

I would disagree disrespectfully. Chips and Cheese did actual research, which Tom's has been incapable of doing for many years now, but you just state that their conclusion is wrong based on nothing.
You're missing the point. Chips and Cheese explained what's causing the disparity, but not why those conditions exist. What Tom's Hardware is saying is that regardless of the reason, the performance is not acceptable. I would go even further and say that the game was intentionally designed around AMD architecture with little care given to how this would impact the other 80%+ of PC gamers.

This could make sense for Microsoft, since these optimizations likely resulted in better performance on AMD-powered Xbox consoles. But when you see them skip something as simple as DLSS, which could have been implemented in less than a day, you should start to wonder whether the game was intentionally left unoptimized for competing cards. Not to mention that even the base game presets turn FSR 2.0 on in order to hit 60 fps on many upper-end AMD cards.

FSR is not worthless if it works...and it does work. Other upscalers may perform better, but that usually happens when a game has been optimized for a sponsoring hardware manufacturer.
I think FSR 2.0 (as well as XeSS) should be available, as it does provide an upscaling solution to those who otherwise would have no options. However, the quality of FSR 2 even in this sponsored game leaves a lot to be desired. There is so much flickering and shimmering around edges, transparencies, and even cloth moire. I have the DLSS mod installed so I can instantly switch between DLSS and FSR 2 with a hotkey, and while you can't capture the shimmering and flickering in a picture, it's very visible in video. It's a less than optimal solution and I personally wouldn't use it because of how distracting it is. But I'm sure there are some who can benefit from it to reach playable framerates, and I'd be happy for the option to be available for them.
 

setx

Distinguished
Dec 10, 2014
I would go even further and say that the game was intentionally designed around AMD architecture with little care given to how this would impact the other 80%+ of PC gamers.
And what is wrong with that, compared to the many games intentionally optimized for nVidia? It actually makes sense if the console port is important, unlike hurting everyone but nVidia.

"Game should be better optimized" is fair request, "it shall run better on nVidia than on AMD" is not.
 

HyperMatrix

Distinguished
May 23, 2015
And what is wrong with that, compared to the many games intentionally optimized for nVidia? It actually makes sense if the console port is important, unlike hurting everyone but nVidia.

"Game should be better optimized" is fair request, "it shall run better on nVidia than on AMD" is not.

In terms of actual hardware, the 4090 is superior to the 7900 XTX. Not even AMD denies that. Their own marketing material and pricing pits the 7900 XTX against the RTX 4080. This isn't about hurting anybody's feelings. It's just fact. And when you take money from AMD and optimize the game in such a way that it underperforms on all other architectures...you're going to get criticized. AMD did the same thing with AC: Valhalla. If you're an AMD user you probably won't care and in fact you'll probably be happy. But when you consider that 80% of the market is non-AMD...you should expect this type of reaction from people.

This was a big-budget AAA game, not from some random company that only has a console...but from Microsoft, which has been aggressively trying to promote gaming on Windows. And while it's great to have AMD provide additional resources to add optimizations for their hardware, that shouldn't mean the developer is under no obligation to optimize for all hardware. Otherwise it's essentially a "pay us money, Intel & Nvidia, or the game is going to run poorly on your cards" situation.

Like I said, an argument could have been made that this is simply an optimization issue and nothing more. But leaving out XeSS and DLSS, which could have been implemented in less than a day, on top of all the other optimization issues, and telling Intel card owners their cards don't meet the game's requirements and to buy a new video card, just shouldn't be behavior that anyone accepts. And it suggests other potential motives behind the "optimization" decisions as well, which is in line with what we've seen from other AMD-sponsored titles.
 
  • Like
Reactions: JarredWaltonGPU

setx

Distinguished
Dec 10, 2014
In terms of actual hardware, the 4090 is superior to the 7900 XTX. Not even AMD denies that.
It seems people like to forget all the times when AMD had more powerful hardware but nVidia still came first in games due to "optimizations".

But when you consider that 80% of the market is non-AMD...you should expect this type of reaction from people.
And nVidia got their market % by fair competition and superior hardware? Sure.

If you're an AMD user you probably won't care and in fact you'll probably be happy.
I have both a 3090 and a 6900, so I'm pretty much as agnostic as possible. It's just fun to watch nVidia get hit by what they've been doing forever. Also, the more such hits, the better prices will be next generation.
 
  • Like
Reactions: NeoMorpheus

steve15180

Distinguished
Dec 31, 2007
FSR 1 isn't that great. It's basically a well-known upscaling algorithm (Lanczos), tweaked for edge detection and with a sharpening filter. It's super lightweight, but definitely doesn't provide "near native" quality by any stretch. FSR 2 significantly altered the algorithm and looks better, but with a higher performance impact. It still can't match DLSS 2 upscaling in overall quality, though at times it's at least close, particularly in quality mode when targeting 4K.

I believe Unreal Engine is working on its own internal temporal upscaling solution. It would basically be like FSR 2 in that it would work on any GPU, leveraging the GPU compute shaders to do the computations. I'd be super impressed if Epic took the extra step of having the algorithm leverage tensor / matrix hardware if available, but I seriously doubt that will happen. Whether it will look as good as FSR 2, or better, remains an open question.

You're a tech journalist...you should know better. Why bring up FSR 1? You remind everyone how rough FSR 1 looked, and then jump into a discussion about FSR 2 and DLSS 2. If you're going to do that, why not also mention that DLSS 1 had so many artifacts and problems that many outlets said you're better off not using it?

While we're at it, why is it a poor idea to make use of functions/features in AMD cards that are not present in Nvidia cards, but perfectly fine to take advantage of features that Nvidia has but AMD doesn't? Because Nvidia is the leader? Because they have most of the market? Because they are THE STANDARD?

All such attitudes will get the marketplace is a future where no other cards will EVER be able to compete with Nvidia cards. Which seems to be the attitude of most people with this game.
 
I think FSR 2.0 (as well as XeSS) should be available, as it does provide an upscaling solution to those who otherwise would have no options. However, the quality of FSR 2 even in this sponsored game leaves a lot to be desired. There is so much flickering and shimmering around edges, transparencies, and even cloth moire. I have the DLSS mod installed so I can instantly switch between DLSS and FSR 2 with a hotkey, and while you can't capture the shimmering and flickering in a picture, it's very visible in video. It's a less than optimal solution and I personally wouldn't use it because of how distracting it is. But I'm sure there are some who can benefit from it to reach playable framerates, and I'd be happy for the option to be available for them.
Have you tried turning down the sharpening filter with FSR? The first thing I did while tweaking the settings was turn down the sharpening, because it was causing an odd shininess on rocks. I've seen very little flickering or shimmering myself, despite hearing a lot of people talk about it and having seen some in videos.
 

P1nky

Distinguished
May 26, 2009
We would respectfully disagree.
You have big balls to contradict people who have a deep understanding of how GPUs work and who ran extensive tests to come to this conclusion. All of this coming from a news writer with no technical work of his own. Arrogant balls.
 
Sep 16, 2023
It's more about coding culture than anything else. We've seen this many times in history. With very limited hardware resources, the development process has to be very focused on optimization and proper resource utilization. As the hardware grows, there is less need for optimization to get the same performance, and adding features to the hardware allows development teams to skip some steps. I remember times when it was worth tinkering with branch prediction to improve performance; now it barely matters because of how much branch prediction has improved. Similarly, when the PS3 was released, games performed very poorly because of the complexity of the hardware, but over time it became possible to utilize it all and gain much more performance from it. On the other hand, newer consoles use a much more common and forgiving architecture that does not punish you for taking shortcuts and keeping it simple.

We may be getting into this area with graphics and games, especially from companies like Bethesda. It's quite clear that AMD is going in the direction of more forgiving hardware, especially with so much cache: less thinking, better performance, even if you didn't care much about cache. Nvidia stays with a more conservative approach so far; developers are expected to know how to code for efficient GPU usage.

For me it is just Bethesda's fault. It's as if a C++ engineer made copies of the same data all over the place because he doesn't have time to think about references, and told you to just get more memory, since it's not that expensive anymore (see the sketch below). Yes, it is a viable option to throw more hardware at the problem and cut development costs, but it's not a good justification. Whether it's a business decision or negligence, it does not look good. The same goes for DLSS; it's a higher-level example of the same thing. XeSS and DLSS are optimized for specific hardware, FSR is a generic feature, and going with just the generic one is either laziness or a malicious decision.
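
To make the analogy concrete, here is a small, hypothetical C++ sketch of the "just copy it and buy more RAM" habit versus passing by reference; the Mesh type and sizes are made up purely for illustration.

```cpp
#include <cstdio>
#include <numeric>
#include <vector>

// A "mesh" stands in for any large asset; the size is made up for illustration.
struct Mesh {
    std::vector<float> vertices;
};

// Careless version: the whole mesh is copied on every call, burning memory and bandwidth.
float averageHeightByValue(Mesh mesh) {
    return std::accumulate(mesh.vertices.begin(), mesh.vertices.end(), 0.0f) / mesh.vertices.size();
}

// Careful version: a const reference reads the same data in place, with no copy at all.
float averageHeightByRef(const Mesh& mesh) {
    return std::accumulate(mesh.vertices.begin(), mesh.vertices.end(), 0.0f) / mesh.vertices.size();
}

int main() {
    Mesh mesh{std::vector<float>(10'000'000, 1.0f)};  // roughly 40 MB of vertex data

    // Both calls compute the same result; only the first pays for a 40 MB copy.
    std::printf("by value: %f\n", averageHeightByValue(mesh));
    std::printf("by ref:   %f\n", averageHeightByRef(mesh));
}
```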
We will see more of such problems in the future, though, especially if AMD focuses on a more forgiving hardware design than Nvidia. After all, the main purpose of GPUs was to be specialized, to do one particular job very well; the whole design of GPUs is about efficiency in particular tasks, not general-purpose processing. Expecting them to behave more like CPUs, with a much more forgiving and universal design, is the wrong approach. What next, games rendered on the CPU once CPUs are powerful enough for 30 fps at 1080p?
 

salgado18

Distinguished
Feb 12, 2007
My only issue with the analysis is the lack of a comparison with a control group. OK, nice numbers and theory, but how do other games behave? How does an Nvidia-sponsored title use the GPUs? And a neutral game? Because we can't tell whether a certain behavior (cache hits, bandwidth utilization, etc.) is normal or not if we don't know what normal is.
 
  • Like
Reactions: JarredWaltonGPU
And here I thought that Bethesda announcing they'll add DLSS into the game would appease the raging hate... I guess the hate runs deeper.

And it's a great analysis. Too bad nVidia's uArch is not "optimal" for SF's engine, but that's life. Deal with it, just like AMD pundits have for the last 10 years with most games. Keep in mind that Unreal Engine games still favour nVidia horrendously, but no one bats an eye at it. And that is not "a small sample", but again, no one bats an eye. It's "expected" that nVidia performs better, and if that expectation is not met, even when the reason is explained, the rage pony rides.

Cynicism all around.

Regards.
 

Geezer760

Distinguished
Aug 29, 2009
Yeah, in other words, while you drop your chips, they give you the cheese up the ying yang. It's all about them making more money, more money, for less product.
 

Deleted member 2950210

Guest
While we're at it, why is it a poor idea to make use of functions/features in AMD cards that are not present in Nvidia cards, but perfectly fine to take advantage of features that Nvidia has but AMD doesn't? Because Nvidia is the leader? Because they have most of the market? Because they are THE STANDARD?

All such attitudes will get the marketplace is a future where no other cards will EVER be able to compete with Nvidia cards. Which seems to be the attitude of most people with this game.

Other cards not being able to compete with Nvidia's has nothing to do with Jarred Walton's attitude, or any other tech journalist's, for that matter.

It has everything to do with Nvidia being the better company overall.

You don't like that? Hey, I don't like it either. But consumers and journalists can do little to nothing to change this.

It's up to AMD and Intel to create some much-needed competition in the GPU department.
 

rluker5

Distinguished
Jun 23, 2014
Right now we have graphics cards from three companies with different architectures. In the large majority of games, the different architectures perform very comparably for the image quality they can put out, both relative to weaker cards in the same architecture family and relative to the various tiers from competing architectures. One could almost take an 80-tier, 70-tier, or 50-tier card and expect a certain level of performance without knowing which company designed the GPU. (Intel is lagging a bit here, but their market price reflects that.)

For most games and most visual aspects, one could almost treat the architecture as less important than added features, so long as your card had a "big" enough chip of its respective architecture to put out the fps you want at the settings you want. There is some deviation from game to game, but not a lot.

This game deviates significantly from the relative performance norm in how much it favors the RDNA architecture. But it does not run well for the image quality it delivers compared to other games on my 6800, so I wouldn't call it well optimized even for the RDNA family. It does run significantly worse than other games on my 3080, and especially on my A750, though.

Because Starfield runs poorly on its best-performing architecture, very poorly on the others, and very poorly on all CPUs for the image it is putting out, I don't feel a deficiency in any GPU or CPU architecture is at fault here. It is the game's fault. It just looks bad relative to other games on the hardware it is running on. The smallest performance-for-image-quality hit is dealt to AMD GPUs, but it is still a hit.

That being said, I'm still enjoying the gameplay. It's similar to other Bethesda RPGs. Ugly game, but fun if you want some Bethesda style RPG action.
 

Pyrokinetic

Distinguished
Apr 30, 2012
A new report by Chips and Cheese digs into the low-level details of Starfield's game engine, analyzing shader run times, cache hit rates, and other aspects to try and figure out why there's such a large performance disparity between Nvidia and AMD GPUs.

Starfield Perf Disparity Between AMD and NVIDIA GPUs Analyzed : Read more
All this smacks of corporate shenanigans by AMD and Bethesda. AMD becomes the official sponsor for Starfield (early on) so it can influence game development. AMD wants to sell GPUs and hit competitor Nvidia in the process. Hence the performance disparity optimized for AMD GPUs -- and the high hardware requirements, so avid fans will go buy newer AMD GPUs. It is a sheer and blatant case of corporate collusion. Call it for what it is. Official modding tools for the game will be released once AMD gives the go-ahead -- and that will happen when sales goals for 7000-series GPUs are met. You are welcome.
 