News Starfield Perf Disparity Between AMD and NVIDIA GPUs Analyzed


hannibal

Distinguished
If a game is optimised for GPU X's architecture, it runs well on GPU X. GPU Y and GPU Z have different architectures, with different weak points and different strong points, and you cannot change that unless you rework the game to better suit, say, GPU Z.
That is why we see games where Nvidia, AMD, or Intel hardware works better than it does in some other game.

Just like a Range Rover is faster than a Ferrari off-road, while the Ferrari is faster than the Range Rover on asphalt.
And every car is slow off-road. The Starfield engine is not made for speed… that much is sure. It may even be a bad engine. But there is nothing wrong with it being faster on AMD hardware; that makes sense for a console port.
So no shenanigans there, just a game engine that leans on different strong points than some other games do… Bugs and crashes… well, that is normal Bethesda, and that could be considered shenanigans, poor skill, or business as usual for Bethesda… ;)
 
You have big balls to contradict people who have a deep understanding of how GPUs work and who ran extensive tests to come to this conclusion. All of this coming from a news writer who has done no technical work of his own. Arrogant balls.
Actually, I edited the news post and put some additional comments in there. That statement was mine, and it was directly pointed at this completely ludicrous aspect of the C&C post:

"However, there's really nothing wrong with Nvidia's performance in this game, as some comments around the internet might suggest."

Nothing to see here, people! Nvidia's performance in this game is just fine! We ran some shader analysis, found that Nvidia's shaders are underutilized, and still concluded everything is as it should be!

Chips and Cheese missed the boat on this one, pure and simple. You want a more detailed commentary? Shader analysis only gets you so far if you don't have the source code and debugger to work with. These tools are specifically designed for the developers so that they can make better optimizations. So, when you see GPUs not being utilized very well, that tells the devs they can do better. Any other conclusion is rubbish. Every program out there has lots of untapped potential for optimizations, and it's purely a matter of how much time a company wants to invest versus the potential payoff. We live in a day where a lot of code is simply deemed "good enough" and left at that.

But hey, what would I know as a former software developer? It's not like there's any indication the game is underperforming on Nvidia GPUs other than every single benchmark I've seen. Chips and Cheese looked at three shaders, from a short snippet of run-time behavior. Who's to even say whether or not the segments they analyzed are representative or meaningful? I wonder what Bethesda is even working on with AMD, Intel, and Nvidia's driver teams, since it's all running so amazingly well!

The premise of the article was generally okay, but concluding that Nvidia performance was fine is stupid. Calling the whole of the internet that disagrees incorrect is just the icing on the cake. There are probably thousands of shaders in Starfield, and looking at three doesn't even scratch the surface. Coming to any sort of conclusion, especially when you don't even have access to the game code, is just silly.

This is the problem with DX12 code. You can actually build shaders that work better for one specific architecture, in this case it's clearly AMD's architectures — and note the plural there, because it's not just RDNA 3. It's also RDNA 2, RDNA, Vega, and Polaris. I tested them all. I found that competing Nvidia (and Intel) architectures all showed poor GPU utilization.
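
To illustrate the general idea (with made-up numbers, nothing profiled from Starfield): a shader's resource footprint, things like group-shared memory, threads per group, and registers per thread, can fill one architecture's compute units while leaving another's partly idle. A rough back-of-the-envelope sketch:

```python
# Toy occupancy estimate for one shader on two hypothetical GPU architectures.
# Every limit below is an illustrative assumption, NOT a measured Starfield value.

def occupancy(lds_per_group_kb, threads_per_group, regs_per_thread,
              lds_per_cu_kb, max_threads_per_cu, regs_per_cu):
    """Fraction of a compute unit's thread capacity this shader can keep resident."""
    groups_by_lds = lds_per_cu_kb // lds_per_group_kb
    groups_by_threads = max_threads_per_cu // threads_per_group
    groups_by_regs = regs_per_cu // (regs_per_thread * threads_per_group)
    groups = min(groups_by_lds, groups_by_threads, groups_by_regs)
    return groups * threads_per_group / max_threads_per_cu

# A shader tuned around one architecture's limits...
shader = dict(lds_per_group_kb=16, threads_per_group=256, regs_per_thread=96)

# ...fills hypothetical "GPU A" completely but leaves "GPU B" mostly idle.
gpu_a = dict(lds_per_cu_kb=128, max_threads_per_cu=2048, regs_per_cu=256 * 1024)
gpu_b = dict(lds_per_cu_kb=64, max_threads_per_cu=1536, regs_per_cu=64 * 1024)

for name, gpu in (("GPU A", gpu_a), ("GPU B", gpu_b)):
    print(name, f"{occupancy(**shader, **gpu):.0%}")  # -> GPU A 100%, GPU B 33%
```

Occupancy isn't the whole story, of course, but it's one concrete way the exact same shader code can light up one GPU and leave another waiting.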

I'm not surprised. Bethesda is known for buggy games at launch, often with poor optimization. It couldn't even get its engine to run properly with vsync disabled until Fallout 76, and even then it took a patch or two before higher refresh rates and disabled vsync stopped breaking things. Pushing Starfield out the door before it was properly tuned for a lot of systems isn't even a unique situation; we've seen at least half a dozen major games do the same thing in the past year. And nearly every one of those games has since received patches that, in some cases, dramatically improved performance.

I bet the devs on those games ran their own code analysis, as well as working with AMD, Intel, and Nvidia, and found more optimal ways of doing things. And that's why our conclusion in this piece is that we're relatively confident that Bethesda will make a lot of changes in the coming months that improve performance and GPU utilization, which runs contrary to the C&C conclusion that nothing is wrong.
 

setx

Distinguished
The premise of the article was generally okay, but concluding that Nvidia performance was fine is stupid. Calling the whole of the internet that disagrees incorrect is just the icing on the cake. There are probably thousands of shaders in Starfield, and looking at three doesn't even scratch the surface. Coming to any sort of conclusion, especially when you don't even have access to the game code, is just silly.
Their research might be flawed, but you have absolutely no right to criticize them. All your words are empty words. If you want to prove them wrong, publish better scientific research.
 

NeoMorpheus

Reputable
Their research might be flawed, but you have absolutely no right to criticize them. All your words are empty words. If you want to prove them wrong, publish better scientific research.
Well, he did. …
Just look at all the Nvidia justifications and excuses, and the AMD trashing, in this thread's comments alone.
 
In terms of actual hardware, the RTX 4090 is superior to the 7900 XTX. Not even AMD denies that; their own marketing material and pricing pit the 7900 XTX against the RTX 4080. This isn't about hurting anybody's feelings, it's just fact. And when you take money from AMD and optimize the game in such a way that it underperforms on all other architectures... you're going to get criticized. AMD did the same thing with AC: Valhalla. If you're an AMD user you probably won't care, and in fact you'll probably be happy. But when you consider that 80% of the market is non-AMD... you should expect this type of reaction from people.

This was a big-budget AAA game, not from some random company that only has a console... but from Microsoft, which has been aggressively trying to promote gaming on Windows. And while it's great to have AMD provide additional resources to add optimizations for its hardware, that shouldn't mean the developer is under no obligation to optimize for all hardware. Otherwise it's essentially a "pay us money, Intel and Nvidia, or the game is going to run poorly on your cards" situation.

Like I said, an argument could have been made that this is simply an optimization issue and nothing more. But leaving out XeSS and DLSS, which could have been implemented in less than a day, on top of all the other optimization issues, and telling Intel card owners their GPUs don't meet the game's requirements and to buy a new video card, just shouldn't be behavior that anyone accepts. And it suggests other potential motives behind the "optimization" decisions as well, which is in line with what we've seen from other AMD-sponsored titles.
A more accurate figure for the non-AMD market share is roughly 35% (AMD-powered consoles hold 58% of the gaming market, and 17% of the remaining 42% PC share is AMD discrete graphics). So yeah, optimizing for AMD hardware makes the most sense for games with a limited development budget. However, Bethesda has both the money and the resources to optimize for Nvidia as well before release.
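
A quick sanity check of that split, taking the post's own figures (58% console share, 17% of the PC remainder being AMD discrete graphics) at face value rather than as verified data:

```python
# Back-of-the-envelope check of the AMD vs. non-AMD split described above.
# Input percentages are the ones quoted in this thread, not independently verified.
console_share = 0.58                    # gaming done on (AMD-powered) consoles
pc_share = 1.0 - console_share          # remaining 42% is PC
amd_share_of_pc_discrete = 0.17         # AMD's slice of the PC discrete-GPU market

amd_total = console_share + pc_share * amd_share_of_pc_discrete
print(f"AMD-powered: {amd_total:.0%}, non-AMD: {1.0 - amd_total:.0%}")
# -> AMD-powered: 65%, non-AMD: 35%
```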
 
I bet the devs on those games ran their own code analysis, as well as working with AMD, Intel, and Nvidia, and found more optimal ways of doing things. And that's why our conclusion in this piece is that we're relatively confident that Bethesda will make a lot of changes in the coming months that improve performance and GPU utilization, which runs contrary to the C&C conclusion that nothing is wrong.
Right, at this moment it is too early to say. Heck, it might be as simple as AMD's hardware scheduler doing a better job of keeping the shaders fed than Nvidia's software scheduling, while also freeing up the CPU by offloading GPU scheduling work. The point is, C&C brought a hypothesis with limited supporting evidence to the table; other reviewers will either corroborate it or bring their own hypotheses to bear.
 
You people are expecting too much out of big publishers and developers.

As a developer myself, I can say the "just make it work" mentality hurts the product, but it is absolutely real. This is by no means a defense of the sorry state of any Bethesda title, but you can't be cynical about this: you can't expect companies to optimize to the bone for each and every piece of hardware, and you at least need to understand that they just want the game out the door and making money. I sincerely want to believe that most developers (like myself) take pride in their work and want to do the best they can, but you will never, ever be allowed to hold up a multi-million-dollar project just because you want it to "run better" rather than merely "run" when deadlines are looming.

There's more nuance to it, and I have no idea about their time commitments and cash flow, but given how long the game has already been in development, I would dare say that optimizing for Nvidia (in particular) and/or Intel, or for any CPU microarchitecture (look at how Intel outperforms AMD!), was a tertiary concern, if not even lower, in the stack of priorities to get the game out. This is a problem with Bethesda and its own expectations of quality. It is clear, given their history, that "just make it work" is more or less their motto for their games. The same goes for other big AAA publishers and, I'd like to think, developers.

Given the sales figures, or at least what they've been reporting so far, it looks like their priorities have been validated. And make no mistake: when this happens in Nvidia's favor, no one even bats an eye or does a double take on why AMD performs worse on comparable hardware. The trend makes the rule here, it seems, and that is hyper-cynical.

EDIT: For example, this:
https://www.youtube.com/watch?v=zDX9qZ0ttz8

Sure, VR is niche, but come on... AMD has been getting so much heat for having issues with VR, but I wouldn't be surprised if it weren't just an AMD problem. This hasn't made the news anywhere. Zero. And why? Because the answer is always "well, get Nvidia instead!" That's the cynicism I'm talking about.

Regards.
 
Sep 18, 2023
A new report by Chips and Cheese digs into the low-level details of Starfield's game engine, analyzing shader run times, cache hit rates, and other aspects to try and figure out why there's such a large performance disparity between Nvidia and AMD GPUs.

Intel Arc A770 16GB, i9-9900K, 32GB RAM.

At 2K max settings, with no FSR (it doesn't work), I'm getting 35-45 fps average outside and 45-60 fps inside.
But it crashes a lot, like every 25 minutes, so it's basically unplayable.
 
Sep 18, 2023
AMD cards perform better than usual, so it must be foul play. But when Nvidia cards perform better than expected... crickets.
I'm getting 30-45 fps at max settings, 2K native, no FSR (it doesn't work), and 45-60 fps inside. Very playable, until it crashes.
Intel Arc A770 16GB.
 
