You have big balls to contradict people who have a deep understanding of how GPUs work and who ran extensive tests to come to this conclusion. All of this coming from a news writer with no technical background. Arrogant balls.
Actually, I edited the news post and put some additional comments in there. That statement was mine, and it was directly pointed at this completely ludicrous aspect of the C&C post:
"However, there's really nothing wrong with Nvidia's performance in this game, as some comments around the internet might suggest."
Nothing to see here, people! Nvidia's performance in this game is just fine! We ran some shader analysis, found that Nvidia's shaders are underutilized, and still concluded everything is as it should be!
Chips and Cheese missed the boat on this one, pure and simple. You want a more detailed commentary? Shader analysis only gets you so far when you don't have the source code and a debugger to work with. These tools are designed
for developers so that they can find better optimizations. So, when you see GPUs being poorly utilized, that tells the devs they can do better. Any other conclusion is rubbish. Every program out there has lots of untapped potential for optimization, and it's purely a matter of how much time a company wants to invest versus the potential payoff. We live in an era where a lot of code is simply deemed "good enough" and left at that.
But hey, what would I know as a former software developer? It's not like there's any indication the game is underperforming on Nvidia GPUs
other than every single benchmark I've seen. Chips and Cheese looked at three shaders, from a short snippet of run-time behavior. Who's to say whether the segments they analyzed are even representative or meaningful? I wonder what Bethesda is even working on with AMD, Intel, and Nvidia's driver teams, since it's all running so amazingly well!
The premise of the article was generally okay, but concluding that Nvidia performance was fine is stupid. Dismissing everyone on the internet who disagrees as incorrect is just the icing on the cake. There are probably thousands of shaders in
Starfield, and looking at three doesn't even scratch the surface. Drawing any sort of conclusion from that, especially when you don't even have access to the game code, is just silly.
This is the problem with DX12 code. You can actually build shaders that work better for one specific architecture, and in this case it's clearly AMD's architectures. Note the plural there, because it's not just RDNA 3: it's also RDNA 2, RDNA, Vega, and Polaris. I tested them all, and I found that the competing Nvidia (and Intel) architectures all showed poor GPU utilization.
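For the curious, here's a rough C++ sketch of the sort of thing I'm talking about; it's purely illustrative and absolutely not Starfield's actual code. DX12 lets an engine query the wave (SIMD) width the hardware runs, and a developer can use that to pick between shader variants written around AMD's 64-wide waves versus the 32-wide waves on Nvidia and Intel hardware. The variant names below are made up for the example.

// A rough sketch, not Starfield's code: query the wave width DX12 reports and
// pick between two hypothetical precompiled shader variants. AMD GPUs report
// 64-wide waves (32-64 on RDNA), while Nvidia and Intel report 32.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Hypothetical variant IDs standing in for precompiled DXIL blobs.
enum class ComputeVariant { Wave32Tuned, Wave64Tuned };

ComputeVariant PickVariantForHardware(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS1 opts1 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS1, &opts1, sizeof(opts1))))
    {
        std::printf("Wave lanes: min %u, max %u\n",
                    opts1.WaveLaneCountMin, opts1.WaveLaneCountMax);
        // If the hardware can run 64-wide waves (Polaris, Vega, RDNA in
        // wave64 mode), use the variant whose workgroup shape and wave
        // intrinsics were written around that width.
        if (opts1.WaveLaneCountMax >= 64)
            return ComputeVariant::Wave64Tuned;
    }
    // Otherwise (or if the query fails), fall back to the 32-wide variant.
    return ComputeVariant::Wave32Tuned;
}

The point is that nothing in the API forces a developer to handle this well, so shaders tuned around one vendor's wave width and cache behavior can easily leave the other vendors' hardware sitting underutilized.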
I'm not surprised. Bethesda is known for buggy games at launch, often with poor optimization. Its engine couldn't properly handle disabling vsync until Fallout 76, and even then it took a patch or two before higher refresh rates and running without vsync didn't screw things up. Pushing
Starfield out the door when it wasn't properly tuned for a lot of systems isn't even a unique situation. We've seen at least half a dozen major games that had the same thing happen in the past year. And nearly every single one of those games has received patches that in some cases dramatically improved performance.
I bet the devs on those games ran their own code analysis, worked with AMD, Intel, and Nvidia, and found more optimal ways of doing things. That's why our conclusion in this piece is that we're relatively confident Bethesda will make a lot of changes in the coming months that improve performance and GPU utilization, which runs contrary to the C&C conclusion that nothing is wrong.