Starfield PC Performance: How Much GPU Do You Need?

mhmarefat

Distinguished
Jun 9, 2013
66
76
18,610
This is a complete disgrace for both Bethesda and AMD. I hope all the money they agreed on was worth it (spoiler: it won't be). Unless some last-second patch is applied for Nvidia and Intel users before the game is released, people are not going to forget this for years to come. GJ, AMD and Bethesda, for disgracing yourselves.
You have something better than Fallout 4 graphics running at 30 fps, 1080p, on a 2080?! What is the PC gaming industry reduced to?
 

Dr3ams

Reputable
Sep 29, 2021
251
270
5,060
As I have written in other threads, my mid-range specs (see my signature) run Starfield on ultra at 1440p smoothly. No glitches or stuttering whatsoever. I did have to dial the Render Resolution Scale down to high (which is 62%) because of some shadow issues. In New Atlantis, wandering outdoors, the FPS averaged between 45 and 50. The FPS is higher when you enter a building that requires a load screen. I have tested the game in cities, combat, and space combat... again, no glitches or stuttering.

There are some clipping issues that should be addressed by Bethesda. I also have some observations about things that should be changed in future updates.

  1. Third-person character movement looks dated. I play a lot of Division 2 (released 2019), and the third-person character movement there is very good.
  2. Taking cover behind objects in Starfield is poorly thought out.
  3. Ship cargo/storage needs to be reworked. In Fallout 4, you can store stuff in just about every container that belongs to you. Inside the starter ship, the Frontier, none of the storage spaces can be opened. The only way to store stuff is by accessing the ship's cargo hold through a menu. No visual interaction whatsoever. I haven't built my own ship yet, so maybe this will change later.
  4. Ship combat needs some tweaking. It would be nice to have a companion fly the ship while the player mounts a side gun.
  5. Movement. The same issue Fallout has is showing up in Starfield. Walking speed is OK, but running at both of the provided speeds is way too fast. This is especially frustrating when trying to collect loot spread across a large area without slow walking. There should be a movement speed adjustment in settings. The only way I know to adjust this is through console commands (see the sketch after this list).
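If you want to try the console route, here is a minimal sketch. It assumes Starfield's console keeps the familiar speedmult actor value from earlier Creation Engine games; 75 is just an example figure, and 100 is the default.

Code:
player.setav speedmult 75
player.setav speedmult 100

The first line slows the player to 75% of normal movement speed; the second restores the default. Adjust to taste.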
I do love this game though and will be spending a lot of hours playing it. :p
 
Last edited:
lol, it's nice to have a game optimized for your hardware at launch; we usually get the opposite on the red team.

People need to relax; it will improve with game patches and driver updates.

As per the TPU forum, if you open "DefaultGlobalGraphicsSettings.json" you get the following predetermined graphics list.

Code:
{"GPUs":
[
    {
        "gpuId" : "NVIDIA GeForce RTX 3090",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3080",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3070 Ti",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3070",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 2080 Ti",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 2080 SUPER",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 2070 SUPER",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 2070",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA GeForce GTX 1080 Ti",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA GeForce GTX 1080",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "AMD Radeon RX 6800 XT",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "AMD Radeon RX 6700 XT",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "AMD Radeon RX 6600 XT",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 2080",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 2060 SUPER",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3090 Ti",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3080 Ti",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3080 Ti Laptop GPU",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3080 Laptop GPU",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3070 Ti Laptop GPU",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3070 Laptop GPU",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3060 Ti",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 3060",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4090",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4090 Laptop GPU",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4080",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4080 Laptop GPU",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4070 Ti",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4070",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4070 Laptop GPU",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4060 Ti",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA GeForce RTX 4060",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA TITAN X (Pascal)",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA TITAN Xp",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA TITAN Xp COLLECTORS EDITION",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "NVIDIA TITAN V",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA TITAN RTX",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "NVIDIA Quadro RTX 8000",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA Quadro RTX A6000",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA Quadro RTX 6000",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "NVIDIA Quadro RTX 5000",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "AMD Radeon RX 7900 XTX",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "AMD Radeon RX 7900 XT",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "AMD Radeon RX 7600",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "AMD Radeon RX 6650 XT",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "AMD Radeon RX 6700",
        "defaultQuality" : "medium"
    },
    {
        "gpuId" : "AMD Radeon RX 6750 XT",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "AMD Radeon RX 6800",
        "defaultQuality" : "high"
    },
    {
        "gpuId" : "AMD Radeon RX 6950 XT",
        "defaultQuality" : "ultra"
    },
    {
        "gpuId" : "AMD Radeon RX 6900 XT",
        "defaultQuality" : "ultra"
    }
]
}
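If you're wondering how that list gets applied, here's a minimal sketch in Python. It's an assumption about the mechanism, not Bethesda's actual code: match the reported adapter name against gpuId and fall back to a conservative preset when the GPU isn't listed (the "low" fallback is also an assumption).

Code:
import json

# Hypothetical helper: look up the default quality preset for a reported GPU name.
# The "low" fallback for unlisted GPUs is an assumption, not confirmed behavior.
def default_quality(gpu_name, path="DefaultGlobalGraphicsSettings.json"):
    with open(path, "r", encoding="utf-8") as f:
        table = json.load(f)["GPUs"]
    for entry in table:
        if entry["gpuId"].lower() == gpu_name.lower():
            return entry["defaultQuality"]
    return "low"

print(default_quality("NVIDIA GeForce RTX 2080"))  # prints "high" per the list above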
 
Last edited:
Thanks a lot Jarred for this set of tests!

If I may: would you please test CPU core scaling and memory scaling (latency and bandwidth) for Starfield? I have a suspicion you may find interesting data in there to share.

Also, could you take a special look at the Nvidia CPU driver overhead vs. AMD's? I have a suspicion some of the AMD favouring comes from that. I'd love to see if that hunch is on the money or not :D

Regards.
 
  • Like
Reactions: Sluggotg
Thanks a lot Jarred for this set of tests!

If I may: would you please test CPU core scaling and memory scaling (latency and bandwidth) for Starfield? I have a suspicion you may find interesting data in there to share.

Also, could you take a special look at the Nvidia CPU driver overhead vs. AMD's? I have a suspicion some of the AMD favouring comes from that. I'd love to see if that hunch is on the money or not :D

Regards.
The problem is that I only have a limited number of CPUs on hand to test with. I could "simulate" other chips by disabling cores, but that's not always an accurate representation because of differences in cache sizes. Here's what I could theoretically test:

Core i7-4770K
Core i9-10980XE (LOL)
Core i9-9900K
Core i9-12900K
Ryzen 9 7950X

I might have some other chips around that I've forgotten about, but if so I don't know if I have the appropriate platforms for them. The 9900K is probably a reasonable stand-in for modern midrange chips, and I might look at doing that one, but otherwise the only really interesting data point I can provide would be the 7950X.

You should harass @PaulAlcorn and tell him to run some benchmarks. 🙃
 

steve15180

Distinguished
Dec 31, 2007
39
24
18,535
I find this article interesting for a bit of a different reason. I think the common belief is that Microsoft has written DX12 (or 11, etc.) as a universal abstraction layer for any GPU to use. As was pointed out, Nvidia has the lion's share of the market. So my question is: is DX12 written so generically that neither card would work well without modifications on the part of the game developer? Since AMD is a game sponsor and assisted with development, it's only fair that they work with AMD to ensure their cards run well. Nvidia does the same with their sponsored titles.

Here's what I'd really like to know: do we really know whose camp has the better card from an unbiased point of view? One of the complaints I read was that the developer optimized for 3 to 6% of the market. Well, they helped the developers through an agreement, just like the Nvidia programs of the same type. So when we benchmark for which card is best, is one truly better than the other? Outside of "The Way It's Meant to Be Played" and AMD's program, do developers develop mostly on Nvidia hardware because that's what's more popular? Do games wind up running better on Nvidia hardware because it's used more often for development? If both companies were to put the same effort into both cards to maximize performance, how would that look in a benchmark? (Also excluding any deliberate programming that hinders one side vs. the other.)

I don't know the answers, but it would be interesting to find out.
 
The problem is that I only have a limited number of CPUs on hand to test with. I could "simulate" other chips by disabling cores, but that's not always an accurate representation because of differences in cache sizes. Here's what I could theoretically test:

Core i7-4770K
Core i9-10980XE (LOL)
Core i9-9900K
Core i9-12900K
Ryzen 9 7950X

I might have some other chips around that I've forgotten about, but if so I don't know if I have the appropriate platforms for them. The 9900K is probably a reasonable stand-in for modern midrange chips, and I might look at doing that one, but otherwise the only really interesting data point I can provide would be the 7950X.

You should harass @PaulAlcorn and tell him to run some benchmarks. 🙃
If I had to pick from that list, I would love to see how the 12900K behaves with different RAM. Please do consider it? :D

I hope Paul humours our data hunger and gives us some of his time for testing the CPU front in a more complete way.

Whatever comes out of this, I'm thankful for the data thus far.

Regards.
 

mhmarefat

Distinguished
Jun 9, 2013
66
76
18,610
I find this article interesting for a bit of a different reason. I think the common belief is that Microsoft has written DX12 (or 11, etc.) as a universal abstraction layer for any GPU to use. As was pointed out, Nvidia has the lion's share of the market. So my question is: is DX12 written so generically that neither card would work well without modifications on the part of the game developer? Since AMD is a game sponsor and assisted with development, it's only fair that they work with AMD to ensure their cards run well. Nvidia does the same with their sponsored titles.

Here's what I'd really like to know: do we really know whose camp has the better card from an unbiased point of view? One of the complaints I read was that the developer optimized for 3 to 6% of the market. Well, they helped the developers through an agreement, just like the Nvidia programs of the same type. So when we benchmark for which card is best, is one truly better than the other? Outside of "The Way It's Meant to Be Played" and AMD's program, do developers develop mostly on Nvidia hardware because that's what's more popular? Do games wind up running better on Nvidia hardware because it's used more often for development? If both companies were to put the same effort into both cards to maximize performance, how would that look in a benchmark? (Also excluding any deliberate programming that hinders one side vs. the other.)

I don't know the answers, but it would be interesting to find out.
DX12 is not universal; it is for Windows only. It is (or used to be! Welcome to the 2023 gaming industry, where games are released half-finished with ZERO optimization unless you pay the optimization tax) the game developer's job to make sure their game runs well on the majority of GPUs. It is true that AMD sponsored this game, but it is nowhere near "fair" that it should run completely BROKEN on other GPU brands. It is truly shameful. We do not see Nvidia-sponsored games run like trash on AMD GPUs, do we? What makes sense is that it should run well on all GPUs, but better on the sponsored brand. The current state is a greed-fest of predatory behavior.
Developers mostly optimize for AMD (RDNA 2) because consoles are RDNA 2 based too.
 

Dr3ams

Reputable
Sep 29, 2021
251
270
5,060
DX12 is not universal; it is for Windows only. It is (or used to be! Welcome to the 2023 gaming industry, where games are released half-finished with ZERO optimization unless you pay the optimization tax) the game developer's job to make sure their game runs well on the majority of GPUs. It is true that AMD sponsored this game, but it is nowhere near "fair" that it should run completely BROKEN on other GPU brands. It is truly shameful. We do not see Nvidia-sponsored games run like trash on AMD GPUs, do we? What makes sense is that it should run well on all GPUs, but better on the sponsored brand. The current state is a greed-fest of predatory behavior.
Developers mostly optimize for AMD (RDNA 2) because consoles are RDNA 2 based too.
FSR will run on Nvidia GPUs. And Starfield doesn't run "completely broken" on non-AMD GPUs, with the exception of Intel. But Intel will be releasing a patch for that.
 
I find this article interesting for a bit of a different reason. I think the common belief is that Microsoft has written DX12 (or 11, etc.) as a universal abstraction layer for any GPU to use. As was pointed out, Nvidia has the lion's share of the market. So my question is: is DX12 written so generically that neither card would work well without modifications on the part of the game developer? Since AMD is a game sponsor and assisted with development, it's only fair that they work with AMD to ensure their cards run well. Nvidia does the same with their sponsored titles.
DX12 and Vulkan put the onus for optimizations far more on the game developers than on the drivers. It's possible to do some things with drivers to help, but it's not nearly so easy (or hard) as with DX11. I've written about this in the past, I'm sure, but basically low-level APIs really open the door for vendor-specific optimizations.

Now, AMD's architectures tend to be a bit more... "universal?" I'm not sure what the best way is to describe that, but my understanding is that the hardware features match up with DX12 features better than Nvidia's do. So to get Nvidia hardware to work well with DX12, the developers need to put in more work to write code for Pascal, Turing, and Ada/Ampere. There are enough differences that the GTX cards definitely don't work as well with the same DX12 code as AMD GPUs, or other Nvidia GPUs (though GTX is a dying breed, so it's less of a factor now). But more critically, DX12 code written to run well on AMD rarely extracts maximum performance from Nvidia's RTX cards.

This is why a lot of games (Microsoft Flight Simulator is an example) still list DX12 support as beta. Some games never transition their DX12 support from beta to full release. But for better or worse, Starfield uses DX12, and like some other AMD-promoted games, it seems like all the hand-tuning went into code that works well on AMD GPUs (including the consoles), and everything else just has to make do with what it gets.

If you want an example of the reverse, I can give that as well. Total War: Warhammer 3 only uses DX11 (the previous Warhammer 2 had beta DX12 that never left beta, and was an AMD-promoted game). Anyway, whatever it is about the code, it runs much better on Nvidia GPUs relative to their AMD counterparts. Ray tracing games can look this way as well, but mostly that's because AMD's GPUs are generally lacking in RT hardware performance. RT games where AMD is closer to "parity" with Nvidia are often games where the RT effects aren't super pronounced.

So it goes both ways for sure, and there have been other cross-platform games promoted by AMD that didn't do so hot on Nvidia hardware until further down the road. (Hello, The Last of Us, Star Wars Jedi Survivor, Resident Evil remakes, etc.)
 

RedBear87

Commendable
Dec 1, 2021
150
114
1,760
AMD holds 3-6% of the PC GPU market compared to Nvidia? Smart move.

/s
This is a complete disgrace for both Bethesda and AMD. I hope all the money they agreed on was worth it (spoiler: it won't be).
You have to consider that M$ is probably more interested in the game's exclusivity for the console market; PS5 players are going to skip this one. Since the consoles run AMD hardware, it makes sense that it was optimised for it.

It is a disgrace for sure, though.
 



Yep, looks like I was right not to get excited about Starfield; it's unplayable on my 2070 Super even at 1920x1080, much less at 4K.
 

rluker5

Distinguished
Jun 23, 2014
901
574
19,760
I'm sure the Nvidia performance will improve with driver updates. No doubt Arc's will too. The game isn't even out yet for most people.

I'm cautiously optimistic that my mostly-leftover-parts build plus a 6800 in my office will be able to hold mostly 60 fps at 4K medium with 67% FSR. It also has a 5775C and 16GB of 2400 C10 DDR3. I'll find out when I get home from work on Tuesday.
 
  • Like
Reactions: JarredWaltonGPU
I'm sure the Nvidia performance will improve with driver updates. No doubt Arc's will too. The game isn't even out yet for most people.

I'm cautiously optimistic that my mostly-leftover-parts build plus a 6800 in my office will be able to hold mostly 60 fps at 4K medium with 67% FSR. It also has a 5775C and 16GB of 2400 C10 DDR3. I'll find out when I get home from work on Tuesday.
An i7-5775C? Wow, that could be interesting. If the game really does need 6-core to run well, all the extra cache probably won't matter, but I think there's a reasonable chance that old Broadwell could still do okay. The 6800 will be fine for sure, at least.
 
Don't mean to be picky but...no 3080 but did test a 2080? What gives on that one?
RTX 2080 is listed as the "recommended" GPU by Bethesda. Based on the 3050, you can see that performance between different GPU architectures (Turing, Ampere, Ada) isn't massively different. Which means the RTX 3080 will probably land at the same level as the RTX 4070.

Given the poor performance of Nvidia GPUs right now, I held off on running a bunch more tests. I really think there will be a patch or driver update that increases performance at least 15% within the next month or so.
 
  • Like
Reactions: JTWrenn
RTX 2080 is listed as the "recommended" GPU by Bethesda. Based on the 3050, you can see that performance between different GPU architectures (Turing, Ampere, Ada) isn't massively different. Which means the RTX 3080 will probably land at the same level as the RTX 4070.

Given the poor performance of Nvidia GPUs right now, I held off on running a bunch more tests. I really think there will be a patch or driver update that increases performance at least 15% within the next month or so.
Any chance we get a CPU scaling benchmark from you guys?
 

rluker5

Distinguished
Jun 23, 2014
901
574
19,760
An i7-5775C? Wow, that could be interesting. If the game really does need 6-core to run well, all the extra cache probably won't matter, but I think there's a reasonable chance that old Broadwell could still do okay. The 6800 will be fine for sure, at least.
I'm hoping the memory bandwidth will give it enough; it isn't strong on compute. 60 fps is all I'm looking for.
It does have a conservative OC, fast RAM for its time, and AMD GPU scheduler assistance, so it should do close to as well as a 5775C reasonably can.
 

kiniku

Distinguished
Mar 27, 2009
253
74
18,860
You have to consider that M$ is probably more interested in the game's exclusivity for the console market; PS5 players are going to skip this one. Since the consoles run AMD hardware, it makes sense that it was optimised for it.

It is a disgrace for sure, though.
I'll say this: I'm happy that AMD GPUs get some launch love. But why not optimize for both? Why effectively penalize the other hardware that a sizable majority owns? We probably know the answer to that. If Nvidia's DLSS were natively supported, the Nvidia hardware would easily surpass the Radeon performance. And it's obvious AMD has some financial sway in this game's development costs.

I remember about 9 months ago the Starfield dev streams actually taking credit, saying something like, "Even though the game looks great now and could launch today, we are going to delay it just to ensure it's the fully polished experience we are striving for." So with that reassurance I pre-ordered it. 7 months later, when I learned it was going to only support AMD FSR and be on Game Pass at launch, I hastily, and successfully, canceled that purchase and got a refund. And the mainstream review houses are confirming my suspicions. So far, it's "good" but not a "great" game.
 
Last edited:

Dr3ams

Reputable
Sep 29, 2021
251
270
5,060
And the mainstream review houses are confirming my suspicions. So far, it's "good" but not a "great" game.
I've watched a number of those and found that most are subjective rather than objective. A good review provides facts about the game's real issues. Stating that they didn't like something and then telling readers or viewers they shouldn't play it for those reasons is just bad reporting.