Starfield PC Performance: How Much GPU Do You Need?

Nvidia stans losing it because for once a popular release ain't kind to their GPUs. :LOL:
Honestly, there have been very few AMD- or Nvidia-sponsored games in the past couple of years that have truly been good. Baldur's Gate 3 is possibly the best game in years, and it's not a sponsored game for either company. Meanwhile...

Redfall (Nvidia) was junk
Dead Island 2 (AMD) wasn't great
Star Wars Jedi Survivor (AMD) was a decent game but with many launch issues
The Last of Us Part 1 (AMD) had issues at launch and still takes upward of 15 minutes to compile shaders when you first run it on AMD cards. Ironically, it takes about half as long to compile the shaders on Nvidia GPUs.
Starfield (AMD) so far looks okay, not amazing

I mean, what was the last truly great game that was sponsored by a GPU vendor? Cyberpunk 2077 is perhaps as close as we got; it was heavily Nvidia-promoted, but even then it wasn't awesome.
 

Gillerer

Distinguished
Sep 23, 2013
366
86
18,940
I've watched a number of those and found that most are subjective, not objective. A good review provides facts about the game's real issues. Stating that they didn't like something and then telling readers or viewers they shouldn't play it for those reasons is just bad reporting.
There is no such thing as an "objective" review of a piece of media or culture. A review will always be affected by the tastes and prior experiences of whoever is writing it.

In fact, the whole point of a review is to give well-thought-out and nuanced analysis and opinion.
 

Dr3ams

Reputable
Sep 29, 2021
251
270
5,060
There is no such thing as an "objective" review of a piece of media or culture. A review will always be affected by the tastes and prior experiences of whoever is writing it.

In fact, the whole point of a review is to give well-thought-out and nuanced analysis and opinion.
What? If a reviewer is benchmarking hardware while playing a certain title and then posts a review of the game based on the benchmark, it should not be subjective.
 

Elusive Ruse

Estimable
Nov 17, 2022
452
586
3,220
Honestly, there have been very few AMD- or Nvidia-sponsored games in the past couple of years that have truly been good. Baldur's Gate 3 is possibly the best game in years, and it's not a sponsored game for either company. Meanwhile...

Redfall (Nvidia) was junk
Dead Island 2 (AMD) wasn't great
Star Wars Jedi Survivor (AMD) was a decent game but with many launch issues
The Last of Us Part 1 (AMD) had issues at launch and still takes upward of 15 minutes to compile shaders when you first run it on AMD cards. Ironically, it takes about half as long to compile the shaders on Nvidia GPUs.
Starfield (AMD) so far looks okay, not amazing

I mean, what was the last truly great game that was sponsored by a GPU vendor? Cyberpunk 2077 is perhaps as close as we got; it was heavily Nvidia-promoted, but even then it wasn't awesome.
You are right about Baldur's Gate 3. I just finished my second playthrough and have poured 300 hours into it.
The point about sponsorship ruining games is also true; I would say it's almost as bad as adding microtransactions to cover the cost of game development or make a buck.
 

Dr3ams

Reputable
Sep 29, 2021
251
270
5,060
Honestly, there have been very few AMD- or Nvidia-sponsored games in the past couple of years that have truly been good. Baldur's Gate 3 is possibly the best game in years, and it's not a sponsored game for either company. Meanwhile...

Redfall (Nvidia) was junk
Dead Island 2 (AMD) wasn't great
Star Wars Jedi Survivor (AMD) was a decent game but with many launch issues
The Last of Us Part 1 (AMD) had issues at launch and still takes upward of 15 minutes to compile shaders when you first run it on AMD cards. Ironically, it takes about half as long to compile the shaders on Nvidia GPUs.
Starfield (AMD) so far looks okay, not amazing

I mean, what was the last truly great game that was sponsored by a GPU vendor? Cyberpunk 2077 is perhaps as close as we got; it was heavily Nvidia-promoted, but even then it wasn't awesome.
The AMD-sponsored game I'm still playing is The Division 2. In my opinion, a very good game.
 

RedBear87

Commendable
Dec 1, 2021
150
114
1,760
I'll say this: I'm happy that AMD GPUs get some launch love. But why not optimize for both?
Time and money are finite resources in the end; it's not hard to make a few guesses as to why they could not, or did not want to, optimise the game for both Nvidia and AMD (and poor Intel). Still, we should keep things in perspective at the moment: Nvidia hasn't even released Game Ready drivers, and while it IS weird that, for once, Nvidia is lagging behind AMD in releasing optimised drivers, we should wait a few days before making the final call on this game.
 
Time and money are finite resources in the end; it's not hard to make a few guesses as to why they could not, or did not want to, optimise the game for both Nvidia and AMD (and poor Intel). Still, we should keep things in perspective at the moment: Nvidia hasn't even released Game Ready drivers, and while it IS weird that, for once, Nvidia is lagging behind AMD in releasing optimised drivers, we should wait a few days before making the final call on this game.
As noted in the article, the Nvidia 537.13 drivers are supposed to be Game Ready for Starfield.
 
  • Like
Reactions: RedBear87

Gillerer

Distinguished
Sep 23, 2013
366
86
18,940
What? If a reviewer is benchmarking hardware while playing a certain title and then posts a review of the game based on the benchmark, it should not be subjective.
Game reviews are different from hardware reviews in that almost everything is subjective. (And it's not a "game review" if all they do is benchmark hardware. Words have meaning.)

For a game review, even if you have hard results in the form of benchmarking data, their importance for the final verdict depends on various factors that need to be assessed on a case-by-case basis.

After all, if you're dealing with a slow-paced RPG with story and world-building as its focus, anything beyond "good enough" performance isn't nearly as important as the actual content of the game.
 
The problem is that I only have a limited number of CPUs on hand to test with. I could "simulate" other chips by disabling cores, but that's not always an accurate representation because of differences in cache sizes. Here's what I could theoretically test:

Core i7-4770K
Core i9-10980XE (LOL)
Core i9-9900K
Core i9-12900K
Ryzen 9 7950X

I might have some other chips around that I've forgotten about, but if so I don't know if I have the appropriate platforms for them. The 9900K is probably a reasonable stand-in for modern midrange chips, and I might look at doing that one, but otherwise the only really interesting data point I can provide would be the 7950X.

You should harass @PaulAlcorn and tell him to run some benchmarks. 🙃
They did test some CPUs here; Ryzen is slacking a bit.
[Image: PCGH chart, "Starfield: 24 CPU benchmarks - which processor is enough?"]
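On the "simulate other chips by disabling cores" idea Jarred mentioned above: a minimal sketch, assuming Python with the psutil package and assuming the process is named Starfield.exe, that pins the game to a subset of logical cores rather than truly disabling them (the cache-size caveat still applies):

    import psutil

    # Pin the game to the first 8 logical cores to roughly approximate a
    # smaller CPU. Cache size, boost clocks, and core topology still differ,
    # which is exactly the caveat raised above.
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == "Starfield.exe":  # assumed process name
            proc.cpu_affinity(list(range(8)))
            print(f"Pinned PID {proc.pid} to cores 0-7")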

 

Sleepy_Hollowed

Distinguished
Jan 1, 2017
535
228
19,270
Seems like pretty standard performance for a AAA game release, to be honest, with the exception that this time around AMD has better drivers/optimizations on the 1080p side of things.

Kind of cements my 1080p-forever rule, since developers are banking on niche things like upscaling to get even passable performance at anything higher.
 

sfjuocekr

Prominent
Jan 19, 2023
25
7
535
The last thing we need is yet more evangelizing of upscaling techniques.

You want to disable upscaling, and you want to disable VRS as well.

But I guess for the zoomers these days, watching streams the entire time, looking at macro-blocked images is "natural"?

I also don't understand people who play at 1080p on a 1440p monitor; you cannot integer-scale 1080p up to 1440p... it will look blurry! Even 720p would look better at that point.
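(A quick sanity check of those scale factors, sketched in Python as a hypothetical illustration:)

    # 1440p is not an integer multiple of 1080p, so the scaler has to
    # filter (blur); 720p maps cleanly onto 2x2 pixel blocks.
    for src in (1080, 720):
        factor = 1440 / src
        kind = "integer, sharp" if factor.is_integer() else "fractional, forced to blur"
        print(f"{src}p -> 1440p is {factor:.3f}x ({kind})")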
 
  • Like
Reactions: vinay2070

rluker5

Distinguished
Jun 23, 2014
901
574
19,760
The last thing we need is yet more evangelizing of upscaling techniques.

You want to disable upscaling, and you want to disable VRS as well.

But I guess for the zoomers these days, watching streams the entire time, looking at macro-blocked images is "natural"?

I also don't understand people who play at 1080p on a 1440p monitor; you cannot integer-scale 1080p up to 1440p... it will look blurry! Even 720p would look better at that point.
You probably haven't heard and don't want to know, but the Alchemist series is pretty good at non-integer upscaling. Unfortunately, you have to install the Intel Graphics Command Center Beta from the Windows store to use it.

You can make up a .95x resolution (in the case of 4K that would be 3648x2052), or .90x, .85x, .67x, whatever, and if it is close, like .85x, it will look mostly like native and not be blurry. I think I had to run .85x at 4K with XeSS Quality to get my A750 to run CP2077 at 60 fps. It looked pretty good.
And let me tell you, if you were to try native 4K with that card in CP2077, you would be sorely disappointed.
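A minimal sketch of the arithmetic behind those custom factors, assuming you round each axis to an even pixel count as drivers generally expect:

    # Derive custom render resolutions as a fraction of 4K (3840x2160).
    BASE_W, BASE_H = 3840, 2160

    def scaled(factor: float) -> tuple[int, int]:
        # Round each axis to the nearest even number of pixels.
        return (round(BASE_W * factor / 2) * 2,
                round(BASE_H * factor / 2) * 2)

    for f in (0.95, 0.90, 0.85, 0.67):
        w, h = scaled(f)
        print(f"{f:.2f}x -> {w}x{h}")
    # 0.95x -> 3648x2052, matching the example above.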

We are in a time when the old rules are becoming obsolete. What matters most is whether the image on the screen looks good. How it was made isn't as important.

Starfield is just the next game to show this, where modded-in DLSS beats both native and FSR2 in image quality. XeSS likely will as well once Intel gets the game driver out. But I'm playing through Game Pass, so I don't get to mod :(
 

vinay2070

Distinguished
Nov 27, 2011
294
85
18,870
Honestly, there have been very few AMD- or Nvidia-sponsored games in the past couple of years that have truly been good. Baldur's Gate 3 is possibly the best game in years, and it's not a sponsored game for either company. Meanwhile...

Redfall (Nvidia) was junk
Dead Island 2 (AMD) wasn't great
Star Wars Jedi Survivor (AMD) was a decent game but with many launch issues
The Last of Us Part 1 (AMD) had issues at launch and still takes upward of 15 minutes to compile shaders when you first run it on AMD cards. Ironically, it takes about half as long to compile the shaders on Nvidia GPUs.
Starfield (AMD) so far looks okay, not amazing

I mean, what was the last truly great game that was sponsored by a GPU vendor? Cyberpunk 2077 is perhaps as close as we got; it was heavily Nvidia-promoted, but even then it wasn't awesome.
Hey Jarred. Do you get to play the games fully? I kinda assume you are too busy to finish games?

I dunno if I will even give this game a try. The graphics look so meh! Now had they used UE5 and made such a game, damn, that would be a killer.
 

vinay2070

Distinguished
Nov 27, 2011
294
85
18,870
The last thing we need is yet more evangelizing of upscaling techniques.

You want to disable upscaling, and you want to disable VRS as well.

But I guess for the zoomers these days, watching streams the entire time, looking at macro-blocked images is "natural"?

I also don't understand people who play at 1080p on a 1440p monitor; you cannot integer-scale 1080p up to 1440p... it will look blurry! Even 720p would look better at that point.
I agree with you. But there are plenty of benchmarks out there, including DLSS ones. I generally watch a few of them and come to my own conclusion. Every benchmark contributes something to that conclusion.
 
The game mostly runs fine, but the graphics are very hit and miss, though this doesn't particularly bother me. I have been enjoying myself for the most part, which I can't say has been the case for any of the prior Bethesda games I've tried. I found that dropping the sharpening down minimized a lot of the obvious visual issues with FSR2, but that may be subjective. I would really prefer games not rely on upscaling as a crutch, but it seems like it's here to stay.

As I have written in other threads, my mid-range specs (see my signature) run Starfield on ultra at 1440p smoothly. No glitches or stuttering whatsoever. I did have to dial the Render Resolution Scale down to high (which is 62%) because of some shadow issues. In New Atlantis, wandering outdoors, the FPS averaged between 45 and 50. The FPS is higher when you enter a building that requires a load screen. I have tested the game in cities, combat, and space combat... again, no glitches or stuttering.

There are some clipping issues that should be addressed by Bethesda. I also have some observations about things that should be changed in future updates.

  1. Third-person character movement looks dated. I play a lot of Division 2 (released 2019), and its third-person character movement is very good.
  2. Taking cover behind objects in Starfield is poorly thought out.
  3. Ship cargo/storage needs to be reworked. In Fallout 4, you can store stuff in just about every container that belongs to you. Inside the starter ship, the Frontier, none of the storage spaces can be opened. The only way to store stuff is by accessing the ship's cargo hold through a menu. No visual interaction whatsoever. I haven't built my own ship yet, so maybe this will change later.
  4. Ship combat needs some tweaking. It would be nice to have a companion fly the ship while the player mounts a side gun.
  5. Movement. The same issue that Fallout has is showing up in Starfield. Walking speed is OK, but running at both of the speeds provided is way too fast. This is especially frustrating when trying to collect loot without slow-walking through a large area. There should be a movement speed adjustment in the settings. The only way I know to adjust this is through console commands (see the sketch below).
I do love this game though and will be spending a lot of hours playing it. :p
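On the console-command route mentioned in point 5: a hedged sketch, assuming Starfield keeps the actor-value syntax of earlier Creation Engine games (Skyrim, Fallout 4); the 70 is an illustrative value, and opening the console disables achievements for that session:

    player.setav speedmult 70    ; run at 70% of default movement speed (illustrative value)
    player.setav speedmult 100   ; restore the default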
New Atlantis is pretty brutal on the CPU; I noticed I'm CPU-bound there, but haven't noticed it anywhere else.

The ship cargo hold does have direct access in the cockpit, opposite the captain's stash up on the wall. I couldn't figure it out for the longest time, until I randomly ran by and saw items in a list.
 
  • Like
Reactions: Dr3ams
A DLSS mod for Nvidia cards is already available on NexusMods. I gather that performance is fine on most popular Nvidia GPUs, even a lowly RTX 2060 6GB like I have, with just a bit of tweaking (a mix of medium and high settings). This article has made the Nvidia situation seem far worse than it is in reality, imo.
 

Deleted member 2950210

Guest
Well, I guess AMD finally found the only way to surpass the 4090's performance. And its name is Starfield. Better enjoy it while they still can, 'cause those kinds of impressions tend to wane quickly.
 
  • Like
Reactions: Tom Sunday
Sep 3, 2023
1
0
10
1920x1080, FSR2, medium, scaling 100%, 1070 Ti.

Change it to 1920x1080, CAS, medium, scaling 100%. I have this running with a Gigabyte Super-Overclocked GTX 1070 8GB; it sits at a variable fps from 30 to 50 depending on how busy the screen is, with most areas around 40 fps.
My other hardware helps with the fps balance as well: an AMD 4GHz true 8-core with multithreading turned off via the BIOS, 24GB of gaming RAM, a 1TB 990 Pro M.2 SSD (7GB/s read speed), and a Sound Blaster Z Pro.
 
Sep 3, 2023
1
1
10
> For native 4K ultra, only the two fastest GPUs from AMD and Nvidia can break 60 fps — and for the first time, Nvidia's top GPU takes the lead. That's not a great showing for Starfield, considering that the 4090 is nearly 30% faster than the 7900 XT at 4K ultra in our standard benchmarks.

69.6/53.9 = 1.29... so... not really out of the ordinary?
 
  • Like
Reactions: Tom Sunday