Starfield PC Performance: How Much GPU Do You Need?

Page 3 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.

NeoMorpheus

Reputable
Jun 8, 2021
I really don’t understand how we suddenly continue to defend this company, which via lock-in tech has achieved a monopoly and is now abusing it with beauties like $1,200 for a 4080, or clearly shafting its own customers by having DLSS 3 ignore the 30 series.

I mean, just look at all these articles showing outrage because their beloved DLSS is not there, but you won’t hear one peep when FSR is not there.

Worse, you get “subtle” things like this:

“...developers to ‘prioritize’ AMD technologies. The good news is that FSR 2 works on every GPU; the bad news is that it might not look quite as good as DLSS or XeSS.”

A consumer will immediately ignore that one qualifier and go with the now-typical “FSR is trash.”

Or this other beauty:

That's pretty common for games that are promoted by AMD or Nvidia, but the discrepancies in performance between GPUs certainly warrant further scrutiny.

Boo-hoo, my beloved and free 4090 isn’t worshipped like I worship it?

Again, where’s the outrage when it’s the other way around?

And let me clarify one thing: personally, I hate what DLSS and FSR “do,” since they are a way for developers to ship unoptimized games that would otherwise run like garbage.

But if I have to choose one, it would be the one that works on all GPUs and brands, not the one that locks me into a single company, limiting my options and, in this case, forcing me into a consumer-hostile, open-standards-hostile corporation.

Yet somehow, you don’t hear a single reviewer, YouTube influencer, or anyone else state or mention that.
 
Sadly, the PC gaming community is full of children, hence the knee-jerk reactions and the fanboying. For team red, waiting for post-launch optimization is a normal thing. The green side is used to day-one polish, so when it doesn't happen for one game, the sky is falling, and we see what is going on right now on the internet: people crying about the Fallout.

Now it's "AMD is paying them not to allow DLSS," based on feelings and zero proof.

Games from this studio are always a mess at launch, but all of that from previous games is suddenly forgotten, and now it's "AMD = bad."
 

sherhi

Distinguished
Apr 17, 2015
I am getting strong "oh no poor monopoly Nvidia guy has to wait a week or two once in a blue moon" vibes.

Anyway, I built a 7800X3D with a 7900 XTX for a friend recently, and of course the game runs well on that rig. For my ancient 4790K I bought a 6700 XT along with the Premium Edition, and at 1440p on the high preset it's around 35-45 fps in New Atlantis (playable just fine; no stuttering so far) and 60+ almost everywhere else. HUB made a video about more optimal manual settings, which I have not tried because, well, I don't have a problem. The GPU is at 98-99% all the time, and the CPU fluctuates between 50 and 95%, though that's rare; loading scenes really use all the CPU power that is available.

I am not super graphically wowed by this game. It looks better on a 4K TV with a Series X, and even high 1440p on PC is okay, but it certainly does not have any "next-gen graphics" feel, and I don't understand the 30 fps lock on Xbox; the game does not look amazing enough to warrant such poor hardware performance, but that's the current state of new games, I guess.
On the other hand, with a regular SATA SSD, a 4790K (a 9-year-old CPU), 16 GB of DDR3 (2,100 MHz or so), and a 6700 XT (basically a 2.5-year-old GPU), the game runs smoothly: no crashes, no stutters, no glitches that I've noticed so far. For this alone I think they have done a pretty good job, and there is a high chance modders will improve performance greatly, just like they did with Skyrim.
 
Hey Jarred. Do you get to play the games fully? I kinda assume you are too busy to finish games?

I dunno if I will even give this game a try. The graphics look so meh! Now had they used UE5 and made such a game, damn, that would be a killer.
I typically finish a handful of games each year, but not necessarily the biggest games that I end up benchmarking to death. Basically, a game has to have a good story that I want to finish. Grind-heavy games are a non-starter for me. But off the top of my head, the last game I finished was... um... Crud. I can't think of anything right now. I know last year I finished Ascension.
A DLSS mod for Nvidia cards is already available on NexusMods. I gather that performance is fine on most popular Nvidia GPUs, even a lowly RTX 2060 6GB like I have, with just a bit of tweaking (a mix of medium and high). This article has made the Nvidia situation seem far worse than it is in reality, IMO.
You'll note that all of the testing here was done without any upscaling, except for the 4K chart that used 66% render resolution. So all of the differences in performance are going to be there with and without DLSS or FSR 2, and in fact that's what the 4K FSR 2 result shows as well.

Having "native FSR2" enabled does have a minor performance hit on all GPUs. It's a bit like DLSS, AFAICT, in that you get "better" AA and sharpening without upscaling. That's why I left it on. Disabling FSR2 and selecting CAS instead improved performance by around 10 to 15 percent, so that's one "big" option to gain a bit more performance.

Ultimately, though, benchmarks need to be apples-to-apples where possible. So using the same settings on all GPUs should result in similar relative performance. If we disable FSR2 and use CAS everywhere, all cards gain roughly the same amount of performance and the margins are still the same.
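As a quick aside on those numbers, here's a minimal sketch of what "66% render resolution" at 4K works out to. The helper function is my own illustration, not anything from the game or the article, and actual engines may round internal dimensions differently:

```python
# Render scale applies per axis, so the internal (pre-upscale) frame
# is the output resolution multiplied by the scale in each dimension.
# (Hypothetical helper for illustration only.)

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and render scale."""
    return round(out_w * scale), round(out_h * scale)

w, h = internal_resolution(3840, 2160, 0.66)
print(f"{w}x{h}")                      # 2534x1426, roughly 1440p-class
print(f"{w * h / (3840 * 2160):.0%}")  # ~44% of the native pixel count
```

In other words, the "4K upscaled" chart is really rendering fewer than half the pixels of native 4K before FSR 2 reconstructs the output frame, which is why it behaves so differently from the pure native-resolution charts.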
 
DX12 is not universal; it is for Windows only. It is (or used to be! Welcome to the 2023 gaming industry, where games are released half-finished with ZERO optimization unless you pay the optimization tax) the game developer's job to make sure their game runs well on the majority of GPUs. It is true that AMD sponsored this game, but it is nowhere near "fair" that it should run completely BROKEN on other GPU brands. It is truly shameful. We do not see Nvidia-sponsored games run like trash on AMD GPUs, do we? What makes sense is that it should run well on all GPUs but better on the sponsored brand. The current state is a greed-fest of predatory behavior.
Developers mostly optimize for AMD (RDNA 2) because consoles are RDNA 2 based too.
Devil's advocate: how many of Nvidia's "The Way It's Meant to Be Played" features intentionally gimped AMD cards? Answer: a lot more.
 
I'll say this: I'm happy that AMD GPUs get some launch love. But why not optimize for both? Why effectively penalize the hardware a sizable majority owns? We probably know the answer to that: if Nvidia/DLSS were natively supported, the Nvidia hardware would easily surpass the Radeon performance. And it's obvious AMD has some financial sway in this game's development costs.

I remember about 9 months ago the Starfield dev streams actually taking credit, saying something like, "Even though the game looks great now and could launch today, we are going to delay it just to ensure it's the fully polished experience we are striving for." With that reassurance, I pre-ordered it. Seven months later, when I learned it was only going to support AMD FSR and would be on Game Pass at launch, I hastily, and successfully, canceled that purchase and got a refund. And the mainstream review houses are confirming my suspicions: so far, it's a "good" but not a "great" game.
Happens all the time. Nvidia does it to AMD with Nvidia-sponsored titles.

At the end of the day, AMD paid the money and gave developer time; Nvidia did not. So whose fault is that, really?
 

aberkae

Distinguished
Oct 30, 2009
Yay, now do a CPU benchmark, obviously using the 7900 XTX to mitigate the GPU bottleneck 🤣. First-day benchmarks show the game is possibly heavily dependent on system memory bandwidth, with the 7800X3D, 7950X3D, 7700X, and 7950X losing significantly to the i9-13900K. AMD can't catch a break.
 
A view from the man on the street: in difficult times like these, who has $70 for a new game? My Steam wishlist has been sitting pretty for over a year now, with Dead Space, Resident Evil, and Cyberpunk waiting patiently to sell for $19.99 or less! For that matter, I never finished a game in my library except for Fallout 4, which I am still playing with 168 Nexus mods. Talk about bang for the buck! As I was pumping gas yesterday at the Shell on West Olympic Boulevard in Los Angeles, we all paid $7.49 a gallon for regular, and I wondered how many people are out there like me, standing at the pump and thinking about Starfield at full price. Of course, if I still lived with Mom rent-free, with free internet and unlimited refrigerator rights, I would pre-purchase Starfield, then hole up in my basement command center, drinking Mountain Dew and gorging myself on Mom's homemade apple pie!
It is your choice to live in LA and pay $7.49 a gallon. Don't pretend you have to be a basement dweller at Mom's house to buy a $70 game. I myself am in Huntington Beach and can easily afford a $70 game once a year. Fallout 4, huh? Personally, I can't keep myself from going back to Fallout: New Vegas with roughly the same number of mods.
 

NeoMorpheus

Reputable
Jun 8, 2021
Devil's advocate: how many of Nvidia's "The Way It's Meant to Be Played" features intentionally gimped AMD cards? Answer: a lot more.
The better question would be:

Where were all these "unbiased" reviewers, writers, YouTubers, white knights, modders, etc., when that happened?

Because back then I never saw actions or demands as stupid as these (demanding something that limits your options as a consumer).
 

froggx

Distinguished
Sep 6, 2017
i'm getting the impression that some people might be unfamiliar with the "bethesda launch experience," a week-long special event they run with the launch of every new game (participation mandatory). instead i'm seeing people act like this game was released by *pick literally any game dev except bethesda*.

it seems people are mistakenly believing they purchased a finished game, free of any glaring bugs or issues that break all sorts of gameplay, thoroughly vetted in an extensive QA process. try again. internal QA for bethesda starts and ends at "does the game crash enough to be annoying." instead, as part of the "bethesda launch experience," all players are given the choice to opt in to (opt-in is automatic and disregards free will) a special program which grants the privilege of QA testing this obviously incomplete product. previous vict-- players have likened this process to swallowing live fire ants while trapped in a sauna... in purgatory.

remember that extra $30 you paid for early access? turns out people who make that extra commitment are ready to strap in and ride out this cluster****. your nonconse----- voluntarily performed QA testing is aggregated to churn out a massive bulk of feedback. bethesda uses that data to compress the entirety of the QA process to less than a week, without paying real QA testers a dime (instead it pays the 50 people they hired to count all that starfield $$$$ coming in).

bethesda spends the duration of the launch experience pumping out patch after patch after patch. we're gonna have all our favorite patches back from previous launches too: 0-day, 2nd 0-day, memory leak, save data autodelete, hopelessly broken multithreading that just runs infinite loops to make CPU% look good, render pipeline with military-grade inefficiency, and so much more. keep in mind that the game engine has origins from at least 15 years ago, was originally single-threaded, and perpetually got extended, transcribed, hacked, cracked, sliced, refactored, renamed, rehashed, reflashed, and reheated into a mass of spaghetti code so tangled up with loops and gotos it never exits, written by people too lazy to do documentation and no longer available to consult. so it's basically a zombie, except harder to work with.

so everyone thought the biggest problem was an AMD bribery scandal to piss in Nvidia's cheerios? no, this kind of thing is expected for a bethesda game launch. it'd actually have been more worrisome if all the gpus had performed within expectations from the get-go. give it a week; bethesda needs about that long to get enough feedback from the "bethesda launch experience" to pump out patches for all the game-breakers. give it around a month and the mod scene will do most of the optimization fixes for free, then license them to bethesda so they can make more patches. wait 6 months and bethesda will finally manage to patch something you actually care about.
 

techman69

Distinguished
Jun 12, 2011
I just got Starfield 2 days ago. I have a slightly overclocked Nvidia 3070 and a 5950X with 32 GB, far exceeding the minimum requirements. I run Starfield in Ultra mode at 4K and have no problems at about 32 fps, as indicated on my G-Sync monitor. It runs butter smooth, in the city and in all ship fights. You don't need to spend money on a monster GPU, unlike what people around here say. Save your money. Speaking of money, I just got a computer survey from Steam and looked at the results. Most people still have 2060-to-2080-class GPUs, or even 580- or 590-class AMD GPUs. According to Steam, Nvidia's total market share is 75%, and AMD's is around 24%. Stop pushing these high-end GPUs, Tom, and focus on price/performance ratio. People barely have enough money to scrape up for gas these days, let alone the latest GPUs. Your site used to be about price/performance many years ago, not just price.
 
This comment section contains the tears of Nvidia fanboys :p

It's a GPU; you win in some games and lose in others. Nvidia has done its share of shady practices. Hello? Does anybody recall Nvidia GimpWorks? AMD tends to make its features open source; therefore, developers and competitors can use and take advantage of FSR and other features, whereas Nvidia has almost always kept its software features proprietary. Not sure why any PC folks would support proprietary practices. But again, the PC building community is now mainstream and ain't what it used to be.

Bethesda most likely optimized Starfield for AMD due to the Microsoft acquisition and the Xbox's AMD hardware. Given how demanding the title is, they likely had to do a ton of optimizing to get it to run on the AMD-based console.
 

HyperMatrix

Distinguished
May 23, 2015
That's an outrageous accusation. Evidence for your claim?

My apologies, you're right. The company didn't take money from AMD and then leave out Nvidia's DLSS, which took a single modder less than a day to release for the game. It was a decision made by Bethesda that was purely based on... I'm sure something good. Ignore the money they took from AMD. AMD is a great company that never does wrong and we should defend it. Believe all AMDs.