News Intel Drops Third 'Starfield' Driver as Bethesda Claims A770 Doesn't Meet Min Specs

Buy a crap GPU, get a crap experience.
More like "buy a crap game, get a crap experience". :D

Really, it should be up to game developers to make sure their games function correctly on all modern hardware, not the graphics card companies, who generally won't have access to a game until close to its release. For some small indie studio without access to a wide variety of hardware, I could see something like this happening, but not for one of the biggest-budget games ever made. With 500+ developers and a budget in the hundreds of millions of dollars, there is no good excuse for compatibility issues with a line of graphics cards that has been on the market for almost a year.

In general, Bethesda's games are known to be a buggy, broken mess on release, and the company tends never to bother fixing most of the issues, instead relying on modders to jerry-rig fixes together in the months and years following a game's launch. (Unless you're on a console or it's a live-service game, in which case you tend to be stuck with whatever they give you.) People have usually been surprisingly lenient toward the issues with Bethesda's games, though.

Even so, Starfield hasn't exactly reached the same level of reception as some of their prior big releases, probably down to more people starting to recognize the quality-control issues with their games after disasters like Fallout 76. So far it has only managed metascores in the mid-80s from professional reviewers, and it currently sits at a 5.5 user score on Metacritic and 80% on Steam, which is a bit low for a hyped AAA release with a massive budget. Looking through user reviews, there seems to be a lot of legitimate negative sentiment about the game. There's probably a decent gaming experience in there, but there's also little doubt that it has a fair number of issues as well.
 
Buy a crap GPU, get a crap experience.
That implies that the Starfield experience isn't crap on a good GPU, though, which is not necessarily the case. Your mileage may vary, obviously, but the game currently has a 5.5 user rating on Metacritic, and a lot of people seem annoyed with the lack of innovation on Bethesda's part.

I mean, "Skyrim in space" doesn't really sound like such a big selling point when you stop to realize that Skyrim, which was never a vehicle of technical or gameplay innovation to begin with, is already 12 years old. There are some things people expect today that weren't even a thing back then, and no, I do not necessarily mean microtransactions.
 
Pretty good thread today.

I'm not going to lie, Intel looks really bad with this. The rep might have been technically correct to tell them it's not supported, but they should've said "not officially certified" or something.

That being said, this is a far cry from Bethesda's Oblivion launch, which I got on DVD on launch day and played without any issues whatsoever.
 
Buy a crap GPU, get a crap experience.
That may or may not be true; I don't own an Intel card to say one way or the other. But when these cards are supposed to have performance that exceeds the min spec... I tend to lay at least some of the blame at the feet of Bethesda. Plus, as new as Intel is to the GPU market, they've exceeded my admittedly low expectations for them in the GPU space. The behavior of both Ngreedia and AMD should make us all want Intel to be more competitive in the GPU space; heck, I would love a fourth player in the space even more (no, Chinese maker Moore Threads doesn't count, yet)...
 
Buy a crap GPU, get a crap experience.
This right here is a prime example of a stupid comment.

Yes, Intel's GPUs aren't perfect, but they are far from crap (especially since they aren't priced with the usual greed).

And spec-wise, the card should be fine.

Driver issues are normal for every GPU vendor (yes, even Nvidia has them).

I'd blame the game for piss-poor optimization before calling any modern GPU "crap" (Starfield is EXTREMELY poorly optimized, which is a growing trend among game devs; the same thing happened with Remnant 2).
 
Buy a crap GPU, get a crap experience.

Yeah, that pretty much sums it up. It may look like an oversimplification, but, sadly, it's true.

Not denying that Bethesda also have their fair share of blame here, as they could've done a much better job optimising the game. For example, I'm still disappointed with them for excluding DLSS from Starfield.

However, I believe that even the worst of these first-day issues can be addressed fairly easily through subsequent updates. The weaknesses of Intel's GPUs, on the other hand, are highly unlikely to be fixed by a patch.

It's a fine company and, in time, I'm sure they'll improve. In fact, I really hope so, 'cause PC gamers could definitely benefit from some healthy competition in the GPU market.
 
More like "buy a crap game, get a crap experience". :D

Really, it should be up to game developers to make sure their games function correctly...

Did Bethesda even mention Intel GPUs on their compatibility list? No.

Intel tried to cut corners in so many ways (i.e. no native DX9 support) and is now paying the price by trying to fix EVERYTHING through driver updates (a 1.2GB driver - wtf)!

Seriously, Intel's discrete GPU division doesn't stand a chance of survival. They're desperately trying to fix things while GPU prices are back to normal and AI is the next step. They are behind, and so deep in the red (around $2.1B a year or so ago) that it is only a matter of time before the whole thing is disbanded.

P.S. Why should Bethesda be responsible for any hardware product? It's like claiming that if there are 10 LAN card or 5 audio card manufacturers, Bethesda should write their code around each vendor's drivers, rather than the manufacturers selling decent products that comply with universal standards.
 
Saying game studios should optimise for GPUs that are nowhere to be found in the Steam hardware survey (Steam being the main selling platform for the PC version), hidden somewhere in those 10% of "other" cards, is pretty far-fetched, and that's omitting the fact that those GPUs have been nothing but trouble since they were released. These things always come down to potential profit vs. the cost associated with it, and who would have thought it wasn't worth spending resources on "0.0-something" % of customers.
 
P.S. Why should Bethesda be responsible for any hardware product? It's like claiming that if there are 10 LAN card or 5 audio card manufacturers, Bethesda should write their code around each vendor's drivers, rather than the manufacturers selling decent products that comply with universal standards.

Saying game studios should optimise for GPUs that are nowhere to be found in the Steam hardware survey (Steam being the main selling platform for the PC version), hidden somewhere in those 10% of "other" cards, is pretty far-fetched, and that's omitting the fact that those GPUs have been nothing but trouble since they were released. These things always come down to potential profit vs. the cost associated with it, and who would have thought it wasn't worth spending resources on "0.0-something" % of customers.
By this logic, why should games be forced to run on AMD hardware? Their market share is too low to care, right?
 
By this logic, why should games be forced to run on AMD hardware? Their market share is too low to care, right?
Which games are forced to run on AMD HW? Is there some law forcing game studios to provide a 100% gaming experience for every HW manufacturer or something?

I have no idea what you are talking about. You do know this is primarily a console-exclusive (AMD HW) game with a PC port, right? And that it works on Nvidia and Intel (CPU) HW just fine, right? Once Nvidia's drivers catch up it will be the same. No wonder it runs better on AMD HW, lol; it was developed for that hardware, so a head start is only logical.
If Intel is fixing these issues via driver updates, then it's clearly a driver issue. I'm honestly curious how a game studio is supposed to fix driver-related issues in someone else's HW component.
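(For context, the usual game-side answer is not to fix the driver but to detect it and work around it. The rough C++/DXGI sketch below is purely illustrative and not anything Bethesda has published: a PC game can read the GPU's PCI vendor ID at startup and use it to show warnings, enforce a minimum driver version, or select a more conservative render path.)

```cpp
// Hypothetical example: identify the GPU vendor via DXGI so a game can
// warn the player, require a minimum driver version, or pick a safer
// render path. Not taken from Starfield; just the standard DXGI calls.
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1; // primary GPU

    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc);

    // Well-known PCI vendor IDs.
    const UINT kIntel = 0x8086, kNvidia = 0x10DE, kAmd = 0x1002;

    if (desc.VendorId == kIntel) {
        // This is where a studio could surface an "unsupported GPU" warning
        // or enable a vendor-specific workaround instead.
        std::printf("Intel GPU detected: %ls\n", desc.Description);
    } else if (desc.VendorId == kNvidia || desc.VendorId == kAmd) {
        std::printf("Nvidia/AMD GPU detected: %ls\n", desc.Description);
    } else {
        std::printf("Other GPU vendor (0x%04X)\n", desc.VendorId);
    }
    return 0;
}
```

(Anything deeper than that, like shaders being miscompiled inside the driver itself, really does have to be fixed on the vendor's side, which is why these fixes are shipping as Intel driver updates rather than game patches.)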
 
Intel tried to cut corners in so many ways (i.e. no native DX9 support)
Like they're the only ones to do so?
This is their first dGPU in how long?
And again, for its price it's fine.

It keeps up with AMD's and Nvidia's current gen more often than not.

And no, if you REALLY wanna see "cutting corners", look at Nvidia and the 4060's 128-bit bus and its reliance on DLSS 3's frame generation.
If you aren't in a DLSS 3 game, you're barely getting anything better than a 3060... and at times the 3060 is better.

And AMD is the poster child for how driver updates can be huge for GPU performance long term.

And then let's get to Bethesda: why does a game as limited as Starfield have such high requirements?
Because it's extremely poorly optimized.

The game's not S-tier visually.
The game's got more loading screens than some decade-old games (having to load your tiny spaceship, or load a city store that's a small room with one vendor when you've already loaded the entire city? like, wtf).
The game's got input delay just for looting items.

There is zero reason Intel's dGPU should not "meet min spec" when, spec-wise, it does meet it.

Not saying Intel is great, but it's far from bad (even the DX9 issue is mostly irrelevant now).
 
Saying game studios should optimise for GPUs that are nowhere to be found in the Steam hardware survey (Steam being the main selling platform for the PC version), hidden somewhere in those 10% of "other" cards, is pretty far-fetched, and that's omitting the fact that those GPUs have been nothing but trouble since they were released. These things always come down to potential profit vs. the cost associated with it, and who would have thought it wasn't worth spending resources on "0.0-something" % of customers.
I guess Bethesda shouldn't bother with supporting the RX 7000 series, then. Only the 7900XTX is on the Steam charts, and it only showed up a month ago with <0.25%.
 
GPU minimum requirements are generally not a question of "muscle" but of technology support.
The Intel Arc A770 supports Shader Model 5.1.
The Nvidia GTX 1070 Ti supports Shader Model 6.2.

Starfield seems to use a version above 5.1, and Intel is scrambling to add the functions Starfield uses so that its shaders work correctly.
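(For the curious, the kind of capability check being described looks roughly like the C++/Direct3D 12 sketch below. It's only an illustration of the standard CheckFeatureSupport query, not code from Starfield or Intel's driver: an engine asks the runtime for the highest shader model the installed driver exposes and can warn or bail out if it's below what its shaders were compiled for.)

```cpp
// Illustrative only: query the highest Shader Model the current GPU driver
// reports through Direct3D 12. Walk down from the newest model we know,
// since older runtimes reject enum values they don't recognize.
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1; // primary GPU

    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) return 1;

    const D3D_SHADER_MODEL candidates[] = {
        D3D_SHADER_MODEL_6_6, D3D_SHADER_MODEL_6_2,
        D3D_SHADER_MODEL_6_0, D3D_SHADER_MODEL_5_1,
    };

    D3D12_FEATURE_DATA_SHADER_MODEL sm = {};
    for (D3D_SHADER_MODEL candidate : candidates) {
        sm.HighestShaderModel = candidate;
        if (SUCCEEDED(device->CheckFeatureSupport(
                D3D12_FEATURE_SHADER_MODEL, &sm, sizeof(sm))))
            break; // sm.HighestShaderModel now holds the supported model
    }

    // Values are hex-encoded, e.g. 0x62 means Shader Model 6.2.
    std::printf("Highest supported shader model: 0x%x\n", sm.HighestShaderModel);
    return 0;
}
```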
 
Like they're the only ones to do so?
This is their first dGPU in how long?
And again, for its price it's fine.

It keeps up with AMD's and Nvidia's current gen more often than not.

And no, if you REALLY wanna see "cutting corners", look at Nvidia and the 4060's 128-bit bus and its reliance on DLSS 3's frame generation.
If you aren't in a DLSS 3 game, you're barely getting anything better than a 3060... and at times the 3060 is better.

And AMD is the poster child for how driver updates can be huge for GPU performance long term.

And then let's get to Bethesda: why does a game as limited as Starfield have such high requirements?
Because it's extremely poorly optimized.

The game's not S-tier visually.
The game's got more loading screens than some decade-old games (having to load your tiny spaceship, or load a city store that's a small room with one vendor when you've already loaded the entire city? like, wtf).
The game's got input delay just for looting items.

There is zero reason Intel's dGPU should not "meet min spec" when, spec-wise, it does meet it.

Not saying Intel is great, but it's far from bad (even the DX9 issue is mostly irrelevant now).
Just to be pedantic: DG1 exists and it is a discrete GPU. So that would be 2021, and Arc launched in... 2022? So about a year, give or take some months.

Also, years of shipping iGPUs should count toward making their discrete GPUs work, so please don't give them a pass so lightly.

Intel has had YEARS to work on a proper GPU driver stack, but has failed miserably to do so and is now scrambling to put something together from the broken pieces of its craptastic iGPU drivers.

Also, what AkroZ said is pretty spot-on, and I hadn't even noticed that.

Regards.
 
I'm able to run Starfield on a GTX 980 Ti; I'm pretty sure an A770 can handle it with the proper support.
This is a perfect example of HOW this duopoly works and WHY another foreign company (cough... from China, for example) can simply NEVER make it into this market.

Game studios have locked-in contracts with hardware manufacturers, and if it weren't for the "anti-competition laws" (which are more for show than anything), it would've been a monopoly anyway.

An "outside" entrant simply won't succeed when corrupt manufacturers openly operate like this without any ramifications.
 