News SSD Usage In Starfield Is Causing Stuttering Issues: Report

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Sounds like another AMD problem.

I have an Intel i7-11700K, 64GB of DDR4 RAM, an Nvidia RTX 3070 Ti, and a 2TB Sabrent Rocket 4 Plus SSD, and haven't seen anything like this.

(20+ years of reading Tom's Hardware, and I just now decided to register on the forums)
I made this with GeForce Experience a couple of hours ago, and my Optane is getting 100% read-utilization spikes, but I'm not getting big stutters. That doesn't change the fact that this game shouldn't be maxing out that drive at its strongest workload when you're just running around, which you can see in the video.

View: https://www.youtube.com/watch?v=ktFIfhKtHcI
 
I made this with GeForce Experience a couple of hours ago, and my Optane is getting 100% read-utilization spikes, but I'm not getting big stutters. That doesn't change the fact that this game shouldn't be maxing out that drive at its strongest workload when you're just running around, which you can see in the video.

View: https://www.youtube.com/watch?v=ktFIfhKtHcI
Thanks for the update.

So this being an AMD problem is null and void!
 
Something is fishy here. I'm having a lot of issues with Starfield (the most maddening being that random spots in the game render at what looks like 240p while spots right next to them don't; I can stand without moving and see some sections at 240p, others at 720p, and still others at native. It really seems to dislike running at native even though I disabled dynamic resolution and upscaling, which the descriptions clearly say should make it run at native only), but stuttering just isn't one of them. I don't have the greatest SSD in the world by any means; it's very mid-range, and some might even consider it low end. Clearly the game doesn't require a faster SSD; rather, some sort of bug is causing it to read data far more than it should. (I hope it's just reads; if it's writing that much, we have a major problem.) Perhaps there's some sort of data-reading equivalent of a memory leak?

I do have my CPU set to a fixed speed to prevent sudden-demand hitching (like flying onto and off a world in games like Empyrion or No Man's Sky, when it suddenly has to do terrain generation), but I don't think that should affect this. (There is a special kind of irony, IMO, in modern CPUs being literally too powerful for modern games. I see usage in many as low as 7%, or even 4% in a few, and I recall seeing mine scale below 2GHz in many seemingly "big" games. Unfortunately, since they step up to full speed gradually, that adds a lot of latency in sudden-demand situations.)
 
I traced the load from Starfield with Windows Performance Recorder and analyzed it with the Windows disk trace visualizer. For a traversal from one end of New Atlantis to the other, in terms of disk service time, almost 100% of it was 64KB random reads and 64KB sequential reads.

A 990 Pro will beat a 905P at 64KB sequential. ATTO QD1 64KB read (seq): 1.61GB/s vs 2.52GB/s (905P vs 990 Pro). Since the 990 Pro is a consumer NAND SSD, its random 64KB is somewhere around a fourth of 2.52GB/s, or ~0.63GB/s, vs the 905P's probably around 1.61GB/s. (At least that's what NewMaxx told me for 4K; I can ask about 64K.) So I would *imagine* that a 905P will outperform a 990 Pro overall, and I don't think the issue is 4K random reads.

The T700 gets ~2.75GB/s at QD1 64KB sequential. It's going to be lower in random, and random already accounts for more total service time than sequential on my 905P, which is likely slower in sequential and faster in random. So it's not that surprising to me that their drive is hitting 100% utilization, and it doesn't prove the transfer sizes are below 64KB, AFAIK.

I could probably measure how my 905P and 990 Pro perform at 64KB with Iometer; I might look into that. I don't have a T700 to bench 64KB random, though.
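The back-of-the-envelope estimate above can be written out explicitly. This is just a sketch of the post's own reasoning: the ATTO figures and the "random is roughly a quarter of sequential for consumer NAND" rule of thumb come from the post, not from independent measurements.

```python
# Rough QD1 64KB throughput estimate from the numbers quoted above.
# All figures are from the post's ATTO runs, not measured here.

seq_905p = 1.61      # GB/s, QD1 64KB sequential, Optane 905P
seq_990pro = 2.52    # GB/s, QD1 64KB sequential, Samsung 990 Pro

# Rule of thumb used in the post: consumer NAND drops to roughly a
# quarter of its sequential rate on random reads, while Optane barely drops.
rand_990pro = seq_990pro / 4   # ~0.63 GB/s
rand_905p = seq_905p           # assume ~no random penalty for Optane

print(f"990 Pro random 64KB estimate: {rand_990pro:.2f} GB/s")
print(f"905P  random 64KB estimate: {rand_905p:.2f} GB/s")
```

Under those assumptions the Optane comes out well ahead on the random portion of the trace, which is the point being made.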

Here is a trace from the 905P:
View: https://i.imgur.com/GYhjrJZ.jpg
Can you by chance test the same scenarios with a lower-tier SSD and see if there's any discernible performance difference?
 
I have spent 68 hours in the game, and the only time I experienced stuttering was when I used a trainer to increase character movement speed to 10x during planet exploration; otherwise everything was fine at the default movement speed.

I have a 7950X, 32GB DDR5-6400, an RTX 3070, and a 1TB Samsung 980 Pro.
 
Thanks for the update.

So this being an AMD problem is null and void!
I think everyone who plays games has gotten used to the first second or two after loading a save or a new scene not being perfect: sometimes things are still popping in, sometimes there's a bit of stutter. Then it goes away. It happens often enough that I don't even notice anymore; I have to look for it to see it.

Having the SSD and CPU work nearly as hard as they do when loading a save or a new cell, three times in the first three minutes, when you're supposed to just be steadily playing in one area, doesn't seem normal. That's just asking for initial-load behavior to happen at all sorts of times it shouldn't.
 
This will be one of many issues they have to fix. Glad I haven't jumped on the early adopters' boat and swum with this feast of bugs... yet. It won't be long before Bethesda puts out patches, I'd say.
Bet there's an "Unofficial Starfield Patch" already in the works, just like the ones for Skyrim and FO4.
 
Why even buy the game yet? Stuttering, bad textures, empty worlds, terrible UI. Don't reward the publisher until they release the promised Engine Kit; they are relying on us to do the work, after all.
 
Sounds like another AMD problem.

I have an Intel i7-11700K, 64GB of DDR4 RAM, an Nvidia RTX 3070 Ti, and a 2TB Sabrent Rocket 4 Plus SSD, and haven't seen anything like this.

(20+ years of reading Tom's Hardware, and I just now decided to register on the forums)
Damn, all that AMD hate you had to bottle up for twenty years, until an SSD issue in a video game put you over the edge and made you create an account just to attribute it to AMD. Good job, good effort!
 
Yeah, this is my big pet peeve at the moment: games that on release day are only beta versions. It's up to us, the paying public, to put up with a donkey of a game. New AAA games take a full year of updates and fixes to be playable, and all the while those paying for the game in this form are being shafted. People are buying new rigs for this game, and it runs like a turkey!
Not defending game studios, but you people keep buying the game on release day.
Insanity: doing the same thing over and over again but expecting a different outcome.

Stop buying half baked games and hit the studios in the pocketbook.
Only way to change their behavior.
 
Not defending game studios, but you people keep buying the game on release day.
Insanity: doing the same thing over and over again but expecting a different outcome.

Stop buying half baked games and hit the studios in the pocketbook.
Only way to change their behavior.
This is exactly why I wait for reviews and optimization updates before I purchase a game. I'm planning on getting Jedi: Survivor, but I haven't finished Fallen Order yet, so I'll finish that first and then purchase Jedi: Survivor.
 
A PC with their specific hardware configuration.
Not even then.

Nothing in Starfield is "amazing" enough to justify even a 70-tier GPU.

It's graphically nothing new or special.

CP2077 (which I don't really care for gameplay-wise, but I don't hate on it for not looking good) is all-around better, with fewer loading screens.

Starfield has no actual reason to be anywhere near as resource-demanding as it is on any modern hardware.
 
Yeah, this is my big pet peeve at the moment: games that on release day are only beta versions. It's up to us, the paying public, to put up with a donkey of a game. New AAA games take a full year of updates and fixes to be playable, and all the while those paying for the game in this form are being shafted. People are buying new rigs for this game, and it runs like a turkey!
This is not a new phenomenon. The first Unreal didn't even run on my system until after it was patched. Buggy releases are standard.
 
What desktop motherboards will take 192GB of RAM?

99% top out at 128GB.

Secondly, trying to use RAM to cover up shoddy work is not the direction we want to go.
Basically any DDR5 board with four DIMM slots can do 192GB. It'll run atrociously slow for DDR5, but so will 128GB; they're both two DIMMs per channel (2DPC) and dual-rank, so they're functionally equally bad.
 
I don't really think even poorly optimized games will benefit from that much RAM anyway. The benefits past 8GB are often just offsetting other things (too many people keep a browser with 20 tabs open in the background, for example). Anything beyond 16GB isn't going to directly benefit gaming itself except by offsetting other things that probably shouldn't be running during heavy gaming anyway. Really, that amount of RAM is pretty much only good for some fairly high-end CAD work.

Otherwise, the only benefit I could think of for gaming might be if you went completely nuts and loaded an entire game onto a RAM drive. That could certainly help with an issue like this, but it's insanely inefficient and messy, with a lot of potential problems (not to mention you'll have to reload the RAM drive every time). Though I admit the biggest problems (like the time to load the stored data into RAM) become less of an issue with a decent SSD and pretty much any modern RAM (though it still means wear on the SSD if you actually write back any changes). This isn't so much fixing the issue as forcing a messy workaround, though. And as big as this game is, you'd probably need more than 128GB even to pull it off.
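For a rough sense of the up-front cost of the RAM-drive idea, here is a quick sketch. The ~125GB install size and the drive speeds below are illustrative assumptions, not measurements:

```python
# Hypothetical time to populate a RAM drive with a full game install.
# The install size and sequential read speeds are illustrative guesses only.

install_gb = 125.0   # assumed install size in GB

drives = {
    "SATA SSD (~0.5 GB/s)": 0.5,
    "PCIe 3.0 NVMe (~3 GB/s)": 3.0,
    "PCIe 4.0 NVMe (~7 GB/s)": 7.0,
}

for name, gbps in drives.items():
    # Time is simply size divided by sustained sequential read rate.
    print(f"{name}: ~{install_gb / gbps:.0f} s to copy into RAM")
```

Even in the best case you'd pay a noticeable copy on every boot, which is part of why this stays a workaround rather than a fix.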
Not defending game studios, but you people keep buying the game on release day.
Insanity: doing the same thing over and over again but expecting a different outcome.
Point of clarity: it's usually the publishers who force devs to release games before they're really ready. I do agree, though; people need to put their collective foot down and stop paying extra to join an unofficial pre-beta program on day one. Normally beta testers get paid. Instead, we have a situation where we're supposed to pay to do that work for them.


One thing I'm wondering about the SSD usage is whether some specific thing is going wrong and causing all this runaway reading. As we know, this game is made for the Xbox One first and foremost, and while you can put SSDs in them, I don't think any are going to compare to the ones many PC users are using and still having problems with.

Something similar to what this article highlights for GPU usage may be happening: https://www.reddit.com/r/pcmasterra...case_you_wanted_to_know_a_few_reasons_on_why/ It seems the game is improperly queuing work for the GPU, which naturally has an impact. That makes me wonder: what if it's also improperly queuing disk reads, or something similar? Apparently the GPU-queuing issue makes the GPU do work ahead of time that turns out to be wrong and has to be discarded and redone. If the game did a similar thing with disk reads, say reading the wrong data, discarding it, and then reading the wrong data again, wouldn't that produce exactly what people are seeing?
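The discard-and-reread hypothesis can be illustrated with a toy model. This is entirely hypothetical (the block counts and misprediction rate are made up), but it shows how mispredicted reads multiply total bytes read without the "useful" data budget changing at all:

```python
# Toy model of read amplification from mispredicted speculative reads:
# every mispredicted block is read twice, once speculatively (then
# discarded) and once again correctly.

def bytes_read(useful_blocks: int, block_kb: int, misprediction_rate: float) -> float:
    """Total KB read, counting wasted speculative reads on top of useful ones."""
    wasted = useful_blocks * misprediction_rate  # blocks read and thrown away
    return (useful_blocks + wasted) * block_kb

useful = 10_000                          # 64KB blocks actually needed for the scene
ideal = bytes_read(useful, 64, 0.0)      # every read is useful
sloppy = bytes_read(useful, 64, 1.0)     # every block guessed wrong once

print(f"ideal:  {ideal / 1024:.0f} MB")
print(f"sloppy: {sloppy / 1024:.0f} MB  ({sloppy / ideal:.1f}x amplification)")
```

Doubling the read volume this way would pin a drive at 100% utilization far longer than the scene's data alone warrants, which matches what the traces in this thread show.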
 
As we know, this game is made for the Xbox One first and foremost and while you can put SSDs in them, I don't think any are going to compare to the ones many PC users are using and having problems
?
Starfield isn't even being released for the Xbox One. It's only on the Xbox Series S/X (and Windows), and both of those come with SSDs.

Edit: The Xbox Series S/X has a PCIe 4.0 x2 interface, and there are dedicated hardware compression/decompression blocks to improve effective bandwidth. So I would expect the Xbox storage to be on par with (maybe better than) a reasonably fast PCIe 3.0 x4 SSD, which I thought was still considered pretty capable storage for a gaming rig.
 
?
Starfield isn't even being released for the Xbox One. It's only on the Xbox Series S/X (and Windows), and both of those come with SSDs.
I meant "the current Xbox system." I don't keep up with them, especially since MS's naming schemes leave rather a lot to be desired sometimes. The point remains: it's built console-first, PC second, and those consoles don't have the level of hardware people are seeing issues on. A game optimized to run tolerably on such a console shouldn't be pushing high-end SSDs on a PC. It means something is up here.
 
Not even then.

Nothing in Starfield is "amazing" enough to justify even a 70-tier GPU.

It's graphically nothing new or special.

CP2077 (which I don't really care for gameplay-wise, but I don't hate on it for not looking good) is all-around better, with fewer loading screens.

Starfield has no actual reason to be anywhere near as resource-demanding as it is on any modern hardware.
But that's comparing CP2077, which has been out for quite some time now and has had patches released, with a completely different and very new engine like Starfield's. Like all games and apps, updates will appear after they've been out in the wild for a while.