News Star Wars Jedi: Survivor Patched Performance — AMD and Nvidia GPUs Tested

The charts mention "No MB", but what does MB mean in this context?
Motion Blur. I turn that off (always) because it just blurs the image and doesn't make a game look better. Same with Chromatic Aberration, which adds blur and color fringing toward the outside edges of the screen to look "more realistic." Both are very light post-processing filters that basically take all the hard rendering work the GPU does and then reduce the quality by blurring. Stupid! IMO, naturally.
 
  • Like
Reactions: KyaraM

atomicWAR

Glorious
Ambassador
Since my former high-end gaming rig (as of a year ago, LOL) won't play this at 4K/60fps, I was hoping that my PS5, the latest updated version, would play it fine. Based on reviews on PS5 forums (and even on places like Amazon), it's also a broken s-show on that console. Terrible graphics, slideshow frame rates in some scenes, etc.

EA just apparently can't NOT screw up games. They ruined the F1 series that Codemasters, whom they bought out, had done well with for many years.
I would think you could play on your Jaws PC at 4K Epic settings using FSR Quality and easily hit 60 FPS in most areas with decent 0.1% lows. Granted, that's not native, but speaking from experience, as I am using those exact settings with a 4090, the picture quality is excellent. And if you're willing to drop things from Epic settings, you can increase your FPS accordingly.
 
D

Deleted member 2838871

Guest
As for performance, I suspect a lot of it comes down to PC hardware. I have a more or less top-end system. Core i9-13900K, 32GB DDR5-6600, MSI MEG Ace Z790 mobo, 4TB Sabrent Plus-G SSD. It's also running "clean" so there aren't a bunch of background tasks potentially causing problems. If you have 16GB RAM, a Core i5 or Ryzen 5 CPU, slower SSD, and some background tasks running, maybe things are worse.

Absolutely comes down to PC hardware. I have a top end system too and haven't had any issues. I had to quit watching YouTube videos though because it seems they are all about people trying to get the game to perform at highest settings with low end hardware.

I had one reviewer say "The 2700X is a fairly recent CPU." Since when is a 5-year-old CPU recent? That'd be like me trying to play on my old 4-core 7700K.
 

zx128k

Reputable
Absolutely comes down to PC hardware. I have a top end system too and haven't had any issues. I had to quit watching YouTube videos though because it seems they are all about people trying to get the game to perform at highest settings with low end hardware.

I had one reviewer say "The 2700X is a fairly recent CPU." Since when is a 5-year-old CPU recent? That'd be like me trying to play on my old 4-core 7700K.
The patch increased performance by up to 75% on some systems in some places. This is with the latest hardware.

A 10900K system on YouTube with a 3080 is having no issues, while AMD CPUs were experiencing massive drops in FPS.
The mod that adds DLSS shows a massive uplift in performance. Even Cyberpunk 2077 has DLSS, FSR 2, and XeSS. AMD sponsored titles suck.
 
D

Deleted member 2838871

Guest
AMD sponsored titles suck.

Yeah but at least it was free... and apparently Nvidia sponsored titles suck too... at least that's what I keep hearing about Redfail... which was also free. :ROFLMAO: :ROFLMAO: :ROFLMAO:

I generally don't pay money for games unless the reviews are favorable... but in the case of these 2 games it didn't matter.
 
D

Deleted member 2838871

Guest
Redfall is a turd. Good performance but no fun. If I wanted to run around by myself I would go for a run in the desert.

I haven't tried it yet. Downloaded it the other day but it's on the back burner right now. Too many other games I'm playing. I was late to the party for Elden Ring, Hogwarts, Last of Us and RE4. (y)
 

atomicWAR

Glorious
Ambassador
Redfall is a turd. Good performance but no fun. If I wanted to run around by myself I would go for a run in the desert.
My wife and I actually like it in co-op, but I could see how playing alone would be a vastly inferior experience. It's not up to Arkane's usual standards, but I don't think it's as bad as some of the press it is getting. That said, it's fine for Game Pass but certainly not worth 70 dollars. The map is small, though indeed 'dense' as Arkane claims; I'm just not sure that makes up for the size, and I feel the price should reflect that. This was not the game to make the move to 70 dollars with.

My biggest complaint is the human cultists, though. Their AI is horrid. You can snipe using a silencer and a person standing right next to another won't be alerted, even though they should see the dead body at their feet. As someone who enjoys sniping from a distance while my wife charges in kicking the hornet's nest, it can be a bit disappointing seeing cultists have no response to their dead comrades as I pick them off, making things a twinge easier than it should be. But for another 'Game Pass' title, it's quite serviceable.
 
A 10900K system on YouTube with a 3080 is having no issues, while AMD CPUs were experiencing massive drops in FPS.
Well, "massive"... my 3800X doesn't drop below 50 FPS, unless we're talking about when the game loads assets during climbing, entering an area, etc. The game manages resources poorly when loading assets.

BTW, I was doing some testing, and you can speed up the game's loading time a bit with a small config tweak.
Shaders compile on every run (about a 500MB cache on my system), but that's not necessary; compiling them once is enough.
In GameUserSettings.ini there is:

Code:
[ShaderPipelineCache.CacheFile]
LastOpened=SwGame

Change it to:

Code:
[ShaderPipelineCache.CacheFile]
LastOpened=whatever

and the game will load much faster now, with no side effects as long as you already had the shaders precompiled.
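
If you'd rather script that edit than do it by hand, a rough sketch like this should work. It just assumes the tweak behaves as described above; the path is the Steam GameUserSettings.ini location mentioned later in the thread, and the backup step is my own precaution, nothing official.

Code:
# Rough sketch (my assumption, untested) to automate the ini tweak above.
# Run it only after the game has compiled shaders at least once.
from pathlib import Path

# Steam location of the ini (same path discussed later in the thread).
ini_path = Path.home() / "AppData/Local/SwGame/Saved/Config/WindowsNoEditor/GameUserSettings.ini"

text = ini_path.read_text(encoding="utf-8")
patched = text.replace("LastOpened=SwGame", "LastOpened=whatever")

if patched != text:
    # Keep a backup copy before touching anything.
    (ini_path.parent / (ini_path.name + ".bak")).write_text(text, encoding="utf-8")
    ini_path.write_text(patched, encoding="utf-8")
    print("Patched LastOpened entry.")
else:
    print("LastOpened=SwGame not found; nothing changed.")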
 
Last edited:

KyaraM

Admirable
The "PC Patch 3.5" that dropped on late May 1 or early May 2. At least, I think that's the patch version.

Testing was right at the start of the level, the first time you arrive. I picked that spot because there was nothing else around (meaning, respawning enemies that would attack during my test sequence), and my FPS counter in the corner seemed to indicate a reasonably demanding area — not the most demanding by any means, but more demanding than some of the earlier scenes (e.g. on Coruscant).

This is the difficulty with benchmarking: Where do you test? Playing further into the game to find a more demanding area just takes away from testing time. Also, the village area on Koboh is known to be more (extremely) taxing, in part because it has some wide open scenes that can be particularly hard on performance, even though it doesn't have any enemies around.

I retested one of the cards while running around in the village to see how my original test sequence compared to a potentially more demanding area. The results: Super poor minimum fps, like 30 fps on 1080p medium and epic, as well as 1440p epic, while using an RTX 3080. Average fps was way down for 1080p medium (from ~193 fps to ~101 fps), but the drop at epic was less severe (from ~107 fps to ~86 fps).

I was going to try and grab some better individual settings screenshots as well... but that just proved problematic. I loaded into the game, set the Epic preset, then loaded my save and grabbed a screenshot. Then I exited to the menu and tried changing just one setting (texture quality), followed by reloading the save. That didn't seem to change things at all, even though swapping presets would work. So then I exited the game, relaunched, and loaded the save. Texture quality was definitely lower now, but it seemed like a lot of other stuff was lower as well — almost like using the Texture Quality Low setting ended up making the game behave as though it was on the Low preset! I tried a few other settings (View Distance, Shadow Quality), and ultimately determined that I don't trust changing the individual settings at all right now.

As for performance, I suspect a lot of it comes down to PC hardware. I have a more or less top-end system. Core i9-13900K, 32GB DDR5-6600, MSI MEG Ace Z790 mobo, 4TB Sabrent Plus-G SSD. It's also running "clean" so there aren't a bunch of background tasks potentially causing problems. If you have 16GB RAM, a Core i5 or Ryzen 5 CPU, slower SSD, and some background tasks running, maybe things are worse. Ask @PaulAlcorn to test a bunch of CPUs. (He'll say no. :p )

Okay... I have access to the game again. I'm going to try it on a Core i9-9900K. I bet it runs a lot worse! Back in a bit...

Okay, quick update: i9-9900K is substantially slower in the most demanding parts of the game. Here are 4070 Ti testing results (because that's what's in my 9900K PC right now):

Code:
RTX 4070 Ti 9900K KobohWilds (1080pMed) - AVG: 147.8   1%Low: 109.7
RTX 4070 Ti 9900K KobohWilds (1080pEpic) - AVG: 133.6   1%Low:  95.8
RTX 4070 Ti 9900K KobohWilds (1440pEpic) - AVG:  91.6   1%Low:  64.6

RTX 4070 Ti 9900K KobohTown (1080pMed) - AVG:  74.7   1%Low:  37.0
RTX 4070 Ti 9900K KobohTown (1080pEpic) - AVG:  68.4   1%Low:  32.8
RTX 4070 Ti 9900K KobohTown (1440pEpic) - AVG:  60.9   1%Low:  32.3

Again, I note that a LOT of the game does not run anywhere near as poorly as the Koboh village area. That's the "just wandering around talking to people" part of the game, where smooth performance isn't quite as important.
Ah, I see! Thank you for the answer and additional test. Yeah that area does give much better FPS. I have a 12700K and 32 GB DDR4 RAM. Should be reasonably powerful and paired well with the 4070Ti. I agree that much of the game runs way better, even with RT. My biggest issues were with Jedha; quite a few people experience bad crashing issues there.

Again, thank you for the additional test and answer! Appreciate the effort.
 
  • Like
Reactions: JarredWaltonGPU

zx128k

Reputable
Patch 3.5 fixed some of the performance issues. Really, should a game be getting up to 75% more performance from a patch after release? The issue is likely that the game is still tuned for consoles. People were getting the same FPS at all resolutions. We don't let Nvidia get away with it, so why should AMD get a pass?
 
Last edited:

luissantos

Distinguished
I chalk some of that up to guerrilla marketing. AMD has pushed multiple games with higher than usual VRAM requirements, and often you'll get a major outcry about poor performance on Nvidia (especially at launch) when that happens — anything to paint Intel or Nvidia in a bad light. The Last of Us Part 1, some of the Resident Evil games, Godfall, God of War... The list goes on. There are a lot of games where I can't help but question the "need" to push VRAM use beyond 8GB, particularly when the visual upgrades provided aren't actually noticeable.

That an AMD-promoted game had issues with Nvidia hardware at launch, and then those issues got (mostly/partially) fixed with driver and game updates within a week, is the real story IMO. We've seen the reverse as well, though not so much recently that I can think of unless you count Cyberpunk 2077 RT Overdrive. Actually, you could probably make the claim of Nvidia pushing stuff that tanks performance on AMD GPUs for any game that uses a lot of ray tracing, though in some cases the rendering does look clearly improved.

Seems that you've updated your response since I last read it. I don't particularly object to that - the old text was markedly more biased and accusatory.

I do have to deeply disagree with the assertion that higher resolution textures are barely noticeable while RT or PT are clearly noticeable. Even with static images I'd be hard pressed to tell you whether a current-gen game looks better with tracing on or off. For older titles like Quake 2 or Portal, yes, the difference is dramatic, but in modern games the advanced techniques we have to simulate RT produce a "good enough" result at 2-3x better performance.

Also, I had no intention of getting political, but it's fairly hypocritical to call out Hardware Unboxed for being in AMD's camp - even if there are PLENTY of other reviewers exposing the limitations of VRAM (capacity and bandwidth) on the internet - and then publish an article like https://www.tomshardware.com/news/geforce-rtx-4070-vs-radeon-rx-6950-xt-which-gpu-is-better
 
Seems that you've updated your response since I last read it. I don't particularly object to that - the old text was markedly more biased and accusatory.

I do have to deeply disagree with the assertion that higher resolution textures are barely noticeable while RT or PT are clearly noticeable. Even with static images I'd be hard pressed to tell you whether a current-gen game looks better with tracing on or off. For older titles like Quake 2 or Portal, yes, the difference is dramatic, but in modern games the advanced techniques we have to simulate RT produce a "good enough" result at 2-3x better performance.

Also, I had no intention of getting political, but it's fairly hypocritical to call out Hardware Unboxed for being in AMD's camp - even if there are PLENTY of other reviewers exposing the limitations of VRAM (capacity and bandwidth) on the internet - and then publish an article like https://www.tomshardware.com/news/geforce-rtx-4070-vs-radeon-rx-6950-xt-which-gpu-is-better
I publicly apologized to HUB. They do the best they can. As do I.

But for the rest, you have to understand what higher resolution textures really mean.

4K textures matter a bit on 4K displays. You can generally do 2X up sampling of a texture and not lose much in the way of detail. So 2K textures are mostly where we should top out. But some games have 4K and even 8K textures. Those are ways to bloat VRAM use with very little payoff, visually. Even 1K textures are mostly fine, since most polygons don’t cover more than a 1K pixel area in a game.
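
As a quick back-of-the-envelope check on that claim (the display width and how much of the screen the object covers are illustrative numbers I'm assuming, not anything measured in-game):

Code:
# Texel density on a 4K display for an object spanning half the screen width
# (illustrative numbers only).
screen_w = 3840                  # 4K display width in pixels
object_px = screen_w / 2         # object covers half the screen width

for tex_w in (1024, 2048, 4096, 8192):
    density = tex_w / object_px  # texels per on-screen pixel
    print(f"{tex_w:>4}px texture -> {density:.2f} texels/pixel")

# 1024px -> 0.53, 2048px -> 1.07, 4096px -> 2.13, 8192px -> 4.27
# Past roughly one texel per pixel, the extra detail mostly can't show up on screen.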

The Nvidia RTX 4070 vs 6950 XT article was already in the works; it started early in the week. And I knew I'd be called biased for writing it. Such is life. As I said in there, my boss asked me to write a face-off. I personally think it's ludicrous to argue that the 6950 XT is objectively the better choice overall. Yes, it can be faster in rasterization, but that's literally the only potential selling point.

It's 10-20% faster, depending on the game selection, with 50-70% higher power draw, and it's lacking the latest features and tech. And it's 20-50% slower when you look at state-of-the-art games. It's not a bad card if you already have it, but for the price it's not the best option. If the RTX 3080 were still available and cost $600, I'd make similar arguments in favor of the 4070.
 
  • Like
Reactions: KyaraM
4K textures matter a bit on 4K displays. You can generally do 2X up sampling of a texture and not lose much in the way of detail. So 2K textures are mostly where we should top out. But some games have 4K and even 8K textures. Those are ways to bloat VRAM use with very little payoff, visually. Even 1K textures are mostly fine, since most polygons don’t cover more than a 1K pixel area in a game.
For this purpose there is a feature called texture streaming, which streams in only the necessary texture resolution based on the mip needed, instead of placing the whole texture with all resolutions (mips) into VRAM.
It has the disadvantage of texture pop-in as you walk around... but in this game it's a bit botched.
 
For this purpose there is a feature called texture streaming, which streams in only the necessary texture resolution based on the mip needed, instead of placing the whole texture with all resolutions (mips) into VRAM.
It has the disadvantage of texture pop-in as you walk around... but in this game it's a bit botched.
MIP mapping is supposed to handle all the texturing stuff, but it doesn't always work perfectly, and sometimes MIP levels get loaded into VRAM that aren't really needed. So if you have an 8K texture, you'll get 4K, 2K, 1K, 512, 256, 128, and maybe even 64 and lower resolutions all stored in VRAM. Those are all pre-calculated by the game developers so that they're "ideal" quality at each level. In most cases, the 4K and 8K versions of the textures are effectively useless — they provide a very minor increase in image quality, for 4X to 16X the VRAM use (for uncompressed textures). Even 2Kx2K textures might not help image quality much!

And then, because the 4K texture might occasionally get referenced (depending on distance and other factors in the game engine), other things get bumped from VRAM and later have to get pulled back in. This is why a game like Horizon Zero Dawn can still look quite good and only use maybe 4GB of VRAM. A newer game may not look that much better, but the existence of 4K textures means it can bog down even on cards with 8GB or 10GB of VRAM.

I think we're seeing a lot of that stuff happening with some of the most recent VRAM heavy games (Star Wars Jedi: Survivor, The Last of Us Part 1, and a few others have shipped in 2023 with such problems).
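
To put rough numbers on that "4X to 16X" figure, here's a quick sketch for uncompressed textures. Assuming 4 bytes per texel (RGBA8) is my simplification; real games use block compression, which shrinks everything, but the ratios between resolutions stay the same.

Code:
# Rough VRAM estimate for an uncompressed texture and its full mip chain.
# 4 bytes per texel (RGBA8) is an assumption; compressed formats shrink
# everything, but the 4X/16X ratios between resolutions don't change.
def mip_chain_mb(size, bytes_per_texel=4):
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total / (1024 ** 2)

for size in (1024, 2048, 4096, 8192):
    top_mb = size * size * 4 / (1024 ** 2)
    print(f"{size:>4}px: top mip {top_mb:6.1f} MB, full mip chain {mip_chain_mb(size):6.1f} MB")

# 2048px is 16 MB for the top mip; 4096px is 64 MB (4X) and 8192px is 256 MB (16X).
# The rest of the chain only adds about a third on top of the largest mip.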
 
MIP mapping is supposed to handle all the texturing stuff, but it doesn't always work perfectly, and sometimes MIP levels get loaded into VRAM that aren't really needed. So if you have an 8K texture, you'll get 4K, 2K, 1K, 512, 256, 128, and maybe even 64 and lower resolutions all stored in VRAM. Those are all pre-calculated by the game developers so that they're "ideal" quality at each level. In most cases, the 4K and 8K versions of the textures are effectively useless — they provide a very minor increase in image quality, for 4X to 16X the VRAM use (for uncompressed textures). Even 2Kx2K textures might not help image quality much!

And then, because the 4K texture might occasionally get referenced (depending on distance and other factors in the game engine), other things get bumped from VRAM and later have to get pulled back in. This is why a game like Horizon Zero Dawn can still look quite good and only use maybe 4GB of VRAM. A newer game may not look that much better, but the existence of 4K textures means it can bog down even on cards with 8GB or 10GB of VRAM.

I think we're seeing a lot of that stuff happening with some of the most recent VRAM heavy games (Star Wars Jedi: Survivor, The Last of Us Part 1, and a few others have shipped in 2023 with such problems).
Nah, that game is just botched. Running configs that force low-level mips and ignore higher-resolution mips does nothing; VRAM utilization is the same.
Enabling/disabling texture streaming has no effect (the streaming thread utilization setting was doing something), but even reducing texture quality in the settings gives the same VRAM utilization, and it doesn't look like low-level textures at all.
 
Nah, that game is just botched. Running configs that force low-level mips and ignore higher-resolution mips does nothing; VRAM utilization is the same.
Enabling/disabling texture streaming has no effect (the streaming thread utilization setting was doing something), but even reducing texture quality in the settings gives the same VRAM utilization, and it doesn't look like low-level textures at all.
Yeah, that’s certainly part of it. If you change presets, the texture quality clearly updates. If you only change texture quality, without restarting the game nothing really happens. But if you fully restart the game, it looks like a lot more than just texture quality gets changed. There are definitely some bugs that still need to be stomped out.

But the VRAM use in other games isn’t borked in the same way. Forza Horizon 5 can exceed 8GB at the extreme preset. Turn it to ultra or high on textures and compare screenshots and the difference is trivial.
 
  • Like
Reactions: KyaraM
I've changed it through the configs, so the game was closed. The game settings show it was changed from Epic to a lower setting (not the whole preset, just textures).
But does it actually work properly? Like I said, I changed just texture quality from Epic to Low, leaving everything else on Epic. I then exited the game and restarted. Texture quality had clearly dropped, and the settings showed low textures and everything else on Epic. But looking at the resulting image and FPS, it was obvious that more than just textures had changed. The same thing happened when I adjusted just view distance to low. Texture quality after restarting was clearly not the same as when everything was using the Epic preset. I'll post images in a bit...

How are you editing the configuration files, and what version of the game are you running? Because the main configuration file with Steam is in a binary format stored in:

%UserProfile%\Saved Games\Respawn\JediSurvivor\GameUserSettings.sav

I'm not sure how to edit that file. But what I do know is that if you edit:

%UserProfile%\AppData\Local\SwGame\Saved\Config\WindowsNoEditor\GameUserSettings.ini

Well, you can modify that all you want, but the game still reads the GameUserSettings.sav file and overwrites whatever is in GameUserSettings.ini.
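
If you want to confirm which file the game actually rewrites on your setup, a quick sanity check (just a sketch of my own, not an official tool) is to note the modification times of both files, change a setting in-game, quit, and run it again:

Code:
# Sketch: print the last-modified times of both settings files so you can see
# which one the game rewrites after changing a setting. Paths are the Steam
# locations mentioned above.
import os, time
from pathlib import Path

files = [
    Path(os.path.expandvars(r"%UserProfile%\Saved Games\Respawn\JediSurvivor\GameUserSettings.sav")),
    Path(os.path.expandvars(r"%UserProfile%\AppData\Local\SwGame\Saved\Config\WindowsNoEditor\GameUserSettings.ini")),
]

for f in files:
    if f.exists():
        print(f, "->", time.ctime(f.stat().st_mtime))
    else:
        print(f, "-> not found")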
 
For reference, these are image quality comparisons between the presets and "custom" settings where everything is on Epic other than the specified setting. You can clearly see that the custom settings are broken, and as indicated above, editing the GameUserSettings.ini file doesn't work (at least for me with the Steam version of the game).

Epic preset
SWJS-IQ1-(101)-Epic.jpg

High preset
SWJS-IQ1-(102)-High.jpg

Medium preset
SWJS-IQ1-(103)-Medium.jpg

Low preset
SWJS-IQ1-(104)-Low.jpg

View Distance Low
SWJS-IQ1-(105)-View-Distance-Low.jpg

Shadow Quality Low
SWJS-IQ1-(106)-Shadow-Quality-Low.jpg

Anti-Aliasing Low
SWJS-IQ1-(107)-Anti-Aliasing-Low.jpg

Texture Quality Low
SWJS-IQ1-(108)-Texture-Quality-Low.jpg

Visual Effects Low
SWJS-IQ1-(109)-Visual-Effects-Low.jpg

Post Processing Low
SWJS-IQ1-(110)-Post-Processing-Low.jpg

Foliage Detail Low
SWJS-IQ1-(111)-Foliage-Detail-Low.jpg

Ray Tracing On
SWJS-IQ1-(112)-Ray-Tracing-On.jpg


Now, if you look at those images, it's obvious that the texture quality is downgraded on all of the individual settings (except ray tracing, because that's not part of the presets I guess). The FPS counter in the corner (this is on RTX 3080) also clearly indicates something is borked. It was 181 for the low preset, 162 for medium, 126 for high, and 104 for epic (with 86 for epic plus RT).

With the individual settings, view distance alone suggests it bumped performance to 165, slightly better than medium preset. But then the same is true of every other setting! Shadow quality gets 162 (ties medium), anti-aliasing gets 162, texture quality gets 159 (just below medium preset), visual effects at 174, post processing at 159, and foliage detail at 174.

This is what I was saying about the individual settings and "custom" preset being broken. Every one of the "advanced settings" screenshots looks like it's using probably the medium preset at best. Actually, most look like low textures, and some of the other settings look downgraded as well. In theory, they should look more like the Epic preset except for the one specific setting (i.e. so shadows would change, but nothing else).

Here's another set of images, doing the same thing. (I don't know why the camera changes slightly between loads, but it does. Oh well.)

Epic preset
SWJS-IQ2-(101)-Epic.jpg

High preset
SWJS-IQ2-(102)-High.jpg

Medium preset
SWJS-IQ2-(103)-Medium.jpg

Low preset
SWJS-IQ2-(104)-Low.jpg

View Distance Low
SWJS-IQ2-(105)-View-Distance-Low.jpg

Shadow Quality Low
SWJS-IQ2-(106)-Shadow-Quality-Low.jpg

Anti-Aliasing Low
SWJS-IQ2-(107)-Anti-Aliasing-Low.jpg

Texture Quality Low
SWJS-IQ2-(108)-Texture-Quality-Low.jpg

Visual Effects Low
SWJS-IQ2-(109)-Visual-Effects-Low.jpg

Post Processing Low
SWJS-IQ2-(110)-Post-Processing-Low.jpg

Foliage Detail Low
SWJS-IQ2-(111)-Foliage-Detail-Low.jpg

Ray Tracing On
SWJS-IQ2-(112)-Ray-Tracing-On.jpg
 
Ah yes, now I see it. I wasn't using low textures before, so I hadn't noticed. Textures near me are maxed on both Epic and Low; textures far away from me, like that ugly mountain, are low (on the Low texture setting). That explains the still-high VRAM use: it still uses high-resolution textures even on the Low setting.
E3Cxyt0.jpg

wBHGzqV.jpg

epic vs low
 
Ah yes, now I see it. I wasn't using low textures before, so I hadn't noticed. Textures near me are maxed on both Epic and Low; textures far away from me, like that ugly mountain, are low (on the Low texture setting). That explains the still-high VRAM use: it still uses high-resolution textures even on the Low setting.
Like I showed above, if you use the Epic preset on Graphics Quality, and then set Texture Quality to Low, you'll get very different results before/after restarting the game. If you restart with just Texture Quality on Low, it seems like a bunch of other stuff gets changed to low/medium quality as well. In fact, it happens with every advanced setting. Just another bug in the database I suppose.
 
Like I showed above, if you use the Epic preset on Graphics Quality, and then set Texture Quality to Low, you'll get very different results before/after restarting the game. If you restart with just Texture Quality on Low, it seems like a bunch of other stuff gets changed to low/medium quality as well. In fact, it happens with every advanced setting. Just another bug in the database I suppose.
I did restart the game, and the picture shows just the terrain texture dropped.