News Pre-Launch 'Jedi: Survivor' Eats 21GB of VRAM, Struggles on RTX 4090


Deleted member 2838871

Guest
Well after having controller connection difficulties that ended with me having to update firmware I just played it for an hour and it definitely didn't suck.

[Screenshot: dClbpb8.jpg]


The stats in the upper left are pretty much where I was the whole time... 70-80% GPU... roughly 40% VRAM... 20% CPU...

So where's the bottleneck? :ROFLMAO: :ROFLMAO:

I had everything at Ultra (Epic) with RT on... this is a pretty small sample size so gonna check it out more tomorrow and see how the performance is... but so far it wasn't bad at all. I had one frame drop when I was climbing up a wall... but after exiting and reloading the game the 2nd time it didn't happen. So I dunno.

That was the only time frames weren't at 60 or within a couple fps. All the "OMG it's running at 30-40 fps" that I saw in online reviews today didn't happen to me.
 

Tac 25

Estimable
Jul 25, 2021
1,391
421
3,890
Sometimes I think people live in a bubble in their head and know nothing. While it is nice to have lots of VRAM in a GPU, it should not be necessary to have lots of it to enjoy modern games.
I have several top-tier GPUs, but I'm not only annoyed; I'm losing respect for developers in general. 8GB of VRAM should be plenty to play any game at 1080p on any GPU from Nvidia's 10 series or AMD's 5000 series up to today.

More than 50% of gamers don't have high-end GPUs, and likely most of them can't just run out and buy a new GPU to play an already expensive new game.

Developers have to make PC games playable on any reasonably specced PC, not only on GPUs with 16GB and up. That's not very in touch with consumers.

I'm simply glad the two fighting games I'm eagerly waiting for don't really need much.

Street Fighter 6 will run with a GTX 1060 on high settings.

View: https://www.youtube.com/watch?v=WFwUcYve5vs


and Tekken 8's minimum requirements only need an RX 580, although for 4K gaming it needs a 3080 or the AMD equivalent. Well, I only game at 1080p, so I don't have to buy a new GPU for these new games. :)

View: https://www.youtube.com/watch?v=mzMqld5kr8c
 
Last edited:

KyaraM

Admirable
LOL I LOVE IT

It runs flawlessly at 4K 120fps on a 7900XTX, smooth like butter. If this is an AMD "sponsored" game, even better; that only goes to prove "RT" is but a gimmick purposely built to slow down cards not from Nvidia. Funny enough, the RT setting actually adds to FPS this time... I wonder why lol
So that means that AMD sponsored games are rigged against the competition in both RT and rasterizing? Lawsuit incoming? Oh, wait. AMD. It's untouchable. I forgot.
 
  • Like
Reactions: Order 66

KyaraM

Admirable
Well after having controller connection difficulties that ended with me having to update firmware I just played it for an hour and it definitely didn't suck.

[Screenshot: dClbpb8.jpg]


The stats in the upper left are pretty much where I was the whole time... 70-80% GPU... roughly 40% VRAM... 20% CPU...

So where's the bottleneck? :ROFLMAO: :ROFLMAO:

I had everything at Ultra (Epic) with RT on... this is a pretty small sample size so gonna check it out more tomorrow and see how the performance is... but so far it wasn't bad at all. I had one frame drop when I was climbing up a wall... but after exiting and reloading the game the 2nd time it didn't happen. So I dunno.

That was the only time frames weren't at 60 or within a couple fps. All the "OMG it's running at 30-40 fps" that I saw in online reviews today didn't happen to me.
What was individual core load? 20% CPU utilization means jack if only 2-4 cores are used. Considering that they say a 4c/8t CPU is enough for ultra settings, my guess is that only a few cores actually get loaded at any given time. Actually, HWiNFO confirmed that yesterday: 4 cores reached 80-90% utilization, while the rest went to maybe 50%. Total utilization was maybe 35% at worst. I have a 12700K.

Also, again. Those reviews were done with the Day 0 patch. On Thursday there was a Day 1 patch that improved things quite a bit. There are still weird frame drops and bugs, though.
 

Colif

Win 11 Master
Moderator
It's backwards to build a game now that only uses 4 cores/8 threads when many CPUs have way more cores; spread the load out more and the GPU might not be sitting around waiting as long. The engine should scale more, though it might be a limitation of Unreal 4... I don't know; I can't see that the engine will use as many as you throw at it.

Unreal Engine 4 has a Game Thread and a Render Thread as its main threads, plus a few others for things such as helpers, audio, or loading. The Game Thread runs all of the gameplay logic that developers write in Blueprints and C++, and at the end of each frame it synchronizes the positions and state of the objects in the world with the Render Thread, which does all of the rendering logic and displays them.
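As a rough sketch (not UE4's actual code; all the names here are illustrative), that Game Thread / Render Thread handoff can be modeled as a producer/consumer pair, where the game thread simulates frame N while the render thread is still busy with frame N-1:

```python
import queue
import threading

def run_frames(num_frames):
    """Toy model of a game-thread -> render-thread pipeline."""
    frame_queue = queue.Queue(maxsize=1)  # at most one frame "in flight"
    rendered = []

    def render_thread():
        while True:
            state = frame_queue.get()  # wait for the game thread's snapshot
            if state is None:          # sentinel: game is shutting down
                break
            rendered.append(f"rendered frame {state}")

    renderer = threading.Thread(target=render_thread)
    renderer.start()
    for frame in range(num_frames):
        # ...gameplay logic (the Blueprints/C++ work) would run here...
        frame_queue.put(frame)  # hand the finished world state to the renderer
    frame_queue.put(None)
    renderer.join()
    return rendered
```

This also shows why adding CPU cores doesn't automatically help: the pipeline is only as fast as the slower of the two main threads.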
 

Deleted member 2838871

Guest
What was individual core load? 20% CPU utilization means jack if only 2-4 cores are used. Considering that they say a 4c/8t CPU is enough for ultra settings, my guess is that only a few cores actually get loaded at any given time. Actually, HWiNFO confirmed that yesterday: 4 cores reached 80-90% utilization, while the rest went to maybe 50%. Total utilization was maybe 35% at worst. I have a 12700K.

Also, again. Those reviews were done with the Day 0 patch. On Thursday there was a Day 1 patch that improved things quite a bit. There are still weird frame drops and bugs, though.

It's backwards to build a game now that only uses 4 cores/8 threads when many CPUs have way more cores; spread the load out more and the GPU might not be sitting around waiting as long. The engine should scale more, though it might be a limitation of Unreal 4... I don't know; I can't see that the engine will use as many as you throw at it.

I'll check the individual core load when I get home from work later... but regardless the game actually performed pretty well. The sky wasn't falling for me.
 

KyaraM

Admirable
Wow... my secondary system with an RTX 3070Ti and a 12100F, 16GB RAM, runs the game smoother than the 4070Ti... that's kinda annoying, lol. All Ultra with RT on and native 1080p vs the same settings with FSR on 1440p...
 

Deleted member 2838871

Guest
Wow... my secondary system with an RTX 3070Ti and a 12100F, 16GB RAM, runs the game smoother than the 4070Ti... that's kinda annoying, lol. All Ultra with RT on and native 1080p vs the same settings with FSR on 1440p...

What I don't understand is the comments I'm seeing with guys saying they have 12900/13900k rigs with 4090s and getting 25-40 fps.

I didn't experience that. The game needs adjustments for sure but I'm definitely not experiencing that kind of garbage performance.
 

KyaraM

Admirable
What I don't understand is the comments I'm seeing with guys saying they have 12900/13900k rigs with 4090s and getting 25-40 fps.

I didn't experience that. The game needs adjustments for sure but I'm definitely not experiencing that kind of garbage performance.
Yeah, I'm getting better performance on my 4070Ti main system as well, with and without FSR at 1440p. My 3070Ti handles ultra settings with RT reasonably well at 1080p, and would certainly do so at 1440p too; I have seen videos of people with this card running the game at 1440p without major issues. I have no idea what their issue is, honestly. Are they pre-testers who posted before the Day 1 patch went out on the 27th? That would be my only guess... but then, there are as many unique PCs as there are stars in the sky. Pinpointing issues like this is notoriously hard.
 

Deleted member 2838871

Guest
Yeah, I'm getting better performance on my 4070Ti main system as well, with and without FSR at 1440p. My 3070Ti handles ultra settings with RT reasonably well at 1080p, and would certainly do so at 1440p too; I have seen videos of people with this card running the game at 1440p without major issues. I have no idea what their issue is, honestly. Are they pre-testers who posted before the Day 1 patch went out on the 27th? That would be my only guess... but then, there are as many unique PCs as there are stars in the sky. Pinpointing issues like this is notoriously hard.

Yeah... makes sense. I just looked at Digital Foundry's review and he didn't say anything good... 12900K, and it looked like it was using half the cores... with his GPU utilization at 40-50%.

Mine was 70-80% last night. Why the big differences? Both running 4090s.

Anyway... will follow up with my core load numbers in a few.
 

Phaaze88

Titan
Ambassador
Why the big differences?
Because of the big differences in folks' hardware specs and onboard software/drivers.
Consoles? It's all the same stuff inside - the devs will know exactly what they're working with.
PC? No, some are just going to get the short stick and there's no helping it.
I believe that there are some out there not having issues running this game. They were more fortunate than others, or just happen to have the right combo of hardware and drivers/software.
 

Deleted member 2838871

Guest
Because of the big differences in folks' hardware specs and onboard software/drivers.
Consoles? It's all the same stuff inside - the devs will know exactly what they're working with.
PC? No, some are just going to get the short stick and there's no helping it.
I believe that there are some out there not having issues running this game. They were more fortunate than others, or just happen to have the right combo of hardware and drivers/software.

... and you'd be right. I've seen more complaints than positives about Hogwarts Legacy, The Last of Us, and now Jedi: Survivor... but I have had virtually no issues with any of them running at 4K 60 Ultra.

Of course... I built my PC a week ago so that probably has a lot to do with it.
 

Deleted member 2838871

Guest
What was individual core load? 20% CPU utilization means jack if only 2-4 cores are used. Considering that they say a 4c/8t CPU is enough for ultra settings, my guess is that only a few cores actually get loaded at any given time. Actually, HWiNFO confirmed that yesterday: 4 cores reached 80-90% utilization, while the rest went to maybe 50%. Total utilization was maybe 35% at worst. I have a 12700K.

It's backwards to build a game now that only uses 4 cores/8 threads when many CPUs have way more cores; spread the load out more and the GPU might not be sitting around waiting as long. The engine should scale more, though it might be a limitation of Unreal 4... I don't know; I can't see that the engine will use as many as you throw at it.

Well here it is... 4K 60 Ultra/Epic settings with RT on and it was a solid 60 fps with no stuttering... even in the last pic where I climbed up the wall that stuttered yesterday.

Pretty sure it's using more than 2-4 cores. It doesn't appear that any are being parked either.

[Screenshot: H7tyEud.jpg]

[Screenshot: eGdg6Ku.jpg]

[Screenshot: XwBKqpj.jpg]


Pretty much the same as yesterday... 70-80% GPU usage... 15-20% CPU usage... 12-14GB VRAM usage... 20GB RAM usage...

It doesn't perform as badly on my system as people are claiming in reviews... no doubt about that. Frametime is under 16.67 ms for 4K 60 too.
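For anyone wondering where the 16.67 figure comes from: it's just the per-frame time budget, 1000 ms divided by the target frame rate.

```python
def frame_budget_ms(target_fps):
    """Milliseconds available per frame at a given target frame rate."""
    return 1000.0 / target_fps

# 60 fps leaves ~16.67 ms per frame; 120 fps only ~8.33 ms.
print(round(frame_budget_ms(60), 2))
print(round(frame_budget_ms(120), 2))
```

If frametime stays under that budget, the game holds its target frame rate.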
 
Last edited by a moderator:
For me it's another story.
1440p Epic settings, ray tracing off = 50-60fps, CPU utilization about 40%, GPU utilization about 50%.
Ray tracing on = 30-45fps, CPU utilization drops to 30%, GPU utilization remains about the same.
But if I close the game while ray tracing is on and open it again, it goes back to 40% CPU utilization with 50-60fps and slightly higher GPU utilization.

I played it for about an hour and didn't see more than 10GB VRAM used.
 
Pre-release versions of 'Star Wars Jedi: Survivor' have revealed serious performance issues with the PC version. High-end GPUs like the RTX 4090 and 3080 Ti are only managing 35-50 frames per second on average.

I just googled the price of the 4090; the average price is £1600 to £1800... If a card as expensive as this struggles with the game, there is not much hope for those of us...

When issues like this occur, I always wonder what the specs are of the PCs that they write and test the games on.
 
Update: changed some settings in GameUserSettings.ini and now the GPU can actually run up to 100% (RT on) / 80% (RT off) with stable 60fps, while CPU usage is now higher.

Code:
[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
AllowAsyncRenderThreadUpdatesEditor=1

it is playable now
[Screenshot: vlhdRuD.jpg]
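If you'd rather not hand-edit the file, a small script can do the append. This is only a sketch under assumptions: the path below is a placeholder (the real GameUserSettings.ini lives under your AppData folder), the variable names are copied from the post above, and you should back the file up first.

```python
from pathlib import Path

TWEAKS = """
[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
AllowAsyncRenderThreadUpdatesEditor=1
"""

def apply_tweaks(ini_path):
    """Append the console-variable section unless it is already present."""
    path = Path(ini_path)
    text = path.read_text() if path.exists() else ""
    if "[ConsoleVariables]" not in text:
        path.write_text(text + TWEAKS)
    return path.read_text()

# Example (placeholder path -- point it at your own install):
# apply_tweaks("GameUserSettings.ini")
```

The presence check makes it safe to run twice without duplicating the section.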
 
Last edited:

KyaraM

Admirable
Well here it is... 4K 60 Ultra/Epic settings with RT on and it was a solid 60 fps with no stuttering... even in the last pic where I climbed up the wall that stuttered yesterday.

Pretty sure it's using more than 2-4 cores. It doesn't appear that any are being parked either.

[Screenshot: H7tyEud.jpg]

[Screenshot: eGdg6Ku.jpg]

[Screenshot: XwBKqpj.jpg]


Pretty much the same as yesterday... 70-80% GPU usage... 15-20% CPU usage... 12-14GB VRAM usage... 20GB RAM usage...

It doesn't perform as badly on my system as people are claiming in reviews... no doubt about that. Frametime is under 16.67 ms for 4K 60 too.
Interesting. And weird. My 12700K definitely behaves differently. Is this another instance of optimization for only AMD, maybe? That doesn't make much sense, though... but I do notice that only your lower cores (CCD1?) are used. So there does seem to be some parking going on.

For me it's another story.
1440p Epic settings, ray tracing off = 50-60fps, CPU utilization about 40%, GPU utilization about 50%.
Ray tracing on = 30-45fps, CPU utilization drops to 30%, GPU utilization remains about the same.
But if I close the game while ray tracing is on and open it again, it goes back to 40% CPU utilization with 50-60fps and slightly higher GPU utilization.

I played it for about an hour and didn't see more than 10GB VRAM used.
I think GameStar (a German magazine) noticed something similar. While you can switch settings on the fly, you lose about 10 FPS no matter which "direction" (turning on or turning off) you go. This sounds like that bug they mentioned.

I get that 30-40fps if I turn off ray tracing and then turn it back on; CPU usage drops, fps drops, GPU usage remains about the same.

EA version of the game (not Steam), 130GB game size.
I have no idea why they changed the storage to 155GB in the official requirements, honestly. The game is the same size everywhere.

Update: changed some settings in GameUserSettings.ini and now the GPU can actually run up to 100% (RT on) / 80% (RT off) with stable 60fps, while CPU usage is now higher.

Code:
[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
AllowAsyncRenderThreadUpdatesEditor=1

it is playable now
[Screenshot: vlhdRuD.jpg]
Interesting. Will have to check that out on my main system when I get home. Maybe I'll test it on the second system as well.

EDIT: I think I located the file. [...]AppData\Local\SwGame\Saved\Config\WindowsNoEditor correct?
 
Last edited:
Interesting. Will have to check that out on my main system when I get home. Maybe I'll test it on the second system as well.
So from the names of the variables, it seems like they messed up settings that are useful on console, but probably not that much on PC?
There are still some instances where fps goes to garbage while CPU usage is at full 8 cores... like the game ignores walls and renders whatever is behind them.
 

Colif

Win 11 Master
Moderator
No two PCs are the same, even if they have the exact same parts. So this forum is full of people posting because they watched a video and can't match the performance on their PC. It happens.

Just thought I'd mention that, as sometimes you just have to be happy with what you get.
 
  • Like
Reactions: Phaaze88

KyaraM

Admirable
There are still some instances where fps goes to garbage while CPU usage is at full 8 cores... like the game ignores walls and renders whatever is behind them.
Yeah, tested it on the second system just now. Koboh is quite smooth, except for one or two spots. But temperatures went down, it seems. Yesterday the GPU core hit 85°C at least once; today it went no higher than 75°C. Same for the CPU. No idea if it's related, though. But GPU utilization did stay over 90% at least, so probably?
 
No two PCs are the same, even if they have the exact same parts. So this forum is full of people posting because they watched a video and can't match the performance on their PC. It happens.

Just thought I'd mention that, as sometimes you just have to be happy with what you get.
Yeah well, so far I haven't had any issue with VRAM usage in this game at 1440p Epic... weird, right?
Just for giggles I disabled texture streaming, which is supposed to increase VRAM usage and loading times, but it remained in the same 10-12GB VRAM range.

So I enabled texture streaming, and now look at this:
[Screenshot: D56aqxG.jpg]


GameUserSettings.ini now has this:
Code:
[SystemSettings]
bForceCPUAccessToGPUSkinVerts=False
OneFrameThreadLag=1
FinishCurrentFrame=0
TextureStreaming=1
r.Streaming.UseFixedPoolSize=0
r.Streaming.PoolSize=0
r.Streaming.FullyLoadUsedTextures=1
r.Streaming.LimitPoolSizeToVRAM=0
r.Streaming.AmortizeCPUToGPUCopy=1
r.Streaming.MaxTempMemoryAllowed=16

[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
 

KyaraM

Admirable
Yeah well, so far I haven't had any issue with VRAM usage in this game at 1440p Epic... weird, right?
Just for giggles I disabled texture streaming, which is supposed to increase VRAM usage and loading times, but it remained in the same 10-12GB VRAM range.

So I enabled texture streaming, and now look at this:
[Screenshot: D56aqxG.jpg]


GameUserSettings.ini now has this:
Code:
[SystemSettings]
bForceCPUAccessToGPUSkinVerts=False
OneFrameThreadLag=1
FinishCurrentFrame=0
TextureStreaming=1
r.Streaming.UseFixedPoolSize=0
r.Streaming.PoolSize=0
r.Streaming.FullyLoadUsedTextures=1
r.Streaming.LimitPoolSizeToVRAM=0
r.Streaming.AmortizeCPUToGPUCopy=1
r.Streaming.MaxTempMemoryAllowed=16

[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
This is completely missing from my file oO
What's going on?