Pre-Launch 'Jedi: Survivor' Eats 21GB of VRAM, Struggles on RTX 4090

I find these things particularly annoying. The number of game releases that aren't fully optimized is getting to be a bit of a joke. Not just a joke, but a running theme.

Game devs release this tripe, charge mega bucks (I remember a time not so long ago when PC games were always cheaper than the console variants) and then use us as guinea pigs.

They then spend months trying to fix all the issues that have arisen from a poorly executed release.

On the other hand, even with optimizations, the hardware requirements for this one are steep.
 

_dawn_chorus_

Distinguished
Aug 30, 2017
563
56
19,090
It's less an optimization problem and more Nvidia being an utter cheapskate with their contemptible VRAM rationing. No new card should have less than 16GB of VRAM these days, let alone a 700-plus-dollar one.
That's another problem, but this is 100% an optimization issue.

"EckhartsLadder, in particular, noted that 50 fps was the maximum frame rate he was able to achieve consistently on his RTX 3080 Ti no matter what, and changing the game's graphical settings did nothing to improve performance."

If changing settings does nothing to lower VRAM usage or raise fps, then it is indeed a mess.
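For anyone who wants to sanity-check this on their own rig instead of taking a YouTuber's word for it: below is a minimal Python sketch (assuming an NVIDIA card and NVIDIA's NVML bindings, installed with pip install pynvml) that polls total VRAM usage every couple of seconds. Alt-tab, flip a texture setting, and watch whether the number actually moves.

```python
# Minimal VRAM monitor using NVIDIA's NVML bindings (pip install pynvml).
# Run it in a terminal while the game is running, change a setting in-game,
# and watch whether reported VRAM usage actually drops.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0; adjust on multi-GPU rigs

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # sizes are in bytes
        print(f"VRAM used: {mem.used / 2**30:5.1f} GB / {mem.total / 2**30:.1f} GB")
        time.sleep(2)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Keep in mind this reports device-wide allocation, not just the game's, so close other GPU-heavy apps first; running nvidia-smi in a loop gives the same numbers if you'd rather skip Python.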
 

PEnns

Reputable
Apr 25, 2020
702
747
5,770
Ah yes, and not so long ago they said 640 KB of RAM was more than anybody would ever need!!
Lo and behold, everybody who agreed was proven wrong... big time.

AMD was right when it said, merely 4 weeks ago, that the lack of VRAM would bite Nvidia (and its shills) in its shortsighted but very greedy butt!!

I have an idea: Get used to it!!!
 

Metteec

Distinguished
Jan 12, 2014
42
32
18,560
I learned my lesson after The Last of Us Part I on PC: never pre-order a single-player game ever again. The list of pre-order performance fails in 2023 is already long: Hogwarts Legacy, Dead Space, Forspoken, Diablo IV, Wild Hearts, Jedi: Survivor, and The Last of Us Part I. Averaging $63 each at launch, it is just too expensive to reward developers for lazy programming. There have always been a couple of poorly performing AAA games each year, but I do not remember this many issues since Ultima IX.

Also, this is not an issue of a lack of VRAM. The 3090 and 4090 both have 24GB of VRAM, and they still struggle to play this game. The issue is optimization and focusing too heavily on the PS5 and Xbox.
 
Also, this is not an issue of a lack of VRAM. The 3090 and 4090 both have 24GB of VRAM, and they still struggle to play this game. The issue is optimization and focusing too heavily on the PS5 and Xbox.

Exactly. This has been going on for some time too: the shift of developers centering game development around consoles. It started happening with the Gen4 consoles (PS4, XB1). Perhaps the most well-known one that burned PC gamers was the Batman: Arkham Knight debacle. What a dumpster fire that was, and by the time it was patched into a fully playable state nearly a year later, people had already moved on to other games. Fortunately I got my copy for free with an EVGA GTX 970 GPU purchase (my second 970, to run SLI), among the best GPU power of the time. Totally unplayable.
 

zx128k

Reputable
Most gamers have 8GB of VRAM; next are those with more, and then those with 6GB. The most common GPU is the NVIDIA GeForce RTX 3060 (source).

Nvidia basically controls most of the market (Jon Peddie Research).

For decades, the market for desktop graphics cards has had two players: AMD (formerly ATI) and Nvidia. Intel decided to enter those hotly-contested waters in 2022 with its Arc series of GPUs along with similar dedicated offerings for laptops. According to a report on graphics card market share, Intel managed to grab 9 percent of dedicated desktop GPU sales by the end of the 2022 calendar year—the same amount of the market that AMD had for the same timeframe.

And 9 percent market share isn’t exactly something you parade in front of the shareholders, especially when Nvidia continues to dominate with a near-monopoly at 82 percent.

On PCWorld’s The Full Nerd podcast, Intel’s Arc spokesman Tom Petersen told us, “If you think about it, we’re one of the few companies in the world that can enter a large market like discrete graphics…Nvidia will probably continue to ignore us, but AMD cannot ignore us. And it’s going to become much more competitive over the years as we become more established.”
source

Why would a developer create a game that most gamers can't run well?
 
  • Like
Reactions: KyaraM
"are only managing 35-50 frames per second on average."
 

Deleted member 2838871

Guest
I learned my lesson after The Last of Us Part I on PC: never pre-order a single-player game ever again. The list of pre-order performance fails in 2023 is already long: Hogwarts Legacy,

Also, this is not an issue of a lack of VRAM. The 3090 and 4090 both have 24GB of VRAM, and they still struggle to play this game. The issue is optimization and focusing too heavily on the PS5 and Xbox.

Last of Us and Hogwarts both play great on mine... 4K 60 Ultra.

Yet gamers will purchase this crap game and cry afterwards that their PC is struggling.

Hahah... I got it for free with the Ryzen purchase... and Redfall, which comes out in a few days, I got with the 4090 purchase.

If they both suck... I don't care. :ROFLMAO: :ROFLMAO:

This is the game AMD is giving out with new 7000X (including 3D) purchases. I wonder how many players eager to test their new builds are going to be driven to despair by these framerates.

I'll follow up with my framerate. Interested to know if this game can bring my beast PC to its knees.
 
  • Like
Reactions: Why_Me and Sluggotg

CeltPC

Distinguished
Jun 8, 2017
75
55
18,610
I will also get this game free for buying my 7900X. Hogwarts Legacy plays great at 4K on my rig, so hopefully the same will hold true on this 7900X, RTX 4080, 32GB DRAM PC. Apparently there will be a release-day patch that might also clean up the pre-release version's woes.