News Nvidia RTX 4000 and 5000 series owners instructed to lower quality settings in Hell is Us demo — Developer urges users to turn down settings and di...

Is this game just awfully optimized?

The graphics in this game are in no way proportional to the amount of processing power required to generate them.

A 4090 should be able to handle graphics like these at 4K60 or better with ease.

A game that brings such a GPU to its knees should look miles better than anything else, and this game really doesn't.
 
Well, in short, yes. They have about three months to optimize the game's graphics and performance. Between the devs' internal optimizations and working with AMD and Nvidia on drivers, it could very well see 20% higher fps at the same graphics settings, or even far more given the seemingly low-ish bar that's been set. It might also be a game like Cyberpunk that takes years to run stably for just about everyone... we'll just hope that isn't the case.
 
Dropping from Ultra to Very High is a nice boost in performance and looks pretty much the same.
 
Historically it was AMD/ATI that had unstable drivers 😛
And that is the reason I historically went Nvidia. But now the shoe is on the other foot, plus Nvidia is acting pretty horribly toward the consumer base that built it up. So I'll take my chances with AMD whenever my 4090s age out.

As for these settings, this is more proof we need something other than the UE game engine in the mainstream. Personally, I'd love to see more of id Software's id Tech engine. It looked gorgeous in Doom: The Dark Ages. Regardless, the stutter-fest that is UE needs to be kept in check. Even the consoles have issues with frame hitches/drops. It just isn't the smoothest engine out there, IMHO.
 
Gave the demo a try after reading this article. Honestly, the graphics suck. It feels like a UE4 game and in no way justifies the specified hardware requirements. It screams poor optimization. I was expecting Hellblade II-level graphics when I read you need a 4090 for 4K30. I can't put all the blame on the devs - UE5 really needs a massive optimization overhaul.
 
As for these settings, this is more proof we need something other than the UE game engine in the mainstream.

Fully agreed.

In most higher-settings situations, UE5 is only viable through the use of Frame Generation.

But if you activate that, VRAM consumption goes up and you add a lot of unwanted latency.

The only properly-optimised-from-day-one UE5 game I've seen so far is RoboCop: Rogue City, which is a weird mixture of poorly designed characters and gorgeous-looking surfaces.
 
Just tried the demo.

At 4K Ultra DLAA, I was getting anywhere between 46 and 62 FPS (53 most of the time).

I had to resort to a combination of DLSS Ultra Quality + Frame Generation in order to reach an average of 75 FPS.

Sometimes the environments look great, and other times... not so much.

P.S. The game crashed the first time I ran it, and I had to undo the 1,000 MHz memory boost I'd given my 5090 through MSI Afterburner.
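For a rough sense of what DLAA vs. DLSS Ultra Quality means in raw pixels at a 4K output, here's a quick sketch. The per-axis scale factors are the commonly cited ones and are assumptions on my part, not values confirmed for this game; "Ultra Quality" in particular isn't a standard preset in most titles, with ~0.77 being the figure usually quoted where it does appear.

```python
# Rough sketch: internal render resolution vs. native 4K for common DLSS presets.
# Scale factors are commonly cited per-axis values, NOT confirmed for this game.
NATIVE_W, NATIVE_H = 3840, 2160

presets = {
    "DLAA (native)": 1.0,
    "DLSS Ultra Quality (assumed ~0.77)": 0.77,
    "DLSS Quality": 0.667,
    "DLSS Balanced": 0.58,
    "DLSS Performance": 0.50,
}

native_pixels = NATIVE_W * NATIVE_H
for name, scale in presets.items():
    w, h = int(NATIVE_W * scale), int(NATIVE_H * scale)
    print(f"{name:36s} {w}x{h}  ({w * h / native_pixels:.0%} of the native pixel load)")
```

On those assumed factors, Ultra Quality renders roughly 60% of the native pixel count, which is why it plus Frame Generation can lift a ~53 FPS native average toward 75.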
 
Back at you. Fully agree... side note, loved RoboCop.

Frame gen is a last-resort technology. It works, mostly, to smooth things out in a pinch. But if you're not hitting 100-120 fps bare minimum with it on, you end up with a blurry, gross image that has so much lag it becomes unplayable. And even those numbers are engine-specific. In anything UE, I need to hit 144 fps for there to be a chance of a smooth(ish) experience (cough, still probably stutters, cough cough). Give me id Tech 8, and if I'm above 100 FPS with FG, everything is golden. Plus, even when the input lag is bearable, you wouldn't want to use it in a competitive game regardless of how Nvidia spins it.

I get frustrated with the input lag of x2 as a 4090 user. I can't imagine x4 MFG. It would have to feel like an old TV using frame interpolation outside of game mode to hit 240 Hz, achieving the soap opera effect. I have seen some benchmarks where latency barely increases in select games, but they seem to be the exception and not the rule. 0.43 ms or higher didn't seem uncommon for x4 MFG. No thank you, 120-144 fps is plenty. Until frame gen is latency-free (never going to happen) or at least costs less than 0.05-0.1 ms, I don't see it being the big deal Nvidia claims it to be.
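To put rough numbers on that feel, here's a back-of-the-envelope sketch. This is my own simplified model, not Nvidia's published pipeline figures: it assumes 2x frame generation roughly doubles the presented frame rate, holds back about one rendered frame for interpolation, and adds a small fixed overhead (the 5 ms is a guess, not a measurement).

```python
# Back-of-the-envelope model of 2x frame generation (illustrative assumptions only):
# - presented FPS is roughly 2x the rendered FPS
# - added latency is roughly one rendered frame held back for interpolation,
#   plus a small fixed overhead (guessed at 5 ms here, not a measured figure)
FG_OVERHEAD_MS = 5.0  # assumed

def fg_estimate(rendered_fps: float) -> tuple[float, float]:
    frame_time_ms = 1000.0 / rendered_fps      # time to render one real frame
    presented_fps = rendered_fps * 2           # idealised 2x output
    added_latency_ms = frame_time_ms + FG_OVERHEAD_MS
    return presented_fps, added_latency_ms

for fps in (40, 60, 100, 120):
    presented, extra = fg_estimate(fps)
    print(f"{fps:3d} rendered FPS -> ~{presented:.0f} presented FPS, ~{extra:.1f} ms extra latency")
```

On this rough model, a 40-60 FPS base adds around 20-30 ms of lag while a 100-120 FPS base adds closer to 13-15 ms, which tracks with the feeling that FG only works well when the frame rate is already high.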
 

Yeah, Rogue City is a fine game. It captures the look, sound, and overall feel of the first two RoboCop movies exceptionally well.

As for MFG x4, it helped me immensely in both Wukong and Oblivion Remastered (4K Ultra), where I would hit 120+ FPS without the use of DLSS.

But yeah, it still needs a lot of refinement before it reaches a mature level.

Until it does, I expect to see more of Nvidia's laughable claims, like the 5070 being able to reach the 4090's level of performance.
 

With you on this; I stay away from upscaling and FG.

And I'm thankful I went 1440 UW, which allows me to run everything native with enough fps. Going 4K forces upscaling and FG on even the mighty 4090 and 5090 (rough pixel math in the sketch after this post).

And for me personally, I won't move up in resolution until I get a GPU that can do 4K native without needing upscaling and FG to get there.

Yeah, Rogue City is a fine game. It captures the look, sound, and overall feel of the first two RoboCop movies exceptionally well.

Rogue City is a great game. I just picked it up for $12 and hope to have it finished before the sequel comes out.
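For anyone wondering how much lighter 1440 ultrawide is than 4K, the raw pixel math is below. The panel size is an assumption on my part, since "1440 UW" most often means 3440x1440 but can also be 3840x1600.

```python
# Raw pixel-count comparison: 1440p ultrawide vs. 4K (panel size is an assumption).
uw_w, uw_h = 3440, 1440      # common 21:9 ultrawide
uhd_w, uhd_h = 3840, 2160    # 4K UHD

uw_pixels = uw_w * uw_h      # 4,953,600
uhd_pixels = uhd_w * uhd_h   # 8,294,400

print(f"4K pushes {uhd_pixels / uw_pixels:.2f}x the pixels of 3440x1440")
# -> roughly 1.67x, which is why a card that runs 1440 UW natively
#    often needs upscaling and/or FG at 4K
```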
 
Historically it was AMD/ATI that had unstable drivers 😛
Yeah... this is patently untrue, and I challenge you to actually cite a source showing that AMD has had worse drivers. When you look at driver history, multiple crash studies have shown that GeForce drivers actually cause more issues and have more crashes/problems.

AMD has had fewer features, and there were black-screen issues when the Navi RDNA 1 GPUs came out, which was actually a major hardware issue, not a driver issue. But alas, the drivers were blamed, and since the cards were less popular, the "worse drivers" urban myth set in.

Worse as in fewer features... sure.
Worse for stability/system issues... no.
 