News Nvidia RTX 4000 and 5000 series owners instructed to lower quality settings in Hell is Us demo — Developer urges users to turn down settings and di...

You can only blame the engine so much; the devs need to take the blame here. Nobody forced them to release this trash demo.

In their defense, the devs can't control that Nvidia's 4000 and 5000 series hardware sucks on this demo lol.

Nvidia's driver support has just been pants. They're so busy with AI that the rear-view mirror isn't even functioning.

By the time Nvidia takes notice, AMD and Intel will be on their doorstep, because hopefully by then SteamOS will be in full swing. Nvidia drivers on Linux aren't exactly great; they have improved, but can still be quite bad.
 
Is this game just awfully optimized?

The graphics in this game are in no way proportional to the amount of processing power required to generate those graphics.

A 4090 should be able to do graphics like these at 4K60 or more, easily.

A game that brings such a GPU to its knees should look miles better than anything else, and this game really doesn't.
I think we've found that you can make even the most mediocre graphics chug along if you saturate the frames with ray tracing.

Which, to this day, I still think is the most overrated, power-sapping graphical feature on the market.

I have yet to be "wowed" by a game with it on that didn't already wow me with it off.
 
Guess 32GB isn't enough to run it; gonna need at least 48, preferably 64.
...And I caught flak for tossing 64GB in my rig when I built my AM5 system. Looks like I was ahead of the curve. 32GB didn't last long as a solid base RAM spec; it was briefer than I expected, and it makes me eager to hit 128GB whenever I upgrade to Zen 6, as I always like to have copious amounts of RAM to do other things while I game without feeling a pinch.
 
...And I caught flak for tossing 64GB in my rig when I built my AM5 system. Looks like I was ahead of the curve. 32GB didn't last long as a solid base RAM spec; it was briefer than I expected, and it makes me eager to hit 128GB whenever I upgrade to Zen 6, as I always like to have copious amounts of RAM to do other things while I game without feeling a pinch.

Oh we're talking VRAM.
 
Lazy and sloppy optimization has become acceptable because of three factors, each represented by a different group:
1 - Developers - The ease of pushing patches, whereas back in the day what you shipped was more or less final. There were occasional revisions, but usually it had to be right out of the gate. Now there's no incentive to get it right from the start, which leads me into #2.

2 - End-Users - Developers know simpletons will preorder or buy a game regardless, even on a promise of tomorrow rather than what's delivered today. It's also why PC gaming will eventually look like mobile gaming: free, shallow games riddled with micro-transactions, where whales (a subcategory of simpletons) will eventually pay for something in a "game" that cost virtually nothing to make.

3 - Hardware Manufacturers - Pointing hard at you, Nvidia, but also AMD. And no, I'm not an AMD fanboy; I have a 4090. Pushing all the frame-gen crap feeds into point #1 of lazy developing and then hurts the consumer rather than benefiting them. Can't remember the game, but you needed frame-gen to get 1080p@60fps? Idiotic. Nvidia needs to stop pushing frame-gen. Improving the original is fine I guess, it's one fake frame and no different than a pull-down on TVs. But 4 fake frames to achieve perceived smoothness and not actually better performance? Nope. Stop. Back to basics. Get the drivers right, stable, and importantly not 500MB+ in size.
 
Gave the demo a try after reading this article. Honestly, the graphics suck. It feels like a UE4 game, and in no way justifies the hardware requirements specified. It screams poor optimization. I was expecting Hellblade II level graphics when I read you need a 4090 for 4K30. I can't put all the blame on the devs - UE5 really needs a massive optimization overhaul.

IMHO, Hellblade II is the finest looking game to date.

Especially at the beginning, it feels like you’re playing through a high quality cinematic.

I can recall hitting 45-55 FPS (rarely 60) with my 4090 and it still felt smooth, due to the nature of this game.
 
I like the 'look' the UE5 engine gives, but the performance is brutal. There are other game engines out there that look just as good IMO (id Tech 8, for example).
Especially "Days Gone" looks very similar to this game. Hopefully somebody does a video comparison to show you can get the same graphics with 3x more fps xD.

But yeah, the idea of RT and UE5 was not to improve graphics but to shift complexity and cost in programming from the developer to the consumers' more powerful and costly GPUs.
 
Wait, but Nvidia always has better drivers /s

Somehow I have a feeling that they have been leaning too much into the AI aspect and DLSS, to the point that stability is hard to guarantee and optimization is very difficult now.
 
Is this game just awfully optimized?

The graphics in this game are in no way proportional to the amount of processing power required to generate those graphics.
Even if they optimize it, UE5 is just inefficient; it looks very good, but not the best. For UE6 they announced CPU multithreading xD. UE5 mostly runs on only 1-2 cores.

The 4090 recommendation for 4K60 with DLSS won't change. But on my 5090 the power usage is weird... It gives me 40 fps native at 1440p DLAA, all ultra, but it uses only around 320 W at 100% GPU usage. The CPU hovers around 20-40% per core.

With frame gen enabled, the VRAM goes up a bit to around 12GB, but power stays the same.
 
If an intro is ripping a 4090 a new one, then it's not the card's fault but the intro being over the top. Yes, we all like a bling intro, but not when you need stupid specs to see it. And as more fuel for the 8GB-card debate: those cards have a place, maybe at entry-level 1080p gaming or in older titles that players still prefer.
 
Haha, sorry to push two videos of Daniel in the same thread, but I think this is very much related to the underlying UE5 conversation topic:

View: https://www.youtube.com/watch?v=BQKCzl1KAsI


Looks like it can work well when used correctly? Or at least, taking the time to optimise for it?

EDIT: I forgot to say that the minor revisions to UE5 have brought important improvements and changes that actually increase performance and quality. UE5.4.x vs UE5.6 is quite a big jump, from what I've been told and had explained to me.

Regards.
 
EDIT: I forgot to say that the minor revisions to UE5 have brought important improvements and changes that actually increase performance and quality. UE5.4.x vs UE5.6 is quite a big jump, from what I've been told and had explained to me.
This is true.

I had the 5.0/5.1 UE5 Matrix demo and also got a UE 5.4 version, and that had a lot of improvements; 5.6 should be even better.
 
Oh we're talking VRAM.
Sorry I meant to reply last night but needed to crash out. So you just got the thumbs up.

When I first read the related posts, I started off thinking you were talking VRAM. But as I read on and you got to 64 GB... I thought surely I was misreading this. There is no way something can need 64 GB of VRAM. SMH. And thus my boneheaded post.

This is just insane. Nvidia might have been shorting us on VRAM, particularly with non-flagship GPUs getting worse as you go down the stack. But for a demo to need such copious amounts of VRAM is a wee bit out of control. I suspect the devs will clean this game up before release, but this sounds like some seriously sloppy optimization. I haven't played the demo yet myself to take a peek. I'll be doing that here in a bit.

Anyways, thanks for the clarity. My foot didn't taste particularly great, but I'm always happy to learn something new. I'll post again in a bit after I check this dumpster fire (?) out.
 
Nvidia needs to stop pushing frame-gen. Improving the original is fine I guess, it's one fake frame and no different than a pull-down on TVs.
While I agree that it can lead to laziness in developers, we seem to be approaching a brick wall in terms of performance without that feature and without a 1000 watt video card.
 
While I agree that it can lead to laziness in developers, we seem to be approaching a brick wall in terms of performance without that feature and without a 1000 watt video card.
That's because it's becoming so common it's normal. Take Hi-Fi Rush: nobody knew it was even being made, it ran on virtually everything, looked great, and performed great. Doom Eternal runs on a piece of toast, with or without jelly. Tons of others don't need to be individually listed.

Another contributor is that studios are relying more on pre-built engines than in-house engines. Canned solutions focus on the "uuuuuu, wowww" factor, cost, and ease of development, not the overall experience. As long as it sells, that's all that matters.

Again, frame-gen as an assist is fine. Requiring it is not OK, however. Multiplier frame-gen is a dangerous precedent that makes it that much more likely to become a requirement next, or at the very least makes basic frame-gen a requirement for everything.
It's all about cost.

Just like you can convert today's dollar/euro/whatever to its value 20 years ago, you can do the same for games. The technology, budget, and time it took to develop and release a game 10-15 years ago converts to mere seconds of actual effort today.

Anyway, switching out of "get off my lawn" old-man mode. Jensen needs to go, but that's unlikely since he's bringing in the dollars.
 
Frame Gen is not performance, it's just frame interpolation, aka motion blur.
It is perceived performance. If the majority of users hated it, they wouldn't be pushing it.

It's all about a tradeoff: do you want stuttering and low fps, or do you want slightly laggy but smoother movement?

The problem seems to be that, with power draw going up significantly in Nvidia's 5xxx generation for diminishing gains, we're nearing a limit to what smoothness can be achieved without FSR/DLSS. Unless some major breakthrough is coming to GPU technology, this is seemingly the only solution at the present time.
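
To illustrate what "just frame interpolation" means, here's a minimal sketch of plain linear frame blending in Python (my own toy example; DLSS/FSR frame generation actually uses motion vectors and AI/optical-flow models rather than a straight blend). The point is that the generated frame is computed entirely from frames the GPU already rendered, so it adds perceived smoothness but no new input sampling:

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Blend two already-rendered frames; t=0.5 lands halfway between them."""
    blended = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

# Two solid-color 1080p "frames"; the generated frame is simply their average.
prev_frame = np.full((1080, 1920, 3), 100, dtype=np.uint8)
next_frame = np.full((1080, 1920, 3), 200, dtype=np.uint8)
fake_frame = interpolate_frame(prev_frame, next_frame)
print(fake_frame[0, 0])  # [150 150 150]
```

Which is also why latency can't improve: the blended frame contains nothing the two real frames didn't already have.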
 
It is perceived performance. If the majority of users hated it, they wouldn't be pushing it.

Yes, imaginary performance: as if we close our eyes and just imagine that there are more frames being displayed. We can all raise our FPS with the power of friendship.

Seriously, MFG will always be a gimmick. People see right through that BS, which is why Nvidia played dirty games with its review packaging recently.