News Remnant II Devs Designed Game With DLSS/FSR 'Upscaling In Mind'

emike09

Distinguished
Jun 8, 2011
There are many better-looking games that run much smoother. I was excited to play a full UE5 title so thought I'd give it a try, but ended up refunding it (for now). Nanite sure does make objects look great, but shadows are a joke and constantly flicker, especially in cutscenes, and the lighting is flat and boring. 30-40fps in native 4K on my 4090 is unacceptable for a game that looks like it came out 5 or 6 years ago. I guess I see why they skimped out on Lumen or other RT rendering. I also hoped the UE5 engine would have overcome poor development choices. I'm not a developer so I can't speak much towards optimization, but this is ridiculous.
 
And this is why I've always said any upscaling for PC games is stupid. Keep console stuff on consoles.

Enjoy crappier and crappier performance in games thanks to parroting DLSS (and even FSR/XeSS) as a "feature". Talk about drinking the whole Kool-Aid bottle and then some.

Cynicism aside, I hope people don't give them a pass for this and actually push back with a "yeah, no".

Regards.
 
"To hit 60 fps at 1080p ultra settings — a configuration modern 60-class cards can achieve with most titles — Owned had to jump up from the RTX 2060 to an RTX 3080 12GB."

I know it's an autocorrect typo, but I found it funny. Going to start calling him Daniel Owned now ;)
 
Apr 1, 2020
Translation: In 6 years GPUs have only about doubled in speed (the 4090 doesn't count, as it's a Titan-class card), so since they've stalled, you need to reduce your IQ (image quality) to play our game. And since nobody wants to admit they reduced their IQ in 2023, you must use "upscaling techniques" that let you say you're running max settings.
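For context on the "max settings at reduced IQ" point, here's a rough sketch of the pixel-count math behind common upscaler quality modes. The per-axis scale factors below are the publicly documented defaults for DLSS 2.x / FSR 2.x (Quality ≈0.667x, Balanced ≈0.58x, Performance 0.5x); they're illustrative, not specific to Remnant II or any particular title:

```python
# Per-axis render-scale factors commonly documented for DLSS 2.x / FSR 2.x.
# Treat these as illustrative defaults; individual games can override them.
SCALE_FACTORS = {
    "Native": 1.0,
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE_FACTORS[mode]
    return (round(out_w * s), round(out_h * s))

# At 4K output, show how few pixels are actually rendered per mode.
for mode in SCALE_FACTORS:
    w, h = internal_resolution(3840, 2160, mode)
    share = (w * h) / (3840 * 2160) * 100
    print(f"{mode:>17}: {w}x{h} ({share:.0f}% of the output pixels)")
```

So "4K Performance mode" is really rendering 1920x1080 internally, a quarter of the output pixels, which is why the fps numbers look so different from true native.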
 

mhmarefat

Distinguished
Jun 9, 2013
RT and nvidia are to blame for this freak show, not only in this game but in all the other poorly optimized modern games that have made jokes of themselves by hiding completely behind upscaling tech. DLSS/FSR was supposed to make the so-called "RT" performance tax acceptable, yet 6 years later games are releasing without "RT" while keeping the upscaling clown show, without bothering to truly optimize. Thank you, nvidia (and everyone who supported nvidia's greed show). The best part? These billion-dollar companies are now telling us "RT" was all an illusion and "Path Tracing" is the real "RT". So be prepared for $10k graphics cards if you want "RT". For the rest of us, all our games are being ruined, RT or not.
 
Jul 11, 2023
"This is what was feared: devs are now using DLSS and frame generation as a crutch instead of optimizing."
And what's wrong with how developers use upscalers?
What's wrong with upscalers in general?
99.9% of the content people consume involves quality loss (MP3, MP4, JPEG...), tons of it, and then suddenly some nerds decided to pile on DLSS, whose flaws you need a microscope to find.
You've decided for yourself how things should be, without taking into account all the realities of chip production in 2023 and the stagnation in the gaming industry.
But when ordinary gamers point out that the hate on the 4000 series was largely unjustified, "specialists" like you quickly get shut down.
 

umeng2002_2

Commendable
Jan 10, 2022
At 1440p and DLSS Quality, you can still tell when it's on. It's not nearly as bad as simple bilinear scaling, but it's not imperceptible. In a lot of titles where I use DLSS, I'm still injecting AMD CAS with ReShade to bring the sharpness back up to near native. Add to that that a lot of games aren't adjusting their texture mip LOD bias when DLSS or other upscalers are engaged, and you get an even more compromised experience.

Look at the Witcher 3 RT updates. They bloody just used a DX11-to-DX12 wrapper and told people to just use frame generation because of the CPU penalty of d3d11on12.
 