My system basics are an 11900K, RX 7800 XT, and 32GB RAM, running the game benchmark off an SSD as recommended. My 'low' was 79, my 'high' well up in the 110s, average ~95 fps. Temps were well controlled, but since the bench is so short, that probably isn't the full story there. This was at 1440p on the default settings, which were listed as "very high".
My own experience is the game looks really slick, if not groundbreaking. The aspect that impressed me was how smooth everything was. Even looking over at the edges of the monitor, the motion was slick as butter.
You think a 145-second benchmark is "short"? Because that's on the longer end of the built-in benchmark range. The only games with a built-in test that immediately come to mind as being longer are Horizon Zero Dawn (175 seconds), GTAV (about six minutes, with five different scenes and loading screens between each), and RDR2 (basically the same as GTAV).
Anyway, if you just want to play the game, the recommended settings might be fine. If you want to provide data to others, I'd list the exact settings used — what scaling was set to, whether framegen was enabled, whether RT was off, and whether the global preset was on "very high" or if that's just some of the options. Averaging 95 fps means your rig can easily handle the selected settings, but I also suspect you have 50% scaling and framegen enabled.
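Something like this is all I'm asking for: a hypothetical sketch of a complete report (the field names are mine, not from any benchmarking tool), with the question marks marking the bits the original post left out:

```python
# Hypothetical example only -- the field names are mine, not from any tool.
# These are the knobs that change what an "average fps" number means.
benchmark_report = {
    "gpu": "RX 7800 XT",
    "resolution": "2560x1440",         # output/target resolution
    "global_preset": "Very High",      # or list individual overrides
    "upscaler": "?",                   # DLSS / FSR / XeSS / none?
    "render_scale": "?",               # e.g., 0.67 for "Quality" mode
    "frame_generation": "?",           # on or off?
    "ray_tracing": "?",                # off, medium, very high?
    "avg_fps": 95,
    "min_fps": 79,
}
print(benchmark_report)
```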
I wasn't saying you were being disingenuous, just that you didn't state exactly what settings were used. That's helpful information for others, is all. I did list exactly what I used in my testing. Based on your PC specs and results, I can make an educated guess as to how that compares to my results, but I don't know for certain if that guess is correct.

Well, it is important to note here that I started the benchmark, let it do its shaders, and away I went with it. So I was basically a witness to the settings the game devs consider a proper default configuration.
Why would one consider that I am being disingenuous using "default" settings?
Did you include that metric in your tests, or did it not come back with the results you wanted to stand on the soapbox for?
And yes, since I use an AIO cooler, a couple of minutes isn't long enough to saturate that cooler, nor for case temps to rise enough for the other components to level out.
I guess the point here, in my eyes, would be: who is this benchmark for, presented as it is? It shows far lower results than a real-world situation where the user installs the game, runs it at native resolution and default settings, and doesn't modify things for this custom... what did you call it... "rasterization only" result?
If anything, I feel like you should check your own motivation rather than trying to point out faults in my out-of-the-box experience.
.02
The audacity of SCALING the game at 67% and misleading people about resolutions is breathtaking...

Yes, I have. What GPU are you playing it on? Because if you don't have a GPU that can run the full RT at very high, like an RTX 4070, it's not worth it. If you have a top-tier Nvidia GPU, though, it looks much better in action than the non-RT stuff. Shadows don't flicker around and go blobby, water and reflections look nicer, and the lighting looks more accurate (stuff gets indirectly lit, so dark areas aren't always quite as dark).
Does it make the game better? Not really, as I say multiple times. Don't just read the headline and decide you know what I'm thinking. My point in the headline is that this is a tour de force for Nvidia's ray tracing hardware. It looks better, and it only works well on Nvidia GPUs. Nothing else comes close. My point isn't that you should feel bad if you can't enable the very high RT setting and get good performance, but if you have an RTX 4070 or above? Sure, you can run it at max settings and it looks amazing and plays just fine.
Upscaling is here to stay, especially as games become more demanding. So when I say "1080p with 67% scaling via DLSS/FSR/XeSS," that's fundamentally different from "native 720p." If you don't have RTX hardware and can't run DLSS, though, I can understand why people would think upscaling isn't that great. DLSS > XeSS > FSR 2/3: the first two only cause a minor drop in image fidelity for a boost in fps, while the last causes clear image degradation, even at the "Quality" setting.
Absolutely not in the same category. Crysis was literally a revolution; this is just another Nvidia-sponsored title using GameWorks and crippling performance over an underwhelming use of RT.

This is another one of those Crysis moments. They made the game super demanding to "showcase" graphical technology.
Hopefully they didn't forget about the gaming fun when they designed it.
Native is the ONLY benchmark!

Don't put words in my mouth. I didn't say it was "wrong" to run at native, only that it will further reduce performance from what our benchmarks show. In effect, you're coming back and saying that I'm wrong for using upscaling, which is pretty hypocritical when you think about it.
Since I'm the one choosing what settings to test, I get to decide what makes sense. I'd argue that I've already gone way too far in testing full RT modes, since so many GPUs can't handle those settings, but it's interesting to see just how far we still have to go before full RT can go mainstream. But whether I test with upscaling in all cases, or only at native, or a mix of both? There's no "wrong" answer, only data to analyze.
I ran tests at three target resolutions, all with the same upscaling factor, so in general you can expect to see similar relative performance if you want to run at native. There are image quality differences between DLSS, XeSS, and FSR, but the general performance uplift each offers is pretty similar.
If you want to know how native 1440p runs, look at the 4K upscaled results, since those use 1440p and then apply FSR/DLSS/XeSS to get to 4K. Native would be slightly faster than that, since there's overhead with upscaling that wouldn't be present. Native 4K would probably be less than half the 1440p performance. Native 1080p will be roughly on par with the 1440p upscaled results.
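If it helps, here's the render-resolution arithmetic behind that mapping, a quick sketch assuming the 67% per-axis scale used in testing:

```python
# With a 67% per-axis render scale, each target resolution is actually
# rendered at roughly these sizes before the upscaler reconstructs the output.
scale = 0.67
targets = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in targets.items():
    rw, rh = round(w * scale), round(h * scale)
    print(f"{name}: rendered at {rw}x{rh}, i.e. roughly {rh}p")

# Output: 1080p -> 1286x724 (~720p), 1440p -> 1715x965 (~960p),
# 4K -> 2573x1447 (~1440p)
```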
"Native" is just another quality knob, and it doesn't always behave in an ideal fashion. And are we talking native with DLAA, native with TAA, native with FSR, or native with XeSS? Those definitely don't all look the same, so it's not equivalent work, though DLAA generally looks the best. There are still plenty of rendering errors / anomalies even at native — different perhaps than what you get with DLSS/FSR/XeSS, but still present.
With demanding games that default to having upscaling enabled, as long as it's supported on all GPUs, I'll likely test that way going forward. Because that's how 95% of gamers will run the game in the end. Catering to the 5% with performance that's often unacceptably slow just so I can complain that a game is too demanding? No, I'd rather use settings that run better even if they look worse, and then discuss how a game looks as a related topic. And also: DLSS Quality mode mostly looks good, and so does XeSS Quality mode. I turn on DLSS when playing games for enjoyment pretty much 100% of the time if it's supported. To each their own.
67% is standard DLSS Quality; that's why it was chosen, and it's what the majority of people will aim for... because without it this game is a literal slideshow (like Remnant 2, for example, which was unfortunately also made with upscaling in mind from the beginning)... even though turning it on may look worse for vendors who didn't pay the optimization tax...
Kudos, genius, you just made Steve and Steve look like professionals...
Matter of fact, the 4090 runs this game at 70 fps with RT on at 810p...
If you want to create a 4K chart that makes it look like a viable option, the solution is simple. First, use higher levels of upscaling — the game normally would use about a 5X upscaling factor at 4K. Then turn on framegen. Then you can show the RTX 4070 Ti Super hitting 66 fps like Nvidia does.
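The back-of-the-envelope version of that chart (the 33 fps base figure is just my illustration; the assumption is that framegen roughly doubles presented frames):

```python
import math

# A ~5X (area) upscaling factor at 4K means rendering about 1/sqrt(5)
# of 2160p per axis.
target_height = 2160
area_factor = 5
render_height = target_height / math.sqrt(area_factor)
print(round(render_height))       # ~966, i.e. roughly 960p

# Hypothetical raw result at that render resolution, then framegen
# approximately doubles the presented frame count.
rendered_fps = 33                 # illustrative number, not a measurement
presented_fps = rendered_fps * 2
print(presented_fps)              # 66 "fps" for the marketing chart
```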
Yeah, the "physics" of fighting is often almost totally absent in games. I'm sure part of it is because it's more difficult to handle all of the potential animations, but I suspect a big factor is just the pursuit of "fun" according to the goals of the game designers.@JarredWaltonGPU Most of the graphical advances have been to give higher visual fidelity to the environment. Is there any indication that the actions (fighting/shooting/etc) themselves will get better fidelity?
Fighting/shooting games never resonated with me because the action never seemed real. There's no actual physical contact--no hits that rebound or bodies recoiling from impact. The fighting moves are more or less canned, just like the old Karateka of yore.
Wukong is a prime example. His staff swings always complete their arc, and the only way to tell if a hit connected is from the baddie's HP bar getting shorter, or maybe some blood splatter appearing.
Next in my immersion factor is the characters. Their bodies get more detailed textures--the rippling hair on the giant wolf boss looks great--but their movements/actions aren't "real" either. They're not connected to the environment. Wukong's walking/running just "floats" over the ground. It's like "green screen" fighting in a movie.
Environmental fidelity is the least important for me, at least in regard to fighting games. If we can get, say, a Neo vs. Morpheus fight (sans wire-fu) with actual physical contact, I dare say that not many would mind having no environment whatsoever, let alone something as trivial as light being refracted by moving water.
Since the image quality is dramatically different for FSR3 and DLSS at 67%, is it possible for us to have a reasonable “iso quality” comparison?

So I've added a few RTX 30-series and RX 6000-series GPUs. In both cases, the older models tend to underperform relative to the current generation. Whether that's because the new architectures are simply better equipped to deal with Unreal Engine 5 or something else is difficult to say, but the game certainly seems to prefer newer hardware.
But we can't do that unless all cards use the same scaler, especially when comparing image quality, because instead of directly comparing RT vs non-RT we're actually comparing RT+DLSS vs RT+FSR. RT vs non-RT can only be compared at native resolution without scaling; otherwise it's a methodology error.

"Native rendering" is still just words. Games in the past have used tweaks to reduce the number of computations a GPU has to do. VRS is another way to make things run faster. If it really makes you feel better, just pretend that my testing was done at 720p, 960p, and 1440p.
Native will flicker due to how modern render techniques work. You really need TAA to fix that, and at that point DLSS/FSR/XeSS/TSR are better than other TAA solutions and sometimes give you better-than-native results.
It also misses the fact that using RT right now is garbage and can't be implemented without aggressive scaling and frame generation even on a $2,000 card, so it's completely pointless. The 4090's CP2077 results of 22-26 fps average (my previous build was DQHD, 13900KS + 4090, path tracing, no scaling, no framegen) and what was mentioned in the comments above only confirm this.
Also, UE5 looks great with Lumen GI; your shadow flickering may be scaling artifacts. I played RoboCop: Rogue City on cinematic quality and it was great in both fps and image quality.
I'm going to respond to this in the thread, rather than by PM, because it's easier for me to attach images, and others might like this as well.
From a mathematical standpoint, quality-mode TAAU using 4-8 jittered frames has a higher raw pixel count than native.
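To put rough numbers on that, a simple accumulation sketch (real TAAU weighting is more complicated, but the raw sample count is the point):

```python
# Quality-mode upscaling renders at 67% per axis, so each frame carries
# about 0.67^2 ~= 45% of the native pixel count. Accumulating N jittered
# frames passes the native sample count well before N = 4.
scale = 0.67
per_frame = scale ** 2            # ~0.449 of native pixels per frame

for n in (1, 4, 8):
    print(f"{n} frame(s): ~{n * per_frame:.2f}x native raw samples")

# 1 frame:  ~0.45x native
# 4 frames: ~1.80x native
# 8 frames: ~3.59x native
```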
"Native" is not native in the first place. TAA isn't native rendering, that's why DLSS in the majority of cases ends up looking better. It doesn't lower your quality settings and whoever says that is arguing in bad faith.Guess we have now officially entered the era where DLSS/XESS/FRSS ("AI", essentially) is now not only a requirement, but choosing not to lower your quality settings and use "AI" upscaling is considered wrong...Gaming in 2024...
I laughed when there was some note at the start about the game being best with a controller. Screw that! I've played quite a bit and it definitely works well with keyboard and mouse. I'm not sure what the devs were thinking when they put that message in about using a controller. (Or maybe I'm thinking of Star Wars Outlaws? Same story though!)

I grabbed the game. While the graphics are nice, it's not my cup of tea and I find it hard to play, even with a controller.
Maybe I am getting too old for this stuff...