Black Myth Wukong PC benchmarks: A tour de force for Nvidia's ray tracing hardware


punkncat
My system basics are an 11900K, RX 7800 XT, and 32GB RAM, running (the game benchmark) off an "SSD" as recommended. My 'low' was 79, my 'high' was well up in the 110s, and the average was ~95 FPS. Temps were well controlled, but since the bench is so short, that probably isn't the full story there. This was at 1440p on the default settings, which were listed as "very high".

My own experience is the game looks really slick, if not groundbreaking. The aspect that impressed me was how smooth everything was. Even looking over at the edges of the monitor, the motion was smooth as butter.
 
You think a 145-second benchmark is "short"? Because that's on the longer end of the built-in benchmark range. The only games with built-in tests that immediately come to mind as being longer are Horizon Zero Dawn (175 seconds), GTAV (about six minutes, with five different scenes and loading screens in between each), and RDR2 (basically the same as GTAV).

Anyway, if you just want to play the game, the recommended settings might be fine. If you want to provide data to others, I'd list the exact settings used: what was scaling set to, was framegen enabled, was RT off, and was the global preset on "very high" or is that just some of the options? Averaging 95 fps means your rig can easily handle the selected settings, but I also suspect you have 50% scaling and framegen enabled.
 

punkncat

Well, it is important to note here that I started the benchmark, let it compile its shaders, and away I went with it. So I was basically testing the settings the game devs consider a proper default configuration.

Why would one consider that I am being disingenuous using "default" settings?

Did you include that metric in your tests, or did it not come back with the results you wanted to stand on the soapbox for?

And yes, since I use an AIO cooling solution, a couple of minutes isn't long enough to saturate that cooler, nor for case temps to rise enough for the other components to level out.

I guess the point here, in my eyes, would be: who is this benchmark for, presented as it is? It shows far lower results than a real-world situation where the user installs the game, runs it at native resolution and default settings, and doesn't modify things for this custom... what did you call it... "rasterization only" result?
If anything, I feel like you should check your own motivation rather than trying to point out faults in my out-of-the-box experience.

.02
 
I wasn't saying you were being disingenuous, just that you didn't state exactly what settings were used. That's helpful information for others is all. I did list exactly what I used in my testing. Based on your PC specs and results, I can make an educated guess as to how that compares to my results, but I don't know for certain if that guess is correct.

I initially said you might have framegen on, but that was probably wrong. The game can also sometimes do odd things with scaling, depending on what resolution and scaling value you had before changing resolution. If you go from 4K to 1440p, you end up with one scaling value, while if you go from 1080p to 1440p you get a different value. It's all a bit confusing.

The RX 7800 XT got 45 fps at cinematic settings in my testing, with 67% scaling and without framegen. Dropping to very high nets perhaps another 15~20 percent, and a more aggressive scaling factor (50%, meaning fewer pixels rendered) could add another 40~60 percent. That's probably what you ended up with by using the default recommendations. Framegen can tack on another 75~85 percent, and since you say you scored 79 fps, framegen probably wasn't enabled.
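For what it's worth, the stacking math above is easy to sanity-check. A minimal sketch, assuming the midpoints of the ranges I listed (the multipliers are rough estimates on my part, not measured values):

```python
# Back-of-the-envelope fps estimate for the RX 7800 XT at 1440p,
# stacking the approximate uplifts described above (midpoints assumed).
base_fps = 45.0      # measured: cinematic preset, 67% scaling, no framegen

very_high = 1.175    # ~15-20% uplift from dropping cinematic to very high
scaling_50 = 1.50    # ~40-60% uplift from 67% scaling down to 50% scaling
framegen = 1.80      # ~75-85% uplift if frame generation were enabled

estimate = base_fps * very_high * scaling_50
print(f"No framegen:   {estimate:.0f} fps")             # ~79 fps
print(f"With framegen: {estimate * framegen:.0f} fps")  # ~143 fps
```

That lands right around the reported 79 fps figure, which is why I don't think framegen was on.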

But other settings could also get you ~80 fps. That's why I wanted to check what settings were actually in use. My comment wasn't an attack on your score, or an attempt to make an AMD GPU look worse, or anything other than trying to determine precisely what settings you used, and then consider how that compares to the score I got with a different CPU. 🤷‍♂️

$0.02.

PS: As I discussed extensively in the writeup, quoting framegen numbers without talking about what that actually means to the end user experience is very much a marketing tactic. Nvidia loves to do that, because it makes everything look (in a chart) way faster. But 100 fps with framegen is not the same experience as 100 fps without framegen. It may end up looking similar to the viewer, but responsiveness is lower due to the added latency. And of course, people don't experience latency with the built-in benchmark, so it's kind of the best-case scenario in that regard for running framegen. But that's an entirely separate topic.
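To put a rough number on that responsiveness gap, here's a minimal sketch, assuming 2x frame generation and ignoring the extra hold/buffering delay a real pipeline adds on top (which only makes framegen look worse):

```python
# With 2x frame generation, only every other displayed frame is actually
# rendered, so player input is only sampled at the rendered rate.
# Simplified model; real pipelines add further interpolation latency.
def input_sample_interval_ms(displayed_fps: float, framegen: bool = False) -> float:
    rendered_fps = displayed_fps / 2 if framegen else displayed_fps
    return 1000.0 / rendered_fps

print(input_sample_interval_ms(100.0))                 # 10.0 ms: native 100 fps
print(input_sample_interval_ms(100.0, framegen=True))  # 20.0 ms: shown at 100, responds like 50
```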

And if you're wondering about the lengthy response, consider that I spent much of the past week running (and rerunning after an update arrived!) benchmarks in this game. Like, each GPU took around 90 minutes to test. I'm sick of it but also curious to see what others think of it. I like data, basically, and getting (and giving) more information than necessary is perhaps a bad habit of mine.
 
Yes, I have. What GPU are you playing it on? Because if you don't have a GPU that can run the full RT at very high, like an RTX 4070, it's not worth it. If you have a top-tier Nvidia GPU, though, it looks much better in action than the non-RT stuff. Shadows don't flicker around and go blobby, water and reflections look nicer, and the lighting looks more accurate (stuff gets indirectly lit, so dark areas aren't always quite as dark).

Does it make the game better? Not really, as I say multiple times. Don't just read the headline and decide you know what I'm thinking. My point in the headline is that this is a tour de force for Nvidia's ray tracing hardware. It looks better, and it only works well on Nvidia GPUs. Nothing else comes close. My point isn't that you should feel bad if you can't enable the very high RT setting and get good performance, but if you have an RTX 4070 or above? Sure, you can run it at max settings and it looks amazing and plays just fine.

Upscaling is here to stay, especially as games become more demanding. So when I say "1080p with 67% scaling via DLSS/FSR/XeSS" that's fundamentally different than "native 720p." If you don't have RTX hardware and can't run DLSS, though, I can understand why people would think upscaling isn't that great. DLSS > XeSS > FSR 2/3. The first two only cause a minor drop in image fidelity for a boost in FPS, while the last causes clear image degradation, even at the "Quality" setting.
The audacity of SCALING the game at 67% and misleading people about resolutions is breathtaking...

Kudos, genius, you just made Steve and Steve look professional...

Matter of fact, the 4090 runs this game at 70 FPS with RT on at 810p...
 
This is another one of those Crysis moments. They made the game super demanding to "showcase" graphical technology.
Hopefully they did not forget the gaming fun when they designed it.
Absolutely not in the same category. Crysis was literally a revolution; this is just another Nvidia-sponsored title using GameWorks and crippling performance with an underwhelming use of RT.
 
Don't put words in my mouth. I didn't say it was "wrong" to run at native, only that it will further reduce performance from what our benchmarks show. In effect, you're coming back and saying that I'm wrong for using upscaling, which is pretty hypocritical when you think about it.

Since I'm the one choosing what settings to test, I get to decide what makes sense. I'd argue that I've already gone way too far in testing full RT modes, since so many GPUs can't handle those settings, but it's interesting to see just how far we still have to go before full RT can go mainstream. But whether I test with upscaling in all cases, or only at native, or a mix of both? There's no "wrong" answer, only data to analyze.

I ran tests at three target resolutions, all with the same upscaling factor, so in general you can expect to see similar relative performance if you want to run at native. There are image quality differences between DLSS, XeSS, and FSR, but the general performance uplift each offers is pretty similar.

If you want to know how native 1440p runs, look at the 4K upscaled results, since those render at roughly 1440p and then apply FSR/DLSS/XeSS to get to 4K. Native would be slightly faster than that, since there's upscaling overhead that wouldn't be present. Native 4K would probably be less than half the 1440p performance. Native 1080p will be roughly on par with the 1440p upscaled results.
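To make the resolution arithmetic concrete, here's a quick sketch, assuming the 67% figure is the usual Quality-mode 2/3 scale applied to each axis:

```python
# Render resolution at a given per-axis scaling factor (2/3 assumed).
def render_res(width: int, height: int, scale: float = 2 / 3):
    return round(width * scale), round(height * scale)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    rw, rh = render_res(w, h)
    print(f"{w}x{h} output -> {rw}x{rh} rendered")
# 1920x1080 output -> 1280x720 rendered   (67% of 1080p does render at 720p)
# 2560x1440 output -> 1707x960 rendered
# 3840x2160 output -> 2560x1440 rendered  (why 4K upscaled ~ native 1440p workload)
```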

"Native" is just another quality knob, and it doesn't always behave in an ideal fashion. And are we talking native with DLAA, native with TAA, native with FSR, or native with XeSS? Those definitely don't all look the same, so it's not equivalent work, though DLAA generally looks the best. There are still plenty of rendering errors / anomalies even at native — different perhaps than what you get with DLSS/FSR/XeSS, but still present.

With demanding games that default to having upscaling enabled, as long as it's supported on all GPUs, I'll likely test that way going forward. Because that's how 95% of gamers will run the game in the end. Catering to the 5% with performance that's often unacceptably slow just so I can complain that a game is too demanding? No, I'd rather use settings that run better even if they look worse, and then discuss how a game looks as a related topic. And also: DLSS Quality mode mostly looks good, and so does XeSS Quality mode. I turn on DLSS when playing games for enjoyment pretty much 100% of the time if it's supported. To each their own.
Native is the ONLY benchmark!

All the rest is just misleading.

67% of 1080p is 720p! NOT 1080p!
 

Phaaze88
It feels like there's been too much focus on image fidelity and realism. It's what the masses appear to want too, but fluidity has 'reset', of sorts; there sure is a lack of triple-digit bars out there with recent titles...
Personally, image fluidity comes first - no added latency please - then fidelity afterwards.

Seeing the double-digit results - even out of high-end hardware - these days has been... I don't know, but it doesn't feel good, man...
 
So I've added a few RTX 30-series and RX 6000-series GPUs. In both cases, the older models tend to underperform relative to the current generation. Whether that's because the new architectures are simply better equipped to deal with Unreal Engine 5 or something else is difficult to say, but the game certainly seems to prefer newer hardware.
 

mhmarefat
The audacity of SCALING the game at 67% and misleading people about resolutions is breathtaking...

Kudos, genius, you just made Steve and Steve look professional...

Matter of fact, the 4090 runs this game at 70 FPS with RT on at 810p...
67% is the standard DLSS Quality scale; that's why it was chosen, and it's what the majority of people will aim for... because without it this game is a literal slideshow (like Remnant 2, which was also unfortunately built with upscaling in mind from the beginning)... even though turning it on may look worse for vendors who did not pay the optimization tax...

If you read the review fully, it is not misleading as it fully explains the consequences of "fake frames" and even mentions who is really trying to mislead:
If you want to create a 4K chart that makes it look like a viable option, the solution is simple. First, use higher levels of upscaling — the game normally would use about a 5X upscaling factor at 4K. Then turn on framegen. Then you can show the RTX 4070 Ti Super hitting 66 fps like Nvidia does.
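For a sense of scale on that "5X upscaling factor," here's my reading of it as a pixel-count ratio (an assumption on my part; the quote doesn't define the term):

```python
# Assumed reading: 5X upscaling factor = output pixels / rendered pixels.
output_px = 3840 * 2160        # ~8.29 MP at 4K
rendered_px = output_px / 5    # ~1.66 MP actually rendered
per_axis = (1 / 5) ** 0.5      # ~44.7% scale on each axis

print(f"rendered: {rendered_px / 1e6:.2f} MP "
      f"(~{3840 * per_axis:.0f}x{2160 * per_axis:.0f}), "
      f"vs native 1080p at {1920 * 1080 / 1e6:.2f} MP")
# rendered: 1.66 MP (~1717x966), vs native 1080p at 2.07 MP
```

In other words, under that reading, the GPU renders fewer pixels than native 1080p for that "4K" chart entry.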
 

baboma
@JarredWaltonGPU Most of the graphical advances have been to give higher visual fidelity to the environment. Is there any indication that the actions (fighting/shooting/etc) themselves will get better fidelity?

Fighting/shooting games never resonated with me because the action never seemed real. There's no actual physical contact--no hits rebounding, no bodies recoiling from impact. The fighting moves are more or less canned, just like the old Karateka of yore.

Wukong is a prime example. His staff swings always complete their arc, and the only way to tell if a hit connected is from the baddie's HP bar getting shorter, or maybe some blood splatter appearing.

Next in my immersion factor is the characters. Their bodies get more detailed textures--the rippling hair on the giant wolf boss looks great--but their movements/actions aren't "real" either. They're not connected to the environment. Wukong's walking/running just "floats" over the ground. It's like "green screen" fighting in a movie.

Environmental fidelity is the least important for me, at least in regards to fighting games. If we can get, say, a Neo vs Morpheus fight (sans wire-fu), with actual physical contact, I dare say that not many would mind having no environment whatsoever, let alone something as trivial as light being refracted by moving water.