Black Myth Wukong PC benchmarks: A tour de force for Nvidia's ray tracing hardware

My system basics: 11900K, RX 7800 XT, 32GB RAM, running (the game benchmark) off an SSD as recommended. My 'low' was 79, my 'high' well up in the 110s, average ~95 fps. Temps were well controlled, but since the bench is so short that probably isn't the full story there. This was at 1440p on the default settings, which were listed as "very high".

My own experience is the game looks really slick, if not groundbreaking. The aspect that impressed me was how smooth everything was. Even at the edges of the monitor the motion was smooth as butter.
 
You think a 145-second benchmark is "short"? Because that's on the longer end of the built-in benchmark range. The only built-in tests that immediately come to mind as being longer are Horizon Zero Dawn (175 seconds), GTAV (about six minutes, with five different scenes and loading screens in between), and RDR2 (basically the same as GTAV).

Anyway, if you just want to play the game, the recommended settings might be fine. If you want to provide data to others, I'd list the exact settings used — what scaling was set to, whether framegen was enabled, whether RT was off, and whether the global preset was on "very high" or if that just describes some of the options. Averaging 95 fps means your rig can easily handle the selected settings, but I also suspect you have 50% scaling and framegen enabled.
 

Well, it is important to note here that I started the benchmark, let it compile its shaders, and away I went. So I was basically a witness to the settings the game devs consider a proper default configuration.

Why would one consider that I am being disingenuous using "default" settings?

Did you include that metric in your tests, or did it not come back with the results you wanted to stand on the soapbox for?

And yes, since I use an AIO cooler, a couple of minutes isn't long enough to saturate that cooler, nor for case temps to rise enough for the other components to level out.

I guess the point here, in my eyes, would be: who was this benchmark for, presented as it is, showing far lower results than a real-world situation where the user installs the game, runs at native resolution and settings, and doesn't modify things for this custom... what did you call it... "rasterization only" result?
If anything, I feel like you should check your own motivation more than trying to point out faults I made in the out-of-box experience.

.02
 
I wasn't saying you were being disingenuous, just that you didn't state exactly what settings were used. That's helpful information for others is all. I did list exactly what I used in my testing. Based on your PC specs and results, I can make an educated guess as to how that compares to my results, but I don't know for certain if that guess is correct.

I initially said you might have framegen on, but that was probably wrong. The game can also sometimes do odd things with scaling, depending on what resolution and scaling value you had before changing resolution. If you go from 4K to 1440p, you end up with one scaling value, while if you go from 1080p to 1440p you get a different value. It's all a bit confusing.

The RX 7800 XT got 45 fps at cinematic settings in my testing, with 67% scaling and without framegen. Dropping to very high nets perhaps another 15~20 percent, and a more aggressive scaling factor (50%) could add another 40~60 percent. That's probably what you ended up with by using the default recommendations. Framegen can tack on another 75~85 percent, and since you say you scored 79 fps, framegen probably wasn't enabled.
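
For what it's worth, here's that chain of estimates as quick math, using the midpoints of my ranges (a rough sketch; these multipliers are eyeballed, not measured scaling):

```python
# Back-of-the-envelope stacking of the estimates above (midpoints of my ranges).
base_fps = 45.0        # RX 7800 XT, cinematic preset, 67% scaling, no framegen

very_high = 1.175      # ~15-20% gain from cinematic -> very high
scaling_50 = 1.50      # ~40-60% gain from 67% -> 50% render scale

no_fg = base_fps * very_high * scaling_50
print(f"Estimated without framegen: {no_fg:.0f} fps")    # ~79 fps

with_fg = no_fg * 1.80  # framegen tacks on ~75-85%
print(f"Estimated with framegen:    {with_fg:.0f} fps")  # ~143 fps
```

The no-framegen estimate landing right at 79 is partly a coincidence of the midpoints, but it does line up with your reported number.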

But other settings could also get you ~80 fps. That's why I wanted to check what settings were actually in use. My comment wasn't an attack on your score, or an attempt to make an AMD GPU look worse, or any other motivation other than trying to determine precisely what settings you got and then consider how that compares to the score I got with a different CPU. 🤷‍♂️

$0.02.

PS: As I discussed extensively in the writeup, quoting framegen numbers without talking about what that actually means to the end user experience is very much a marketing tactic. Nvidia loves to do that, because it makes everything look (in a chart) way faster. But 100 fps with framegen is not the same experience as 100 fps without framegen. It may end up looking similar to the viewer, but responsiveness is lower due to the added latency. And of course, people don't experience latency with the built-in benchmark, so it's kind of the best-case scenario in that regard for running framegen. But that's an entirely separate topic.
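
To put a rough number on that responsiveness gap, here's an illustrative sketch (it ignores the extra frame of buffering framegen adds and anything Reflex claws back, so real latency is messier than this):

```python
# Illustrative: why 100 fps with framegen doesn't feel like 100 fps without it.
# Interpolation-based framegen shows a generated frame between two rendered
# frames, so only every other displayed frame actually reflects new input.
def frame_pacing(displayed_fps: float, framegen: bool) -> tuple[float, float]:
    displayed_ms = 1000.0 / displayed_fps
    input_ms = displayed_ms * 2.0 if framegen else displayed_ms
    return displayed_ms, input_ms

for fg in (False, True):
    shown, sampled = frame_pacing(100.0, fg)
    print(f"framegen={fg!s:<5}: new image every {shown:.0f} ms, "
          f"input reflected every ~{sampled:.0f} ms")
```

Same 100 fps on a chart, but one of them responds like a 50 fps game.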

And if you're wondering about the lengthy response, consider that I spent much of the past week running (and rerunning after an update arrived!) benchmarks in this game. Like, each GPU took around 90 minutes to test. I'm both sick of it and curious to see what others think of it. I like data, basically, and getting (and giving) more information than necessary is perhaps a bad habit of mine.
 
Yes, I have. What GPU are you playing it on? Because if you don't have a GPU that can run the full RT at very high, like an RTX 4070, it's not worth it. If you have a top-tier Nvidia GPU, though, it looks much better in action than the non-RT stuff. Shadows don't flicker around and go blobby, water and reflections look nicer, and the lighting looks more accurate (stuff gets indirectly lit, so dark areas aren't always quite as dark).

Does it make the game better? Not really, as I say multiple times. Don't just read the headline and decide you know what I'm thinking. My point in the headline is that this is a tour de force for Nvidia's ray tracing hardware. It looks better, and it only works well on Nvidia GPUs. Nothing else comes close. My point isn't that you should feel bad if you can't enable the very high RT setting and get good performance, but if you have an RTX 4070 or above? Sure, you can run it at max settings and it looks amazing and plays just fine.

Upscaling is here to stay, especially as games become more demanding. So when I say "1080p with 67% scaling via DLSS/FSR/XeSS" that's fundamentally different than "native 720p." If you don't have RTX hardware and can't run DLSS, though, I can understand why people would think upscaling isn't that great. DLSS > XeSS > FSR 2/3. The first two only cause a minor drop in image fidelity for a boost in FPS, while the last causes clear image degradation, even at the "Quality" setting.
The audacity of SCALING the game at 67% and misleading people about resolutions is breathtaking...

Kudos, genius, you just made Steve and Steve look like professionals...

Matter of fact, the 4090 runs this game at 70 FPS with RT on at 810p...
 
This is another one of those Crysis moments. They made the game super demanding to "showcase" graphical technology.
Hopefully they did not forget the gaming fun when they designed it.
Absolutely not in the same category. Crysis was literally a revolution; this is just another Nvidia-sponsored title using GameWorks and crippling performance through underwhelming use of RT.
 
Don't put words in my mouth. I didn't say it was "wrong" to run at native, only that it will further reduce performance from what our benchmarks show. In effect, you're coming back and saying that I'm wrong for using upscaling, which is pretty hypocritical when you think about it.

Since I'm the one choosing what settings to test, I get to decide what makes sense. I'd argue that I've already gone way too far in testing full RT modes, since so many GPUs can't handle those settings, but it's interesting to see just how far we still have to go before full RT can go mainstream. But whether I test with upscaling in all cases, or only at native, or a mix of both? There's no "wrong" answer, only data to analyze.

I ran tests at three target resolutions, all with the same upscaling factor, so in general you can expect to see similar relative performance if you want to run at native. There are image quality differences between DLSS, XeSS, and FSR, but the general performance uplift each offers is pretty similar.

If you want to know how native 1440p runs, look at the 4K upscaled results, since those render at 1440p and then apply FSR/DLSS/XeSS to get to 4K. Native would be slightly faster than that, since there's overhead with upscaling that wouldn't be present. Native 4K would probably be less than half the 1440p performance. Native 1080p will be roughly on par with the 1440p upscaled results.
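
If it helps to see the arithmetic behind those pairings, here's a quick sketch (this assumes the slider scales each axis linearly, which is my reading of the game's behavior; exact rounding may differ):

```python
# Map each target resolution + scaling percentage to its internal render size.
targets = [(1920, 1080), (2560, 1440), (3840, 2160)]
scale = 0.67  # "Quality"-style scaling per axis

for w, h in targets:
    rw, rh = round(w * scale), round(h * scale)
    print(f"{w}x{h} @ {scale:.0%} -> renders ~{rw}x{rh}")

# 1920x1080 -> ~1286x724  (roughly "720p" worth of pixels)
# 2560x1440 -> ~1715x965  (roughly "960p")
# 3840x2160 -> ~2573x1447 (roughly "1440p")
```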

"Native" is just another quality knob, and it doesn't always behave in an ideal fashion. And are we talking native with DLAA, native with TAA, native with FSR, or native with XeSS? Those definitely don't all look the same, so it's not equivalent work, though DLAA generally looks the best. There are still plenty of rendering errors / anomalies even at native — different perhaps than what you get with DLSS/FSR/XeSS, but still present.

With demanding games that default to having upscaling enabled, as long as it's supported on all GPUs, I'll likely test that way going forward. Because that's how 95% of gamers will run the game in the end. Catering to the 5% with performance that's often unacceptably slow just so I can complain that a game is too demanding? No, I'd rather use settings that run better even if they look worse, and then discuss how a game looks as a related topic. And also: DLSS Quality mode mostly looks good, and so does XeSS Quality mode. I turn on DLSS when playing games for enjoyment pretty much 100% of the time if it's supported. To each their own.
Native is the ONLY benchmark!

All the rest is just misleading.

67% of 1080p is 720p! NOT 1080p!
 
It feels like there's been too much focus on image fidelity and realism. It's what the masses appear to want too, but fluidity has 'reset', of sorts; there sure is a lack of triple-digit bars out there with recent titles...
Personally, image fluidity comes first - no added latency please - then fidelity afterwards.

Seeing the double digit results - even out of high end hardware - these days has been... I don't know, but it doesn't feel good, man...
 
So I've added a few RTX 30-series and RX 6000-series GPUs. In both cases, the older models tend to underperform relative to the current generation. Whether that's because the new architectures are simply better equipped to deal with Unreal Engine 5 or something else is difficult to say, but the game certainly seems to prefer newer hardware.
 
> The audacity of SCALING the game at 67% and misleading people about resolutions is breathtaking...
>
> Kudos, genius, you just made Steve and Steve look like professionals...
>
> Matter of fact, the 4090 runs this game at 70 FPS with RT on at 810p...
67% is the standard DLSS Quality factor; that is why it was chosen, and it's what the majority of people will aim for... because without it this game is a literal slideshow (for example Remnant 2, which was also unfortunately made with upscaling in mind from the beginning)... even though turning it on may look worse for vendors who did not pay the optimization tax...

If you read the review fully, it is not misleading as it fully explains the consequences of "fake frames" and even mentions who is really trying to mislead:
If you want to create a 4K chart that makes it look like a viable option, the solution is simple. First, use higher levels of upscaling — the game normally would use about a 5X upscaling factor at 4K. Then turn on framegen. Then you can show the RTX 4070 Ti Super hitting 66 fps like Nvidia does.
 
@JarredWaltonGPU Most of the graphical advances have been to give higher visual fidelity to the environment. Is there any indication that the actions (fighting/shooting/etc) themselves will get better fidelity?

Fighting/shooting games never resonated with me because the action never seemed real. There's no actual physical contact: no hits rebounding, no bodies recoiling from impact. The fighting moves are more or less canned, just like the old Karateka of yore.

Wukong is a prime example. His staff swings always complete their arc, and the only way to tell if a hit connected is from the baddie's HP bar getting shorter, or maybe some blood splatter appearing.

Next in my immersion factor is the characters. Their bodies get more detailed textures (the rippling hair on the giant wolf boss looks great), but their movements/actions aren't "real" either. They're not connected to the environment. Wukong's walking/running just "floats" over the ground. It's like "green screen" fighting in a movie.

Environmental fidelity is the least important for me, at least in regards to fighting games. If we can get, say, a Neo vs Morpheus fight (sans wire-fu), with actual physical contact, I dare say that not many would mind having no environment whatsoever, let alone something as trivial as light being refracted by moving water.
 
Yeah, the "physics" of fighting is often almost totally absent in games. I'm sure part of it is because it's more difficult to handle all of the potential animations, but I suspect a big factor is just the pursuit of "fun" according to the goals of the game designers.

If every punch, kick, slash, gunshot, etc. caused a proper reaction, the player would often end up staggered and would probably be irritated. There's also the whole "this boss has a ton of hitpoints and it's going to take you forever to whittle them down" element of games, plus jumping into the air where in the real world you would lose all control.

It would be interesting to see a physically accurate simulation of martial arts and fighting in a game, but ultimately things need to be entertaining and not just realistic. It would definitely be a major paradigm shift in how the games play, and people would need to adapt to different rules beyond button mashing, but it could be interesting and fun in its own right. Of course, then we'd be back to the "git gud" mentality, as learning to deal with realistic physics in a game could potentially become as difficult as learning to do martial arts in the real world. 😛
 
> So I've added a few RTX 30-series and RX 6000-series GPUs. In both cases, the older models tend to underperform relative to the current generation. Whether that's because the new architectures are simply better equipped to deal with Unreal Engine 5 or something else is difficult to say, but the game certainly seems to prefer newer hardware.
Since the image quality is dramatically different for FSR3 and DLSS at 67%, is it possible for us to have a reasonable “iso quality” comparison?

At what percentage does FSR3 look like DLSS at 67%? Or at what percentage does DLSS look like FSR3 at 67%?

And how about running XeSS for AMD cards? I’ve found some recommendations to use XeSS there.

I have never found a (for example) DLSS 50% vs FSR 67% vs XeSS 60% performance chart anywhere.
 
"Native rendering" is still just words. Games in the past have used tweaks to reduce the number of computations a GPU has to do. VRS is another way to make things run faster. If it really makes you feel better, just pretend that my testing was done at 720p, 960p, and 1440p.
But we can't do that unless all cards use the same scaler, especially when comparing image quality. Because instead of directly comparing RT vs non-RT, we're actually comparing RT+DLSS vs RT+FSR. RT vs non-RT can only be compared at native resolution without scaling; otherwise it's a methodology error.

It also misses the fact that using RT right now is garbage and can't be implemented without aggressive scaling and frame generation even on a $2000 adapter, so it's completely pointless. The 4090's CP2077 results of 22-26 fps avg (my previous build was DQHD, 13900KS + 4090 + PT, no scaling, no FG) and what was mentioned in the comments above only confirm this.
Also, UE5 looks great with Lumen GI; your shadow flickering may be scaling artifacts. I played RoboCop: Rogue City on cinematic quality and it was great in both fps and image quality.
 
Native will flicker due to how modern rendering techniques work. You really need TAA to fix that, and at that point DLSS/FSR/XeSS/TSR are better than other TAA solutions and sometimes give you better-than-native results.

From a mathematical standpoint, quality-mode TAAU using 4-8 jittered frames has a higher raw pixel count than native.
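
A quick sanity check on that claim, under the simplifying assumption that every jittered sample survives accumulation (real upscalers reject history on disocclusion, so treat this as an upper bound):

```python
# Accumulated sample count for quality-mode TAAU vs a single native frame.
scale = 0.67
per_frame = scale ** 2  # each upscaled frame renders ~45% of native pixels

for n in (4, 8):
    print(f"{n} jittered frames: {n * per_frame:.2f}x native pixel count")

# 4 frames -> ~1.80x native, 8 frames -> ~3.59x native
```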
 
>If every punch, kick, slash, gunshot, etc. caused a proper reaction, the player would often end up staggered and would probably be irritated. There's also the whole "this boss has a ton of hitpoints and it's going to take you forever to whittle them down" element of games, plus jumping into the air where in the real world you would lose all control.

What I'm hoping to see is more realistic physics (or frankly any modicum of physics), not realistic damage. There still can be some layer of damage abstraction via HPs/etc.

More to the point, I think more effort should be given to improving what's important, which for fighting games is the fighting and the characters.

Environmental realism is good for mood, but it's a peripheral factor. That, and the cost of getting a "better" environment (read: more RT) is long past the point of diminishing returns.

Below is a good example of what I mean. It's an indie game called Hellish Quart, with real-physics swordfighting. This game also has realistic damage, but as said above, real physics doesn't need to equate to real damage.

https://www.youtube.com/watch?v=3-2wFTRaLCQ

In any case, thanks for the in-depth look at Wukong graphics, and for allowing me to bend your ear.
 
> Native will flicker due to how modern rendering techniques work. You really need TAA to fix that, and at that point DLSS/FSR/XeSS/TSR are better than other TAA solutions and sometimes give you better-than-native results.
>
> From a mathematical standpoint, quality-mode TAAU using 4-8 jittered frames has a higher raw pixel count than native.
I'm going to respond regarding this to the thread, rather than PM, because it's easier for me to attach images, and others might like this as well.

In this particular game, it's really difficult to compare image fidelity between DLSS, TSR, FSR, and XeSS while at the same time separating optimizations for one API from a lack of optimizations for another. Performance probably wouldn't change, but finer tuning of the algorithm could definitely improve the outputs.

I think the closest match, quality-wise, if you're looking at the various scalers would be to set FSR, XeSS, and TSR to 100% versus Nvidia's DLSS at Quality (~67%) scaling. But even that's not entirely correct, because there are still differences. Honestly, it feels like the FSR and XeSS integrations suffer from a lack of attention rather than the algorithms being bad.

BMW (Black Myth Wukong) uses Streamline, Nvidia's API that allows a dev to support just that API and it will add in DLSS, FSR, and XeSS support. (I'm probably oversimplifying.) And then Nvidia as a promoter/sponsor of BMW comes in and does further game optimizations to make sure DLSS looks decent, but it doesn't do that work for FSR and XeSS.
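
As a purely hypothetical sketch of that "integrate once, get all three" idea (this is not Streamline's actual API, just the general shape of the abstraction as I understand it):

```python
# Hypothetical only: NOT Streamline's real interface. The point is that the
# game feeds one set of tagged inputs and the layer routes to a vendor backend.
from abc import ABC, abstractmethod
from typing import Any

class TemporalUpscaler(ABC):
    """The common inputs every temporal upscaler consumes."""
    @abstractmethod
    def evaluate(self, color: Any, depth: Any, motion: Any,
                 out_w: int, out_h: int) -> Any: ...

class DLSSBackend(TemporalUpscaler):
    def evaluate(self, color, depth, motion, out_w, out_h):
        raise NotImplementedError("dispatch to the vendor library here")

def upscale(backend: TemporalUpscaler, frame: dict, out_w: int, out_h: int):
    # Swapping DLSS/FSR/XeSS is just choosing a different backend object.
    # Per-backend tuning (sharpening, masks) still needs hand attention,
    # which seems to be exactly what FSR/XeSS didn't get here.
    return backend.evaluate(frame["color"], frame["depth"],
                            frame["motion"], out_w, out_h)
```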

Here are four image quality shots taken at 1080p from a 100Mbps capture (so as close to lossless while being in motion as I can get).

DLSS at 67% (which is "Quality" mode):
[screenshot: Black Myth Wukong vidshot 1080p 67scaling (1) DLSS.png]

TSR (Unreal Engine 5's Temporal Super Resolution) at 67%:
[screenshot: Black Myth Wukong vidshot 1080p 67scaling (2) TSR.png]

FSR at 67%:
[screenshot: Black Myth Wukong vidshot 1080p 67scaling (3) FSR.png]

XeSS at 67%:
[screenshot: Black Myth Wukong vidshot 1080p 67scaling (4) XeSS.png]

Again, subjectively, viewing the game in motion, DLSS looks better than the others. The stills don't capture everything that you notice in frame-to-frame motion, but DLSS has a good blend of detail and a lack of oversharpening. TSR has a lot of sharpening, and this causes frame-to-frame shimmering. FSR's biggest issue is ghosting, along with some pixelation and shimmering that you only see in motion. XeSS looks really blurry compared to the other three and frankly looks like a very half-baked integration effort, because I've seen XeSS 1.3 look really nice in some other games.
 
Shower thought: all the effort that used to be put into SLI/XFire and other multi-GPU tech is now being put into upscalers instead.

As resources are finite, I wonder which tech is better for the consumer in the long run? I'm inclined to think upscalers, but it irks me that that is, or may be, the right answer 🙁

Regards.
 
Guess we have now officially entered the era where DLSS/XeSS/FSR ("AI", essentially) is not only a requirement, but choosing not to lower your quality settings and use "AI" upscaling is considered wrong... Gaming in 2024...
"Native" is not native in the first place. TAA isn't native rendering, that's why DLSS in the majority of cases ends up looking better. It doesn't lower your quality settings and whoever says that is arguing in bad faith.

There are obviously cases where DLSS doesn't look better than "native" but there are also cases where the opposite is true. But in every single case DLSS increases your framerates. Which basically means you can render at higher resolution than "native" and then use DLSS. The end result is, DLSS will ALWAYS look better than native (DLDSR + DLSS). If you are just using DLSS with no DLDSR then you are always getting a higher framerate and a 50/50 chance to get a better image quality over TAA. So in other words, you should always use DLSS even if your card can easily handle the games you are playing.
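
Here's a rough sketch of why that combo costs about the same as native rendering (assuming DLDSR's 2.25x pixel-count factor and Quality mode's ~67% per-axis scale; your exact numbers will vary):

```python
# Hypothetical pixel-count math for DLDSR + DLSS on a 1440p monitor.
native = 2560 * 1440                     # 3,686,400 pixels

dldsr_target = native * 2.25             # DLDSR presents a ~3840x2160 target
dlss_render = dldsr_target * 0.67 ** 2   # Quality mode renders ~45% of that

print(f"native pixels:          {native:,}")
print(f"DLDSR+DLSS render cost: {dlss_render:,.0f}")  # ~3.72M, roughly native
```

The GPU shades roughly a native frame's worth of pixels, but the image gets reconstructed at the higher target and downsampled to the display, which is where the quality win comes from.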
 
I grabbed the game. While the graphics are nice, it's not my cup of tea, and I find it hard to play, even with a controller.
Maybe I am getting too old for this stuff...
 
I laughed when there was some note at the start about the game being best with a controller. Screw that! I've played quite a bit and it definitely works well with keyboard and mouse. I'm not sure what the devs were thinking when they put that message in about using a controller. (Or maybe I'm thinking of Star Wars Outlaws? Same story though!)