Black Myth Wukong PC benchmarks: A tour de force for Nvidia's ray tracing hardware

But... have you actually PLAYED the game? I did, and I can tell you Ray tracing ISN'T WORTH IT AT ALL. It's not "realistic" nor "better"; it's actually just "different". Now cut the b
Yes, I have. What GPU are you playing it on? Because if you don't have a GPU that can run the full RT at very high, like an RTX 4070, it's not worth it. If you have a top-tier Nvidia GPU, though, it looks much better in action than the non-RT stuff. Shadows don't flicker around and go blobby, water and reflections look nicer, and the lighting looks more accurate (stuff gets indirectly lit, so dark areas aren't always quite as dark).

Does it make the game better? Not really, as I say multiple times. Don't just read the headline and decide you know what I'm thinking. My point in the headline is that this is a tour de force for Nvidia's ray tracing hardware. It looks better, and it only works well on Nvidia GPUs. Nothing else comes close. My point isn't that you should feel bad if you can't enable the very high RT setting and get good performance, but if you have an RTX 4070 or above? Sure, you can run it at max settings and it looks amazing and plays just fine.
Why are we calling 720p 1080p now?
Upscaling is here to stay, especially as games become more demanding. So when I say "1080p with 67% scaling via DLSS/FSR/XeSS," that's fundamentally different from "native 720p." If you don't have RTX hardware and can't run DLSS, though, I can understand why people would think upscaling isn't that great. DLSS > XeSS > FSR 2/3: the first two only cause a minor drop in image fidelity for a boost in FPS, while the last causes clear image degradation, even at the "Quality" setting.
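To put numbers on that (a quick sketch; I'm assuming the percentage scales each axis independently, which is how these resolution sliders typically behave):

```python
# Render resolution behind "target resolution with 67% scaling."
# Assumes the slider scales each axis independently (typical behavior).
targets = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
scale = 0.67  # 67% resolution slider

for name, (w, h) in targets.items():
    rw, rh = round(w * scale), round(h * scale)
    print(f"{name} target -> ~{rw}x{rh} internal render")

# 1080p -> ~1286x724  (essentially native 720p's workload)
# 1440p -> ~1715x965
# 4K    -> ~2573x1447 (essentially native 1440p's workload)
```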
 

JRStern

Distinguished
Mar 20, 2017
72
30
18,560
I don't even "game" but I might go for the tech just as a nerdy thing, maybe, if there were some local shops that would do some side-by-side comparisons like you describe here.

I'd probably be completely satisfied by HD (1920x) and 30fps, maybe that will work fine on most cards?
 
Reactions: artk2219

Heat_Fan89

Reputable
Jul 13, 2020
459
232
5,090
But... have you actually PLAYED the game? I did, and I can tell you Ray tracing ISN'T WORTH IT AT ALL. It's not "realistic" nor "better"; it's actually just "different". Now cut the b
I still have NO interest in RT. I have seen it on some games and I just turn it off. I prefer FPS and smooth gameplay over graphical tricks. Maybe when the hardware can actually keep up with RT with no impact on performance I'll leave it on, but by that time we'll probably be talking about the RTX 8090.
 
Reactions: artk2219
Liked this review a lot! Frame Gen is still hit and miss, it seems.

I wonder how the Nvidia GPUs would run with AMD Frame Gen. Any better or worse than Nvidia's implementation? I'm asking this because, as a lowly RTX 3xxx-series owner, I can only use AMD Frame Gen; interesting for comparison purposes.
 
Reactions: artk2219

valthuer

Upstanding
Oct 26, 2023
117
110
260
@JarredWaltonGPU I ran the game's benchmark at 4K maximum (Cinematic) settings, with TSR, Full Ray Tracing on, Vsync off, Full Ray Tracing Level Very High, without Frame Generation, and with Super Resolution set at 100. I only got an average of 22 FPS, with a minimum of 18 and a maximum of 27.

My rig consists of an i9-13900K with 64GB RAM and an RTX 4090.

Are those numbers normal for me?

Thank you in advance for your time.
 
Liked this review a lot! Frame Gen is still hit and miss, it seems.

I wonder how the Nvidia GPUs would run with AMD Frame Gen. Any better or worse than Nvidia's implementation? I'm asking this because, as a lowly RTX 3xxx-series owner, I can only use AMD Frame Gen; interesting for comparison purposes.
So, even though the game appears to have FSR 3.1, you can't enable AMD framegen with Nvidia DLSS upscaling. That means you need to run FSR upscaling and framegen. I have tried this (testing some 30-series GPUs now...) and visually FSR right now just doesn't look good in this game. Ghosting and other artifacts, plus blurriness. You can use it, but it's fugly.

Um, RTX 3060 12GB saw 'performance' with framegen and FSR (compared to baseline DLSS) improve by 78%, so that's pretty similar to the AMD scaling from framegen. I'd suggest skipping framegen and full RT and just use DLSS with whatever settings your card can handle. Medium preset at 1080p gets ~80 fps on the 3060, while the cinematic preset gets just 28 fps. High should be around 40~45 fps.
 
@JarredWaltonGPU I ran the game's benchmark at 4K maximum (Cinematic) settings, with TSR, Full Ray Tracing on, Vsync off, Full Ray Tracing Level Very High, without Frame Generation, and with Super Resolution set at 100. I only got an average of 22 FPS, with a minimum of 18 and a maximum of 27.

My rig consists of an i9-13900K with 64GB RAM and an RTX 4090.

Are those numbers normal for me?

Thank you in advance for your time.
What is TSR? Isn't that just temporal ... something? Anyway, the 4090 with DLSS only got 43.7 fps in my testing, and because the full RT is so demanding, I suspect upscaling delivers about double the native resolution performance. 43.7 / 2 is about 22, so yeah, I think your results are about right.
 
So, even though the game appears to have FSR 3.1, you can't enable AMD framegen with Nvidia DLSS upscaling. That means you need to run FSR upscaling and framegen. I have tried this (testing some 30-series GPUs now...) and visually FSR right now just doesn't look good in this game. Ghosting and other artifacts, plus blurriness. You can use it, but it's fugly.

Um, RTX 3060 12GB saw 'performance' with framegen and FSR (compared to baseline DLSS) improve by 78%, so that's pretty similar to the AMD scaling from framegen. I'd suggest skipping framegen and full RT and just use DLSS with whatever settings your card can handle. Medium preset at 1080p gets ~80 fps on the 3060, while the cinematic preset gets just 28 fps. High should be around 40~45 fps.
Yeah, although I've only used AMD FG in one game, Immortals of Aveum, and it was very glitchy. With just FSR or DLSS I'm getting low settings at 1080p around the 70-80 mark. With FG on, it's more like 130-150 or so, but it was more than a little stuttery here and there. I'll stick with DLSS as that's all I can do, but it's defo pushing me towards a GPU upgrade! :)
 

mhmarefat

Distinguished
Jun 9, 2013
60
64
18,610
Hello and thank you for your work.
I think this marks the first Tom's Hardware GPU review where native rendering is completely ignored (thanks to the incredibly disastrous UE5 "game" engine and Nvidia's predatory corporate power, we now must abandon common sense as well and do as corporations tell us: just turn on fake frames, because nowadays games will be a slideshow without them).

The headline is extremely misleading (though the review is very informative, fair, and precise as usual):
AMD and Intel GPUs will want to stick to rasterization rather than full ray tracing.
It would be fair to say:
AMD, Intel, and the rest of the under-$1-2K GPUs will want to stick to rasterization rather than full ray tracing.

This headline claims all RTX cards are capable of smoothly running this game with RT; no, they are not (as the review shows). And even those 4090 enjoyers are subjected to the unplayable gaming stutter that plagues UE5 games (and is intensified when RT is turned on; see DigitalFoundry's review of this game for more info).

Nvidia, who has lost touch with reality, is moving the gaming industry toward a "rich man's hobby" state. Its marketing and mouthpieces have been telling us for years now that RT is the "future". After 8 years it still remains in the "future", apparently, thanks to @valthuer's precise HARDWARE review a few posts above:
I ran the game's benchmark at 4K maximum (Cinematic) settings, with TSR, Full Ray Tracing on, Vsync off, Full Ray Tracing Level Very High, without Frame Generation, and with Super Resolution set at 100. I only got an average of 22 FPS, with a minimum of 18 and a maximum of 27.

My rig consists of an i9-13900K with 64GB RAM and an RTX 4090.
So yeah, Nvidia's promoted and optimized game, running on a $2K Nvidia GPU with all fake frames disabled, is capable of an average 22 FPS at the hardware level. Thank you for the complementary review!
 

valthuer

Upstanding
Oct 26, 2023
117
110
260
Nvidia, who has lost touch with reality, is moving the gaming industry toward a "rich man's hobby" state. Its marketing and mouthpieces have been telling us for years now that RT is the "future". After 8 years it still remains in the "future", apparently, thanks to @valthuer's precise HARDWARE review a few posts above:

So yeah, Nvidia's promoted and optimized game, running on a $2K Nvidia GPU with all fake frames disabled, is capable of an average 22 FPS at the hardware level. Thank you for the complementary review!

Just realised I hadn't installed the latest 560.94 Nvidia driver, so I kinda owed you an update: my minimum FPS just went to 19 :ROFLMAO::ROFLMAO::ROFLMAO: Which, of course, makes all the difference :ROFLMAO::ROFLMAO::ROFLMAO:

 

vanadiel007

Distinguished
Oct 21, 2015
299
303
19,060
All RT does is make it look like you need to run to the store and get a 4090 right now, or your life as you know it is over.
Honestly, this is more about marketing than actual need or even progress.

If and when we eventually "need" ray tracing, GPUs will be so powerful that even a laptop can handle it.
For now I just see it as nothing more than a marketing tool to showcase what the most expensive hardware can do for you.
 
This is the type of game Nvidia has needed for a while. While some say RT doesn't bring much to the table, and even realistic lighting can look worse artistically, you can tell this game was built with RT in mind: it looks better more often than not with RT enabled, rather than turning into a jumbled mess. CP2077's PT looks like crap to me, but in this game it looks really friggin' good. Well, at least in all the footage comparisons I've been watching.

I'm still sad we have to rely on upscaling for meaningful performance increases, but I guess we need a few more generations to get games like this built with ray tracing first and switch the mentality to having regular raster as the low-end alternative.

I may not like Nvidia as a corporation, but as I've always said: ray tracing is worth it, so we need to chase it. I just hope AMD gets serious about putting better RT processing in their architectures. RDNA4 is rumoured to be a sizeable jump and to bring things closer to the RTX 40-series, at least.

Also, I wonder how tied the RT implementation in these games is to Nvidia's custom hardware? Has anyone looked into it?

Microsoft doing a piss-poor job of updating DirectX is not doing anyone any favours (EDIT: Khronos with Vulkan/OGL isn't either, TBH).

EDIT: Ack, I forgot the most important thing! Thanks a lot Jarred for the data. Great review as always.

Regards.
 
If you cannot run FG separately from upscaling, wouldn't that indicate it is not using FSR 3.1?
Not necessarily, because the developer would actually need to program in support to use FSR 3.1 framegen with DLSS. Right now, if you have an RTX 30-series GPU and select DLSS and then try to turn on framegen, it says it's not a supported feature. But if you switch to FSR or TSR, you can enable framegen. That TSR works with framegen suggests that it is using FSR 3.1, but the devs didn't build in a way to say "please use FSR framegen with DLSS upscaling."

I don't know if there's an easy way to prove whether a game is using FSR 3 or FSR 3.1, because it's usually compiled into the application. XeSS and DLSS have DLL files where you can easily see the revision, and the presence of XeSS 1.3 suggests the devs are pretty up to date on the upscaling libraries. I still think it looks like there's a lot of work needed to improve the FSR image quality, though... which could mean it's using 3.0, or it could just be that no tuning was done.
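Here's a rough sketch of checking those DLL revisions yourself on a Windows install (the filenames are the usual DLSS/XeSS ones, which I haven't confirmed for this game's folder; FSR has nothing comparable to inspect since it's compiled in):

```python
# Rough sketch: read the version resource stamped into upscaler DLLs next to
# a game's executable. Windows-only; requires pywin32 (pip install pywin32).
# The filenames below are the usual DLSS/XeSS ones, not confirmed for Wukong.
import win32api

def dll_version(path: str) -> str:
    # "\\" requests the fixed file info block, which holds the version fields.
    info = win32api.GetFileVersionInfo(path, "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"

for dll in ("nvngx_dlss.dll", "libxess.dll"):
    try:
        print(dll, "->", dll_version(dll))
    except Exception as err:  # file missing or no version resource
        print(dll, "->", err)
```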

(There's also a sharpening filter in Unreal Engine 5 that's very overblown in Black Myth Wukong, and perhaps that's part of why FSR looks worse — a bad sharpening filter, before or after FSR upscaling, isn't likely to produce a great result.)
 
Reactions: Makaveli
Hello and thank you for your work.
I think this marks the first Tom's Hardware GPU review where native rendering is completely ignored (thanks to the incredibly disastrous UE5 "game" engine and Nvidia's predatory corporate power, we now must abandon common sense as well and do as corporations tell us: just turn on fake frames, because nowadays games will be a slideshow without them).
"Native rendering" is still just words. Games in the past have used tweaks to reduce the number of computations a GPU has to do. VRS is another way to make things run faster. If it really makes you feel better, just pretend that my testing was done at 720p, 960p, and 1440p.
The headline is extremely misleading (though the review is very informative, fair, and precise as usual):

It would be fair to say:
AMD, Intel, and the rest of the under-$1-2K GPUs will want to stick to rasterization rather than full ray tracing.

This headline claims all RTX cards are capable of smoothly running this game with RT; no, they are not (as the review shows). And even those 4090 enjoyers are subjected to the unplayable gaming stutter that plagues UE5 games (and is intensified when RT is turned on; see DigitalFoundry's review of this game for more info).
Many Nvidia GPUs can provide a playable experience with full RT... it just depends on what level of RT you want to use, what level of upscaling, and what performance you'd like to achieve. RTX 3080 10GB breaks 30 fps at 1080p with the maximum RT setting, and RTX 4070 is faster than the 3080. Is 53 fps on average playable? I'd say yes, which means a $550 Nvidia GPU will suffice. Even the 4060 Ti manages 40 fps at max settings and 1080p, so technically a $400 GPU will suffice.

I did not notice the stuttering that DF references much, if at all, though I haven't spent a ton of time just playing (because of the time required to run all these benchmarks). Preview code or preview drivers could be to blame. The final game seems to run pretty well, with only an occasional stutter (though admittedly on a very high-end PC).
Nvidia, who has lost touch with reality, is moving the gaming industry toward a "rich man's hobby" state. Its marketing and mouthpieces have been telling us for years now that RT is the "future". After 8 years it still remains in the "future", apparently, thanks to @valthuer's precise HARDWARE review a few posts above:

So yeah, Nvidia's promoted and optimized game, running on a $2K Nvidia GPU with all fake frames disabled, is capable of an average 22 FPS at the hardware level. Thank you for the complementary review!
A $1700 GPU, running at 4K native without framegen, at maximum settings, gets 22 fps. Which is completely unplayable, I think we can all agree, so no one will use those settings for the time being. I think we're two generations away from a GPU that can do Black Myth's full RT at max settings at native 4K with a reasonably playable result — meaning, not the Blackwell RTX 5090, but the whatever-it's-called RTX 6090 might suffice.
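If you want the back-of-the-envelope version (both numbers are my assumptions: a ~60 fps bar for "reasonably playable" and roughly a 1.7x flagship gain per generation in heavy RT):

```python
# Back-of-the-envelope for "two generations away." Both numbers below are
# assumptions: a ~60 fps playability bar and a ~1.7x flagship gain per gen.
fps, target, per_gen = 22.0, 60.0, 1.7

gens = 0
while fps < target:
    fps *= per_gen
    gens += 1

print(gens, round(fps, 1))  # -> 2 generations, ~63.6 fps
```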

And if the AI bubble doesn't pop, GPUs are very much going to be more expensive going forward. Because why make GPUs and sell them as consumer gaming cards for $500 if you can make the same GPU into an AI GPU and charge $1000+? I won't be surprised if Nvidia announces professional Blackwell GPUs before the RTX 5090 and 5080.
 
Running the game at maximum settings at native rendering is a good way to further reduce performance, if you like lower fps for whatever reason.

Guess we have now officially entered the era where DLSS/XeSS/FSR ("AI", essentially) is not only a requirement, but choosing not to lower your quality settings and use "AI" upscaling is considered wrong... Gaming in 2024...
 
Guess we have now officially entered the era where DLSS/XeSS/FSR ("AI", essentially) is not only a requirement, but choosing not to lower your quality settings and use "AI" upscaling is considered wrong... Gaming in 2024...
Don't put words in my mouth. I didn't say it was "wrong" to run at native, only that it will further reduce performance from what our benchmarks show. In effect, you're coming back and saying that I'm wrong for using upscaling, which is pretty hypocritical when you think about it.

Since I'm the one choosing what settings to test, I get to decide what makes sense. I'd argue that I've already gone way too far in testing full RT modes, since so many GPUs can't handle those settings, but it's interesting to see just how far we still have to go before full RT can go mainstream. But whether I test with upscaling in all cases, or only at native, or a mix of both? There's no "wrong" answer, only data to analyze.

I ran tests at three target resolutions, all with the same upscaling factor, so in general you can expect to see similar relative performance if you want to run at native. There are image quality differences between DLSS, XeSS, and FSR, but the general performance uplift each offers is pretty similar.

If you want to know how native 1440p runs, look at the 4K upscaled results, since those render at 1440p and then apply FSR/DLSS/XeSS to get to 4K. Native 1440p would be slightly faster than that, since there's overhead with upscaling that wouldn't otherwise be present. Native 4K would probably be less than half the native 1440p performance. Native 1080p will be roughly on par with the 1440p upscaled results, as sketched below.
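As a rough sketch of that rule of thumb (the fps values are placeholders rather than our benchmark numbers, and the 5% overhead credit is a guess):

```python
# Sketch of the native-performance rule of thumb above. The fps values are
# placeholders, NOT benchmark data; the 5% upscaling-overhead credit is a guess.
upscaled_fps = {"1440p": 60.0, "4K": 44.0}  # results at 67% scaling

# At 67% scaling, a 4K target renders internally near 1440p, and a 1440p
# target renders a bit below native 1080p (the overhead roughly evens it out),
# so map each native resolution to the upscaled target with a matching render.
internal_match = {"1440p": "4K", "1080p": "1440p"}

def estimate_native(native_res: str) -> float:
    """Native fps ~= matching upscaled result, plus a small credit for
    skipping the upscale pass itself."""
    return upscaled_fps[internal_match[native_res]] * 1.05

print(f"native 1440p ~ {estimate_native('1440p'):.0f} fps")  # ~46
print(f"native 1080p ~ {estimate_native('1080p'):.0f} fps")  # ~63
```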

"Native" is just another quality knob, and it doesn't always behave in an ideal fashion. And are we talking native with DLAA, native with TAA, native with FSR, or native with XeSS? Those definitely don't all look the same, so it's not equivalent work, though DLAA generally looks the best. There are still plenty of rendering errors / anomalies even at native — different perhaps than what you get with DLSS/FSR/XeSS, but still present.

With demanding games that default to having upscaling enabled, as long as it's supported on all GPUs, I'll likely test that way going forward. Because that's how 95% of gamers will run the game in the end. Catering to the 5% with performance that's often unacceptably slow just so I can complain that a game is too demanding? No, I'd rather use settings that run better even if they look worse, and then discuss how a game looks as a related topic. And also: DLSS Quality mode mostly looks good, and so does XeSS Quality mode. I turn on DLSS when playing games for enjoyment pretty much 100% of the time if it's supported. To each their own.