So what you're actually saying is that he wasn't running the Bedrock version of Minecraft via the Microsoft Store, but was instead using the Java edition, which does a completely different form of ray tracing approximation. Basically, Java edition plus shader-based RT works differently and isn't directly comparable (in performance) to the DXR method used in Minecraft RTX. I tried it on a friend's system using an AMD card, that's why I'm confused.
Edit: I asked him and he said he was using shaders. Also, why run Minecraft RT on a GPU without either using an upscaler or shaders? I know this is a test, but RTX cards come with DLSS support out of the box in the drivers.
As to why DLSS wasn't enabled, it's because the game was Nvidia-promoted and only supports DLSS. Yes, that does provide an advantage to Nvidia RTX GPUs. I already get enough *^&$% about favoring Nvidia, though, and enabling DLSS in games where it's supported but FSR 2/3 isn't would open the door wide to all sorts of abuse, just as only enabling FSR 2/3 where it's supported would open the door to abuse.
Imagine a test suite that "randomly" chooses only games that support DLSS! Or a suite that only has AMD-promoted games with FSR 2 or later. So, my decision is to only show upscaling performance for all GPUs if a game supports at least both FSR 2/3 and DLSS 2 or later, and ideally XeSS 1.2 or later as well.
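To make the policy above concrete, here's a minimal sketch of the inclusion rule as code. The function name and the set-of-strings representation are my own invention for illustration; the rule itself is just what's described: upscaled results are shown only when a game supports both FSR 2/3 and DLSS 2 or later, with XeSS 1.2+ as a nice-to-have rather than a requirement.

```python
# Hypothetical sketch of the upscaler benchmarking policy described above.
# Names and data representation are invented for illustration.

def show_upscaled_results(supported: set[str]) -> bool:
    """Return True if upscaled performance should be reported for a game.

    Requires cross-vendor support: both an FSR 2/3 option and a
    DLSS 2+ option must be present. XeSS 1.2+ is ideal but optional.
    """
    has_fsr = any(u in supported for u in ("FSR 2", "FSR 3"))
    has_dlss = any(u in supported for u in ("DLSS 2", "DLSS 3"))
    return has_fsr and has_dlss


# A vendor-exclusive game gets native-only results:
print(show_upscaled_results({"DLSS 2"}))           # False
# A game with both vendors' upscalers gets upscaled results too:
print(show_upscaled_results({"FSR 2", "DLSS 2"}))  # True
```

The point of requiring both upscalers is symmetry: neither a DLSS-only nor an FSR-only game can tilt the suite toward one vendor.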