[SOLVED] Can I trick my computer into thinking I have RTX?

akisha_009

Prominent
Mar 19, 2022
19
1
515
I have an Nvidia GTX 1650 GPU and I want to use a program called Nvidia Canvas, but that program requires an RTX GPU. Is there a way to trick my computer into thinking I have an RTX card, just so I can open the program? I don't care if it runs at like 5 FPS, I just want to open it. Or does someone know how to skip the RTX check? Thanks in advance!
 

gamerbrehdy

Honorable
Jun 15, 2018
320
29
10,790
I think you might get away with a BIOS flash of the card. If I recall correctly, this is what they do on Wish with those sketchy 1050 Tis, which are actually something like GT 430s.

But it'd probably lead to crashes, since at that point the card is only telling your system it contains the required hardware, which it physically doesn't.
 

USAFRet

Titan
Moderator
I think you might get away with a BIOS flash of the card. If I recall correctly, this is what they do on Wish with those sketchy 1050 Tis, which are actually something like GT 430s.

But it'd probably lead to crashes, since at that point the card is only telling your system it contains the required hardware, which it physically doesn't.
Even if possible, that won't work.
The application is looking for specific properties from the GPU, not just a label of "RTX".
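To illustrate what that kind of check can look like: NVIDIA hasn't published how Canvas probes the GPU (it's CUDA/tensor-core based), but below is a minimal sketch of a capability query using the D3D12 feature-support API as a stand-in. The point is that an application asks the driver what the hardware can actually do, not what its name string says.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;

    // Create a D3D12 device on the default adapter.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // Ask the driver for an actual capability (the DXR tier), not a "RTX" label.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))
        && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::printf("Driver reports DXR support (tier %d).\n",
                    static_cast<int>(opts5.RaytracingTier));
    } else {
        std::printf("No DXR tier reported by the driver.\n");
    }

    device->Release();
    return 0;
}
```

A renamed BIOS can't change what queries like this return, which is why relabeling the card gets you nowhere.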
 
How would that even work? It's not an arbitrary paper requirement; it uses actual components that are present in the RTX GPUs.
Ooooohhhh, let me tell you a story about the good old times..
https://emulation.gametechwiki.com/index.php/Wrappers
Anyway, you can use a software wrapper to fool your software into thinking the GPU hardware is there, while you actually emulate it on different GPU hardware or even just on CPU power (search for Swiftshader or Mesa3D).
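To make the idea concrete, here's a minimal sketch (Linux with Mesa assumed; the program name is just a placeholder) of pushing a child process onto Mesa's CPU rasterizer. Note this only covers the OpenGL side; it doesn't fake CUDA or tensor cores.

```cpp
// Force Mesa's llvmpipe software rasterizer for a child process, so every
// OpenGL call that program makes runs on the CPU instead of the real GPU.
// LIBGL_ALWAYS_SOFTWARE and GALLIUM_DRIVER are standard Mesa environment
// variables; "the_program" is only a placeholder name.
#include <cstdlib>
#include <unistd.h>

int main() {
    setenv("LIBGL_ALWAYS_SOFTWARE", "1", 1);  // skip the hardware OpenGL driver
    setenv("GALLIUM_DRIVER", "llvmpipe", 1);  // use Mesa's LLVM-based CPU rasterizer
    execlp("the_program", "the_program", static_cast<char*>(nullptr));
    return 1;  // only reached if exec failed
}
```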
 
I'm sure there are ways to trick a program into thinking you have compatible hardware, but unless there's something else that converts what that application wants to do into what the hardware can actually do, it'll just crash. And as far as I know, there's nothing that can convert tensor core features into something that doesn't need tensor cores.
 

akisha_009

Prominent
Mar 19, 2022
19
1
515
I'm sure there are ways to trick a program into thinking you have compatible hardware, but unless there's something else that converts what that application wants to do into what the hardware can actually do, it'll just crash. And as far as I know, there's nothing that can convert tensor core features into something that doesn't need tensor cores.
Yes, I know it will probably crash, but I want to try something: I want to trick the program into thinking that I have an RTX GPU. I know I can't get RTX performance, but I think the GTX 1650 Ti is not that bad.
 

DSzymborski

Curmudgeon Pursuivant
Moderator
Ooooohhhh, let me tell you a story about the good old times..
https://emulation.gametechwiki.com/index.php/Wrappers
Anyway, you can use a software wrapper to fool your software into thinking the GPU hardware is there, while you actually emulate it on different GPU hardware or even just on CPU power (search for Swiftshader or Mesa3D).

True, but it's a relatively new feature; you'd need something more powerful than the original to emulate the RTX card's features. It would be like emulating ray tracing on a GT 710.
 
True, but it's a relatively new feature; you'd need something more powerful than the original to emulate the RTX card's features. It would be like emulating ray tracing on a GT 710.
It depends on the implementation. If they made everything ray-traced just show up as black or as a mesh, you would get decent performance, and since most games only have some ray tracing here and there, it would be acceptable... but also completely pointless, because why not just disable ray tracing?
If you wanted to run Quake in RTX, you would be out of luck.
 
True, but it's a relatively new feature; you'd need something more powerful than the original to emulate the RTX card's features. It would be like emulating ray tracing on a GT 710.
The features of the RTX cards aren't anything that needs emulation per se. The ray tracing portion can be done entirely on shader cores, which is how the GeForce 10 series can play DXR games. The stuff the tensor cores can do can also run on the shaders, but it'll be less efficient.

I mean, heck, you can run Crysis entirely off a CPU using WARP. It won't run at a blazing 60 FPS on 1080p max, but you can still render the game.
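For anyone curious, requesting WARP is literally one parameter in the device-creation call; a minimal sketch (plain Win32/D3D11, nothing Crysis-specific) looks like this:

```cpp
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL level;

    // Ask for WARP, the software rasterizer that ships with Windows, instead of
    // a hardware adapter. Every D3D11 draw call then runs on the CPU.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_WARP, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &device, &level, &context);

    if (SUCCEEDED(hr)) {
        std::printf("WARP device created, feature level 0x%x\n",
                    static_cast<unsigned>(level));
        context->Release();
        device->Release();
    }
    return 0;
}
```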
 

DSzymborski

Curmudgeon Pursuivant
Moderator
The features of the RTX cards aren't anything that needs emulation per se. The ray tracing portion can be done entirely on shader cores, which is how the GeForce 10 series can play DXR games. The stuff the tensor cores can do can also run on the shaders, but it'll be less efficient.

I mean, heck, you can run Crysis entirely off a CPU using WARP. It won't run at a blazing 60 FPS on 1080p max, but you can still render the game.

True, but this guy is talking about doing it on less powerful hardware, not more powerful hardware that's just missing official feature support.