News: Nvidia to drop CUDA support for Maxwell, Pascal, and Volta GPUs with the next major Toolkit release

It is hard to comprehend what this means in the grand scheme of things.

Are these CUDA programs things that us nominal game users need to be concerned with or is this purely about programs that use the GPU for processing other things?

I mean, it would totally suck to have a nice working 1080P system with an older GPU only to not be able to play the latest games that come out.
 
I mean, it would totally suck to have a nice working 1080P system with an older GPU only to not be able to play the latest games that come out.
Yeah, I'm wondering too, for things like MAME, HTPC, and server applications where you might need to do some light transcoding, and buying new hardware isn't necessary for such light loads. I've had to ditch old hardware before due to lack of driver support; it just wasn't worth the time.
 
This is NOT a big deal in the grand scheme of things.

CUDA is not used in games.

Current versions of CUDA apps will continue to function on the older cards.

Most CUDA apps will eventually transition to the new toolkit. When that happens, they will drop support for older cards in new releases of the program.
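To illustrate why existing binaries keep working (just a quick sketch, not tied to any particular app): the installed driver advertises the newest CUDA version it can serve, each binary carries the runtime version it was built against, and the driver stays backward compatible with older runtimes.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int driverVer = 0, runtimeVer = 0;

    // Newest CUDA version the installed driver can serve (e.g. 12040 means 12.4).
    cudaDriverGetVersion(&driverVer);

    // CUDA runtime version this binary was built/linked against.
    cudaRuntimeGetVersion(&runtimeVer);

    printf("Driver supports up to CUDA %d.%d\n", driverVer / 1000, (driverVer % 1000) / 10);
    printf("App was built against CUDA %d.%d\n", runtimeVer / 1000, (runtimeVer % 1000) / 10);

    // As long as the driver version is >= the app's runtime version, an old
    // binary keeps running on whatever GPUs that driver still supports.
    return (driverVer >= runtimeVer) ? 0 : 1;
}
```

So old builds of an app keep running on a Maxwell, Pascal, or Volta card for as long as a driver that supports those cards is installed; it's only new builds, made against the new toolkit, that lose them.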
 
It is hard to comprehend what this means in the grand scheme of things.
Nvidia typically maintains a couple of CUDA release branches at a time. Someone could still download an older release branch of CUDA and build apps that will run on older GPUs (so long as they don't require features only found on newer ones, like Tensor cores or ray tracing hardware). Also, old releases of apps will still work, because they were built against an older CUDA release.
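As a minimal sketch of that (nothing here is specific to the article): plain CUDA code like the following has no dependency on Tensor cores or other recent hardware, so an older toolkit can still compile it for, say, Pascal by targeting the right compute capability.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Trivial SAXPY kernel: y[i] += a * x[i]. Nothing here needs Tensor cores,
// ray tracing hardware, or any other post-Pascal feature.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] += a * x[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory, available since Kepler
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);                // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

Built with something like `nvcc -gencode arch=compute_61,code=sm_61 saxpy.cu`, that targets Pascal explicitly (sm_50/sm_52 for Maxwell, sm_70 for Volta). Once a future toolkit stops shipping those targets, you'd just keep an older toolkit around to produce binaries for those cards.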

Are these CUDA programs things that us nominal game users need to be concerned with or is this purely about programs that use the GPU for processing other things?

I mean, it would totally suck to have a nice working 1080P system with an older GPU only to not be able to play the latest games that come out.
This is mainly about AI and other GPU compute apps. I think CUDA isn't used by most games.

In general, Linux is really good at supporting legacy hardware. You can play OpenGL and even Direct3D games on some really old GPUs. However, those are standard APIs, while CUDA is something proprietary that Nvidia controls.
 
Yeah, I'm wondering too, for things like MAME, HTPC, and server applications where you might need to do some light transcoding,
Yeah, there are a few different ways to do GPU-based transcoding on Linux. Nvidia prefers you use their proprietary APIs, which I think do have CUDA dependencies. However, VDPAU and VAAPI are standard APIs that I expect wouldn't be affected by this change. So, whether or not it'll break your workflow (i.e. once the apps you mention transition to a newer CUDA version) probably depends on the details.
 
It is hard to comprehend what this means in the grand scheme of things.

Are these CUDA programs things that us nominal game users need to be concerned with or is this purely about programs that use the GPU for processing other things?

I mean, it would totally suck to have a nice working 1080P system with an older GPU only to not be able to play the latest games that come out.
CUDA is entirely about taking Nvidia's 3D graphics power and applying it to a software program. It is never used in games, except for DLSS, which is a post process. I think DLSS is plain ugly and a worthless product compared to antialiasing. CUDA is great for rendering in Blender or accelerating video editing, etc., or you can write a specific computing problem around it. It has nothing to do with games and is for professionals.
I still have a 1070 Ti because new graphics cards are just ridiculously expensive. I don't care about this article because I can always just use the last available toolkit and most programs that use it won't see many changes even when the toolkit sees updates.
The joke is that I will be getting an AMD card next, because I just want VRAM to run a graphics library on. If I write a heavy computing script, I can just get another AMD card for a quarter of the price of an Nvidia card that would be bogged down by its hybrid Tensor cores, and those Tensor cores will be out of date for AI computing in a few years. That means I'd have bought a $1k+ card where half of the graphics chip I want to use is dead-weight e-waste that can only be used to generate stupid cats. Or I can just buy an AMD card with a lot of VRAM and wait six months until a good CUDA-like library is available.
The problem with these companies is that they don't understand that if a consumer wants to do AI, they will be happy to drop $15k on the proper hardware to get an accurate result. If they don't want to do AI, they only want to spend around $600 on a new graphics card to play games or do graphics-library computations like video rendering or Blender. I hope they realize the mistake, but they are obsessed with Tensor cores, just like the ray tracing ads back when you'd have to buy the most expensive rig to bother with it.
 
CUDA is entirely about taking Nvidia's 3D graphics power and applying it to a software program. It is never used in games, except for DLSS, which is a post process.
I'm pretty sure DLSS doesn't use CUDA. A bit of quick searching seems to support this. If you have evidence to the contrary, please provide it.

I don't care about this article because I can always just use the last available toolkit and most programs that use it won't see many changes even when the toolkit sees updates.
If you use programs that depend on CUDA, then what will happen is that they will start to use features that are only found in newer versions of the CUDA toolkit, and will no longer compile with older versions of CUDA. So, unless you want to be stuck using older versions of these programs, you'll eventually be forced to upgrade your hardware.
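A rough sketch of how that usually shows up (the cutoff below is hypothetical, not from the article): once a project rebases onto a newer toolkit, it tends to grow checks like this, and new releases simply refuse to run on cards below whatever the new toolkit still targets.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        fprintf(stderr, "No usable CUDA device found\n");
        return 1;
    }

    // Hypothetical policy an app might adopt after moving to a newer toolkit:
    // require at least Turing (compute capability 7.5), which cuts off
    // Maxwell (5.x), Pascal (6.x), and Volta (7.0).
    int cc = prop.major * 10 + prop.minor;
    if (cc < 75) {
        fprintf(stderr, "%s (compute capability %d.%d) is below this build's minimum\n",
                prop.name, prop.major, prop.minor);
        return 1;
    }

    printf("OK: %s, compute capability %d.%d\n", prop.name, prop.major, prop.minor);
    return 0;
}
```

Your existing builds won't have a cutoff like that, so staying on the last toolkit (and on the last program versions that still support it) works; it just means forgoing whatever the newer releases add.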

The problem with these companies is that they don't understand that if a consumer wants to do AI, they will be happy to drop $15k on the proper hardware to get an accurate result.
I don't know anyone who has $15k to spend on a graphics card for home use. I'm sure there are some, and they will be buying RTX Pro 6000 Blackwell cards, but I think people willing & able to spend that much on a graphics card are few and far between.
 
It is hard to comprehend what this means in the grand scheme of things.

Are these CUDA programs things that us nominal game users need to be concerned with or is this purely about programs that use the GPU for processing other things?

I mean, it would totally suck to have a nice working 1080P system with an older GPU only to not be able to play the latest games that come out.
This is not related to games. Even if you need CUDA, older versions of CUDA are still available. If you need new features, you're going to need new hardware anyway, since those new features aren't even available on 8-11 year old hardware. As for driver support, maybe they will drop Maxwell, Pascal, and Volta soon.