Nvidia's CUDA is Already 5 Years Old

plznote said: "Hasn't improved much, has it?"
From Nvidia:
Heart attacks are the leading cause of death worldwide. Harvard Engineering, Harvard Medical School and Brigham & Women's Hospital have teamed up to use GPUs to simulate blood flow and identify hidden arterial plaque without invasive imaging techniques or exploratory surgery.

The National Airspace System manages the nationwide coordination of air traffic flow. Computer models help identify new ways to alleviate congestion and keep airplane traffic moving efficiently. Using the computational power of GPUs, a team at NASA obtained a large performance gain, reducing analysis time from ten minutes to three seconds.

Adobe, Microsoft, and others use CUDA in several of their apps. CUDA also comes in handy in game and graphics development. The benefits also go beyond that. Just because "games" and such don't take advantage of CUDA doesn't mean it hasn't helped...

 
CUDA also makes it possible for smaller companies or individuals to pump out higher-quality animations. Using Cycles and only one of my 570s, I can render a frame in about a tenth of the time. A scene that would have taken 10 hours to render now takes only an hour.
 
alidan said: "I'd still rather have an open alternative to CUDA than CUDA itself... Nvidia has wronged me in the past and I will never forgive them..."

They haven't personally wronged you, Alidan. Quit your crying.

There are already several open solutions. OpenCL, for one, is coming along nicely, but it requires that the hardware manufacturers meet THEIR spec. With CUDA, the software meets Nvidia's spec.

There are simply things CUDA can do that OpenCL cannot. OpenCL is excellent, of course, but AMD cards and Nvidia cards are built differently. When it comes to GPU compute performance, it's apples and oranges. In gaming it would be apples to apples, but raw computation is a different monster...

If you've ever used iray (from mental images) or V-Ray RT for GPU rendering, you'd know the difference. If you haven't, that's a good place to start researching.
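
To make that concrete, here is a minimal sketch of what "the software meets Nvidia's spec" looks like in practice: a hypothetical vector-add kernel, invented for illustration (it is not taken from iray, V-Ray RT, or any app mentioned above). It compiles with nvcc and runs only on Nvidia hardware, which is exactly the trade-off being described:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Hypothetical kernel: each thread adds one pair of elements.
    __global__ void vecAdd(const float *a, const float *b, float *c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main()
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Host buffers.
        float *hA = new float[n], *hB = new float[n], *hC = new float[n];
        for (int i = 0; i < n; ++i) { hA[i] = 1.0f; hB[i] = 2.0f; }

        // Device buffers -- this is the part that only runs on Nvidia GPUs.
        float *dA, *dB, *dC;
        cudaMalloc(&dA, bytes);
        cudaMalloc(&dB, bytes);
        cudaMalloc(&dC, bytes);
        cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);

        // One thread per element, 256 threads per block.
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(dA, dB, dC, n);

        cudaMemcpy(hC, dC, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f\n", hC[0]);  // expect 3.0

        cudaFree(dA); cudaFree(dB); cudaFree(dC);
        delete[] hA; delete[] hB; delete[] hC;
        return 0;
    }

An OpenCL version of the same thing would carry its own host API and typically compile the kernel source at runtime for whatever device it finds, which is part of why the two ecosystems feel so different to developers.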

 
"the remaining 5 to 10 percent of performance that is hidden in a GPU can only be accessed via detailed knowledge of the architecture of the GPU, especially its memory architecture."

Since when is 10% a big deal?
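
Fair question, but that last 5 to 10 percent comes from exactly the kind of tuning the quote describes, and in memory-bound code the gap can be larger. As a rough sketch of what "knowledge of the memory architecture" means, here is a hypothetical block-level sum (invented for illustration, not from the article) that stages data in fast on-chip shared memory instead of re-reading slow global memory:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Hypothetical kernel: each block sums 256 elements through on-chip
    // shared memory -- the architecture-specific tuning the quote alludes to.
    __global__ void blockSum(const float *in, float *out, int n)
    {
        __shared__ float tile[256];              // fast per-block scratchpad

        int i = blockIdx.x * blockDim.x + threadIdx.x;
        tile[threadIdx.x] = (i < n) ? in[i] : 0.0f;
        __syncthreads();

        // Tree reduction in shared memory: halve the active threads each
        // step, so global memory is read only once per input element.
        for (int s = blockDim.x / 2; s > 0; s >>= 1) {
            if (threadIdx.x < s)
                tile[threadIdx.x] += tile[threadIdx.x + s];
            __syncthreads();
        }

        if (threadIdx.x == 0)
            out[blockIdx.x] = tile[0];           // one partial sum per block
    }

    int main()
    {
        const int n = 1 << 20, threads = 256;
        const int blocks = (n + threads - 1) / threads;

        float *dIn, *dOut;
        cudaMalloc(&dIn, n * sizeof(float));
        cudaMalloc(&dOut, blocks * sizeof(float));

        float *hIn = new float[n];
        for (int i = 0; i < n; ++i) hIn[i] = 1.0f;
        cudaMemcpy(dIn, hIn, n * sizeof(float), cudaMemcpyHostToDevice);

        blockSum<<<blocks, threads>>>(dIn, dOut, n);

        float *hOut = new float[blocks];
        cudaMemcpy(hOut, dOut, blocks * sizeof(float), cudaMemcpyDeviceToHost);

        float total = 0.0f;
        for (int b = 0; b < blocks; ++b) total += hOut[b];
        printf("sum = %f\n", total);             // expect 1048576.0

        cudaFree(dIn); cudaFree(dOut);
        delete[] hIn; delete[] hOut;
        return 0;
    }

A naive version would go back to global memory at every step of the reduction; the shared-memory version touches it once per element, which is where that extra performance hides.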
 
caedenv said: "Quite a difference in Adobe's Mercury Playback Engine between CUDA being on and off!"

Amen to that! CUDA has been a dream for us video editors using Premiere! Not needing to render much (if at all) has enabled my CUDA cards to pay for themselves over and over.
 
AbdullahG said: "From Nvidia: Adobe, Microsoft, and others use CUDA in several of their apps. CUDA also comes in handy in game and graphics development. The benefits also go beyond that. Just because 'games' and such don't take advantage of CUDA doesn't mean it hasn't helped..."

I'm pretty certain that PhysX uses CUDA, so in reality, games do take advantage of it.
 
Common software for casual consumers is still lacking support. Nvidia needs to push in this area too, not just developer tools and university research software.
 