Nvidia's CUDA is Already 5 Years Old


AbdullahG

Distinguished
[citation][nom]plznote[/nom]Hasn't improved much, has it?[/citation]
From Nvidia:
Heart attacks are the leading cause of death worldwide. Harvard Engineering, Harvard Medical School and Brigham & Women's Hospital have teamed up to use GPUs to simulate blood flow and identify hidden arterial plaque without invasive imaging techniques or exploratory surgery.

The National Airspace System manages the nationwide coordination of air traffic flow. Computer models help identify new ways to alleviate congestion and keep airplane traffic moving efficiently. Using the computational power of GPUs, a team at NASA obtained a large performance gain, reducing analysis time from ten minutes to three seconds.

Adobe, Microsoft, and others use CUDA in several of their apps. CUDA also comes in handy in game and graphics development. The benefits go beyond that, too. Just because "games" and such don't take advantage of CUDA doesn't mean it hasn't helped...
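
If you're curious what "using the computational power of GPUs" actually looks like in code, here's a toy CUDA sketch of my own (purely illustrative; the kernel and numbers aren't from any of the projects above). The CPU copies data to the card, one GPU thread handles each element, and the result comes back:

[code]
#include <cstdio>
#include <cuda_runtime.h>

// y[i] = a * x[i] + y[i], with one GPU thread per element.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float *hx = new float[n], *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device-side copies.
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);  // expect 5.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
[/code]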

 

TheGuardian

Distinguished
Aug 30, 2011
CUDA also makes it possible for smaller companies or individuals to pump out higher-quality animations. Using Cycles and just one of my 570s, I can render a frame in about 1/10th the time. A scene that would have taken 10 hours to render now takes only an hour.
 

lordstormdragon

Distinguished
Sep 2, 2011
[citation][nom]alidan[/nom]I'd still rather have an open alternative to CUDA than CUDA... Nvidia has wronged me in the past and I will never forgive them...[/citation]

They haven't personally wronged you, Alidan. Quit your crying.

There already are several open solutions; OpenCL, for one, is coming along nicely. But it requires that the hardware manufacturers meet THEIR spec. With CUDA, the software meets Nvidia's spec.

There are simply things OpenCL cannot do that CUDA can. OpenCL is excellent, of course, but AMD cards and Nvidia cards are simply built differently. It's apples and oranges when it comes to GPU compute performance. In gaming it would be apples to apples, but raw computation is a different monster...
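
To give one concrete example (a toy sketch of mine, and assuming OpenCL 1.1, whose kernel language is a C dialect): CUDA device code is C++, so you can write a kernel once as a template and instantiate it for float, double, int, and so on. OpenCL kernels can't do that:

[code]
#include <cuda_runtime.h>

// A templated CUDA kernel: one definition covers every element type.
// OpenCL 1.x kernels, written in C, have nothing equivalent.
template <typename T>
__global__ void scale(int n, T a, T *data) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= a;
}

int main() {
    const int n = 256;
    float *d;
    cudaMalloc((void **)&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));

    // The compiler instantiates scale<float> here; scale<double> or
    // scale<int> would come from the exact same source.
    scale<<<1, n>>>(n, 2.0f, d);
    cudaDeviceSynchronize();

    cudaFree(d);
    return 0;
}
[/code]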

If you'd ever used iRay (mental images) or Vray-RT for GPU rendering, you'd know the difference. Since you haven't, that gives you something to research.

 

mark0718

Distinguished
Jul 18, 2008
"the remaining 5 to 10 percent of performance that is hidden in a GPU can only be accessed via detailed knowledge of the architecture of the GPU, especially its memory architecture."

Since when is 10% a big deal?
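
For context, this is the sort of memory-architecture detail that quote means (a sketch of my own, assuming an n-by-n float matrix and 32x32 thread blocks): a naive matrix transpose next to one staged through on-chip shared memory. Both kernels move exactly the same data; the tiled one just lines its global reads and writes up the way the memory system wants them:

[code]
#include <cuda_runtime.h>

#define TILE 32

// Naive transpose: the reads are coalesced, but the writes stride
// across memory, wasting bandwidth on most GPUs.
__global__ void transpose_naive(int n, const float *in, float *out) {
    int x = blockIdx.x * TILE + threadIdx.x;
    int y = blockIdx.y * TILE + threadIdx.y;
    if (x < n && y < n) out[x * n + y] = in[y * n + x];
}

// Tiled transpose: stage each tile in shared memory so both the
// global read and the global write are coalesced. The "+ 1" pad
// sidesteps shared-memory bank conflicts, which is exactly the
// kind of detail the quoted sentence is talking about.
__global__ void transpose_tiled(int n, const float *in, float *out) {
    __shared__ float tile[TILE][TILE + 1];
    int x = blockIdx.x * TILE + threadIdx.x;
    int y = blockIdx.y * TILE + threadIdx.y;
    if (x < n && y < n) tile[threadIdx.y][threadIdx.x] = in[y * n + x];
    __syncthreads();
    x = blockIdx.y * TILE + threadIdx.x;  // swap block coordinates
    y = blockIdx.x * TILE + threadIdx.y;
    if (x < n && y < n) out[y * n + x] = tile[threadIdx.x][threadIdx.y];
}

int main() {
    const int n = 1024;
    float *in, *out;
    cudaMalloc((void **)&in,  n * n * sizeof(float));
    cudaMalloc((void **)&out, n * n * sizeof(float));

    dim3 block(TILE, TILE);
    dim3 grid(n / TILE, n / TILE);
    transpose_naive<<<grid, block>>>(n, in, out);
    transpose_tiled<<<grid, block>>>(n, in, out);  // same output, more bandwidth
    cudaDeviceSynchronize();

    cudaFree(in); cudaFree(out);
    return 0;
}
[/code]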
 

_Cubase_

Distinguished
Jun 18, 2009
[citation][nom]caedenv[/nom]quite a difference in Adobe's Mercury engine between CUDA being on and off![/citation]

Amen to that! CUDA has been a dream for us video editors using Premiere! Not needing to render much (if at all) has enabled my CUDA cards to pay for themselves over and over.
 
[citation][nom]AbdullahG[/nom]From Nvidia: Adobe, Microsoft, and others use CUDA in several of their apps. CUDA also comes in handy in game and graphics development. The benefits go beyond that, too. Just because "games" and such don't take advantage of CUDA doesn't mean it hasn't helped...[/citation]

I'm pretty certain that PhysX uses CUDA, so in reality, games do take advantage of it.
 

tomfreak

Distinguished
May 18, 2011
Common software for casual consumers is still lacking support. They need to push something in this area, not just development software or university research software.
 
CUDA has made some inroads in the five years since, but it could be so much better. I'm actually surprised that resource-intensive software like Mainconcept hasn't added CUDA support yet.
 