CUDA-Enabled Apps: Measuring Mainstream GPU Performance

I use Badaboom to get video onto iPods all the time... and also to transcode my camcorder's 1080p AVCHD home movies.

CUDA RULES, because I have a use for it. Makes my Q6600 feel like an 8088.

Don't knock the technology just because you can't use it.

Great article; it gave me an overview of the tech and the apps so I can invest my few $$$ in the right software.

Thanks
 
CUDA has failure written all over it, man. Fail... just FAIL.
 
For me this is a pretty boring article...
The only thing I learned was that CUDA technology is useless unless you do some video encoding.
It does not help in gaming, or with responsiveness in Windows Vista.
No wonder the mainstream market has so few people wanting it.

Perhaps it would be nice if CUDA were able to simulate a third processor (a separate stream) in the OS.
That way, dual-core machines could use the simulated CPU as an additional core and perhaps benefit more from it in daily activities.
Especially on mini netbooks like Atom-based notebooks, or other low-power devices. They need a graphics card and GPU anyway, so once equipped, why not make good use of it?
 
A CUDA GPU cannot be used as a regular processor because it is not an x86 processor. Hardware is designed to accommodate a specific instruction set; you cannot take just any "processor" (stream or CPU) and run the same code on it. That is why the Nvidia stream processors only run H.264 right now. You might be able to write a compiler that can target both, but I'm guessing that would end up being a lot more trouble than it's worth.

Bottom line is, different hardware uses different instruction sets, which means radically different assembly-level code for the processor cores.
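
For illustration, here is a minimal sketch (a hypothetical example, not taken from the article) of what CUDA code actually looks like. The kernel below is device code that nvcc compiles for the GPU's own instruction set, while everything in main() is ordinary host code running on the x86 CPU; neither side can simply execute the other's binary.

// Minimal CUDA vector-add sketch (hypothetical example).
#include <cstdio>
#include <cuda_runtime.h>

// Device code: each GPU thread adds one pair of elements.
__global__ void addKernel(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1024;
    const size_t bytes = n * sizeof(float);

    // Host (CPU) buffers.
    float *ha = new float[n], *hb = new float[n], *hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = float(i); hb[i] = 2.0f * i; }

    // Device (GPU) buffers live in the card's own memory.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // The <<<blocks, threads>>> launch syntax only exists in CUDA C;
    // a plain x86 compiler has no idea what to do with it.
    addKernel<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[10] = %f\n", hc[10]);  // expect 30.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}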
 
Yeah, to the Premiere guy: I use Premiere CS4 for editing too and don't consider myself a noob, but when I have created my huge files and need them crunched down to share with my very noob family and friends, BADABOOM BABY!, and it's done. I wouldn't even think of doing the conversion with anything else. CUDA is a tool that works. Just because people haven't yet found new ways to use it doesn't mean it's just a worthless marketing ploy, as some folks think. Use it once before you declare your "expertise" and you will change your mind.
 
Good article. I was not aware there were even this many apps for CUDA at this point, and I have also been curious about Badaboom. I look forward to seeing an article about Stream vs. CUDA at some point in the future if ATI can git'er done.
 
There is a good-sized portion of CUDA apps, though just about all of them are transcoding or something to that effect :)
http://www.nvidia.com/object/cuda_home.html#

Stream could catch on; it's just that its hardware support is limited to the 4000 series, right? Compare that with Nvidia, which has CUDA support going as far back as the GeForce 8 series, now three years old, versus hardware that is just one year old. Many more people have CUDA-enabled hardware than Stream-capable hardware, and Nvidia seems to actually push for CUDA development, while you rarely hear anything about ATI's Stream past its release.
 
Could anyone tell me which card would have a faster processing time?

GeForce 8800 GTS 640/320 MB (320-bit / 96 cores)
or the
GeForce 9800 GTX 512 MB (256-bit / 128 cores)

Which one would take the crown? The one with the wider memory interface or the one with more cores? :?

Thanks
 
The 8800 GTS doesn't support H.264 transcoding (decoding/encoding) :) The G80-based 8800 GTS, GTX, and Ultra are not supported.

What about 3 GHz vs. 675 MHz? Want to feel puny? Let's compare transistor counts of GPU vs. CPU; I bet the GPU has more :) Anyway, I believe CUDA is aimed at floating-point applications, although I think it lacks double precision.

http://www.nzone.com/page/nzone_section_andmore.html

Good list of CUDA applications.
 
CUDA is a solution to a problem that shouldn't exist. Just give us general-purpose vector floating-point co-processors (like Larrabee) so we don't have to hack pixel shaders and shit.
 
[citation][nom]warezme[/nom]Just because people haven't yet found new ways to use it doesn't mean it's just a worthless marketing ploy, as some folks think.[/citation]
The concept of GPGPU is not worthless, but CUDA itself is proprietary and probably won't stand in the long term against OpenCL. Of course it may; it's impossible to know for sure.
 
I don't get it: is this an article about CUDA performance vs. CPU-only performance, or an article comparing two Nvidia cards?

If this is about CUDA performance, the presentation of the results is terrible. The "with GPU" and "without GPU" results are in two separate boxes using two different scales, making it difficult to "see" the real performance advantage that CUDA brings....
 
For that guy who said Photoshop CS4 can't run on GeForce cards: it can run on every GeForce card. It uses OpenGL acceleration, not CUDA, and the Radeon HD 4850 is also supported.

 
distributed.net's beta RC5-72 CUDA client would be another good non-video data point. There is an enormous speedup when CUDA is used.
 
Fair enough, CUDA shows promise... but until there is a standard that you code to, it'll never really catch on (except in specialist niches where your data lends itself to being GPU-crunched).

So forget CUDA; more emphasis on OpenCL, please. If that becomes more widespread, with better support, then we might see apps using it everywhere.
 
What's with these wacky prices? I just picked up a second GTX 260 for $120 shipped. A 9600 GT for $120? Bahaha, some sucker fell for that.
 
Well, I'm confused. Is the last test transcoding a DVD VOB (720×480i NTSC or 720×576i PAL) to a 1366×720 MP4 file? If so, what is the point? You can't get "extra resolution" from a lower-res source. A better test would be to take something like an HD off-air recording and transcode it to something like 540p or DVD quality (something I do a lot).
If this isn't what happened in the test, then what were the output file specs?
 
"Still, you can see that CUDA gives us over a 100% improvement on the 480 test and a nearly 300% improvement when encoding to 1080."

No, I can't see that!
I can only read that, since the graphs do not have the same scale!
Hey, maybe the results with & without acceleration should even be shown on the same graph!
 
I have tried Badaboom, and it takes longer to encode a movie with my GTX 260/216 than HandBrake does with just my i7 920 at stock. Now, if it used both the CPU and GPU like my CyberLink software does, that would be awesome.
 
[citation][nom]Shadow703793[/nom]SETI@Home is a dead cause. Those people need to donate their CPU/GPU power to Folding@Home.[/citation]

They're both useless if you ask me. How many years has Folding@Home been around? And how many diseases has it cured?
 