IBM Files Patent For GPU-Accelerated Databases

Status
Not open for further replies.

gokanis

Distinguished
Apr 26, 2011
233
0
18,690
1
Interesting, and maybe even very useful considering a GPU's power, but have they actually developed it yet? You shouldn't be able to patent something without a product. That's what's wrong with the system. If I patented "slim", I could sue thousands of anorexics. Oh wait, Apple has that one.
 
G

Guest

Guest
Me too, even more so after reading the book IBM and the Holocaust.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
0
[citation][nom]gokanis[/nom]Interesting, and maybe even very useful considering a GPU's power, but have they actually developed it yet? You shouldn't be able to patent something without a product. That's what's wrong with the system. If I patented "slim", I could sue thousands of anorexics. Oh wait, Apple has that one.[/citation]

Did they patent a process?
 
G

Guest

Guest
So, it relies on having a table already resident in GPU memory, one small enough to fit there in the first place? And have they figured out a way to make GPUs useful for this? Current GPUs are basically good at floating-point calculations that are easily parallelized, not integer/logic operations. SQL is almost entirely integer/logic work unless your query or insert contains a floating-point math expression, which probably isn't often.

Barring the wrong-tool-for-the-job argument, it could work OK as long as your entire database fits in one GPU's memory, or you can at least fit one table per GPU and only ever need to query a single table at a time.
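To be fair, the per-row predicate test in a WHERE clause is embarrassingly parallel even though it's integer work. A minimal CPU-side sketch (NumPy standing in for a thread-per-row GPU kernel; the column name and values are made up for illustration):

```python
import numpy as np

# Hypothetical "age" column of a table, one value per row.
ages = np.array([12, 25, 40, 70, 18, 64, 65], dtype=np.int32)

# SELECT COUNT(*) WHERE age >= 18 AND age < 65, expressed as one
# data-parallel predicate: every row is tested independently, which is
# exactly the thread-per-row pattern a GPU kernel would use, and it is
# pure integer comparison work, no floating point involved.
mask = (ages >= 18) & (ages < 65)
count = int(mask.sum())   # rows 25, 40, 18, 64 match -> 4
```

So the parallelism is there; whether the integer throughput and memory limits of a real card make it worthwhile is the open question.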
 

killerclick

Distinguished
Jan 13, 2010
1,563
0
19,790
2
They've been promising GPU-based processing for at least five years, and so far it hasn't made an impact anywhere. 3ds Max uses it in a very limited way, as does Photoshop.
 
G

Guest

Guest
Killerclick: The number one thing holding it back is the need to copy data from main memory into the GPU's memory. Even workloads that parallelize well must still overcome the huge latency of copying data back and forth. That's why AMD's Llano and all APUs going forward share memory with the CPU and have direct links to the cores, to make GPGPU more feasible for the calculations that can benefit from it.
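The copy overhead is easy to put numbers on. A back-of-envelope sketch (all figures below are assumptions for illustration, not benchmarks):

```python
# Why the host-to-GPU copy can dominate a one-off query.
table_bytes = 1 * 2**30      # 1 GiB table
pcie_bw     = 8e9            # ~8 GB/s, roughly PCIe 2.0 x16
gpu_scan_bw = 100e9          # assumed on-card scan rate, bytes/s
cpu_scan_bw = 10e9           # assumed in-RAM CPU scan rate, bytes/s

copy_s = table_bytes / pcie_bw       # host -> device transfer
gpu_s  = table_bytes / gpu_scan_bw   # scan once the table is resident
cpu_s  = table_bytes / cpu_scan_bw   # scan in place, no copy at all

# For a single scan, copy + GPU scan loses to the plain CPU scan;
# the GPU only pays off if the table stays resident across many queries.
gpu_once = copy_s + gpu_s
```

With these numbers the copy alone takes longer than the entire CPU scan, which is exactly why shared-memory APUs change the calculus.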
 

memadmax

Distinguished
Mar 25, 2011
2,492
0
19,960
95
Great for a couple of small DBs, but beyond that you'll tax it too much. Why?

Unless you have a dedicated GPU on the same bus as the main processor with access to main RAM, you still have to go through the main CPU, through the main bus (PCIe), into the GPU, back out onto the main bus, back through the CPU, and out to main RAM or storage.

The End.
 

Novulux

Distinguished
Sep 15, 2011
127
0
18,710
5
[citation][nom]ibmtrollpatent[/nom]Me too, even more so after reading the book IBM and the Holocaust.[/citation]


Yeah, I just learned about what the Nazis used their machines for too.
 
[citation][nom]memadmax[/nom]Great for a couple of small DBs, but beyond that you'll tax it too much. Why? Unless you have a dedicated GPU on the same bus as the main processor with access to main RAM, you still have to go through the main CPU, through the main bus (PCIe), into the GPU, back out onto the main bus, back through the CPU, and out to main RAM or storage.[/citation]

Since the days of AGP (maybe earlier), memory transfers to the video card have not had to go through the CPU. Windows allocates a set area of RAM for the video card to access directly; the card is assigned DMA (direct memory access) the same as a hard drive.
So the video card can read or write memory directly, regardless of what the CPU is doing.
 

LORD_ORION

Distinguished
Sep 12, 2007
814
0
18,980
0
Except that anyone who has ever played with CUDA has already made extensions for pretty much every open-source database, and they do precisely that.
 

acadia11

Distinguished
Jan 31, 2010
899
0
18,980
0
[citation][nom]LORD_ORION[/nom]Except that anyone who has ever played with CUDA has already made extensions for pretty much every open-source database, and they do precisely that.[/citation]

Which makes me wonder how this got granted. There needs to be more scrutiny in the patent process.
 

bit_user

Splendid
Ambassador
Search on gpgpu.org and you'll find papers about running DB queries on GPUs dating back to 2004. The subject was even covered in a SIGGRAPH 2005 course on GPGPU.

What I wonder is how much of their patent is even specific to GPUs. How does it materially differ from query processing on other parallel architectures? Could you substitute SMP or NUMA for GPU and find a dozen other patents just like it?

IBM patents a _lot_ of stuff. They might even believe they invented some of it.
 

izmanq

Distinguished
May 24, 2009
125
0
18,680
0
Hell, this patent should not be approved. Why should anyone need to pay a license just to use a particular processing unit (in this case a GPU) to do something? :|
 

danwat1234

Distinguished
Jun 13, 2008
1,392
0
19,310
9
[citation][nom]memadmax[/nom]Great for a couple of small DBs, but beyond that you'll tax it too much. Why? Unless you have a dedicated GPU on the same bus as the main processor with access to main RAM, you still have to go through the main CPU, through the main bus (PCIe), into the GPU, back out onto the main bus, back through the CPU, and out to main RAM or storage. The End.[/citation]

Yes, the article seems to assume that database access is compute-bound rather than I/O-bound.
If that's true, then it might make for a good GPU application.
The onboard RAM on GPUs can greatly reduce I/O, especially if most of the index fits in that RAM. PCI Express traffic would then be reduced, and the latency there would be less of an issue.
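A rough sketch of that amortization argument (all sizes and bandwidths below are assumed for illustration): if the index is copied to the card once and stays resident, each subsequent query only moves the small request and its matching results over PCIe, not the whole structure.

```python
# One-time index upload vs. tiny per-query traffic (assumed numbers).
index_bytes  = 512 * 2**20    # 512 MiB index, copied to the card once
per_query_io = 64 * 2**10     # 64 KiB of result row IDs per query
pcie_bw      = 8e9            # ~8 GB/s over PCIe

one_time_copy = index_bytes / pcie_bw     # paid once, up front
per_query_s   = per_query_io / pcie_bw    # paid on every query

# Amortized over many queries, the up-front copy fades away and the
# bus carries almost nothing per lookup.
queries  = 10_000
avg_io_s = (one_time_copy + queries * per_query_s) / queries
```

Under these assumptions the average per-query bus time is microseconds, which is why a resident index sidesteps the copy-overhead objection raised earlier in the thread.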
 

back_by_demand

Splendid
BANNED
Jul 16, 2009
4,822
0
22,780
0
[citation][nom]ibmtrollpatent[/nom]Me too, even more so after reading the book IBM and the Holocaust.[/citation]
Do you like Christianity?
You realise the Pope during WW2 was a Nazi sympathiser.
http://en.wikipedia.org/wiki/Hitler's_Pope
If you are going to demonise an organisation for activities from nearly 70 years ago, you have to do the same for every other organisation too.
Level playing field, that's how it works.

By the way:
"However, IBM is not surprisingly trying to protect its patent, if granted, in other programming languages"
That's because IBM is not Apple, and will be happy for this to proliferate and help everyone.
 
