Thread: Using the GPU
Aren’t most databases constrained by I/O? And isn’t PostgreSQL constrained by how fast the kernel can switch between processes under a concurrent load?
From: pgsql-hackers-owner@postgresql.org [mailto:pgsql-hackers-owner@postgresql.org] On Behalf Of Billings, John
Sent: Friday, June 08, 2007 10:55 AM
To: pgsql-hackers@postgresql.org
Subject: [HACKERS] Using the GPU
Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious whether anyone thinks this technology could be used to speed up a database. If so, which part of the database, and what kind of parallel algorithms would be used?
Thanks, sorry if this is a duplicate message.
-- John Billings
"Billings, John" <John.Billings@PAETEC.com> writes: > Does anyone think that PostgreSQL could benefit from using the video > card as a parallel computing device? I'm working on a project using > Nvidia's CUDA with an 8800 series video card to handle non-graphical > algorithms. I'm curious if anyone thinks that this technology could be > used to speed up a database? If so which part of the database, and what > kind of parallel algorithms would be used? There has been some interesting research on sorting using the GPU which could be very interesting for databases. However I think Postgres would be unlikely to go the route of having compiled driver code for every possible video card. It's unlikely to be interesting for database developers until there's some abstract interface designed for these kinds of optimizations which it can use without caring about the specific graphics card. Perhaps this can be done using OpenGL already but I kind of doubt it. -- Gregory Stark EnterpriseDB http://www.enterprisedb.com
2007/6/9, Gregory Stark <stark@enterprisedb.com>:

> There has been some interesting research on sorting using the GPU,
> which could be very interesting for databases.
>
> However, I think Postgres would be unlikely to go the route of having
> compiled driver code for every possible video card. It's unlikely to
> be interesting for database developers until there's some abstract
> interface designed for these kinds of optimizations, one the database
> can use without caring about the specific graphics card.
>
> Perhaps this can be done using OpenGL already, but I kind of doubt it.

OpenGL does have a vendor-neutral shading language, GLSL:

<url:http://en.wikipedia.org/wiki/GLSL>

There are (of course) competing "standards" such as:
<url:http://en.wikipedia.org/wiki/High_Level_Shader_Language>
and:
<url:http://en.wikipedia.org/wiki/Cg_%28programming_language%29>.

greetings,
Nicolas

--
Nicolas Barbier
http://www.gnu.org/philosophy/no-word-attachments.html
Gregory Stark wrote:

> "Billings, John" <John.Billings@PAETEC.com> writes:
>
>> Does anyone think that PostgreSQL could benefit from using the video
>> card as a parallel computing device? I'm working on a project using
>> Nvidia's CUDA with an 8800 series video card to handle non-graphical
>> algorithms. I'm curious whether anyone thinks this technology could
>> be used to speed up a database. If so, which part of the database,
>> and what kind of parallel algorithms would be used?
>
> There has been some interesting research on sorting using the GPU,
> which could be very interesting for databases.

Without knowing a thing about all of this, my first thought is that it
might be useful for GIS and things of that sort.

regards,
Lukas
On Sat, June 9, 2007 07:36, Gregory Stark wrote:

> "Billings, John" <John.Billings@PAETEC.com> writes:
>
>> Does anyone think that PostgreSQL could benefit from using the video
>> card as a parallel computing device? I'm working on a project using
>> Nvidia's CUDA with an 8800 series video card to handle non-graphical
>> algorithms. I'm curious whether anyone thinks this technology could
>> be used to speed up a database. If so, which part of the database,
>> and what kind of parallel algorithms would be used?
>
> There has been some interesting research on sorting using the GPU,
> which could be very interesting for databases.
>
> Perhaps this can be done using OpenGL already, but I kind of doubt it.

GPUs have been used to great effect for spatial joins. And yes, using
OpenGL so that it was portable. I saw a paper about that as an Oracle
plugin a few years back.

It works something like this, IIRC: a spatial join looks for objects
that overlap with the query area. Normally you go through an R-tree
index to identify objects that are in the same general area
(space-filling curves help there). Then you filter the objects you get,
to see which ones actually overlap your query area.

The GL trick inserted an intermediate filter: it set up the objects
found in the R-tree index, and the query area, as 3D objects, then used
GL's collision detection to find apparent matches. It has to be
slightly conservative, because GL doesn't make the sort of guarantees
you'd want for this trick, so there's a final software pass that only
needs to look at the cases where there's any doubt.

Jeroen
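(For illustration: a rough CUDA rendition of the intermediate filter
Jeroen describes, with a plain bounding-box overlap test standing in
for GL's collision detection. It is deliberately conservative, padding
each box by eps so that doubtful cases fall through to the exact
software pass. All names are hypothetical; this is not from the Oracle
plugin or from PostgreSQL.)

    // bbox_filter.cu -- hypothetical sketch of a conservative overlap
    // filter for a spatial join: one thread per R-tree candidate.
    #include <cuda_runtime.h>

    struct BBox { float xmin, ymin, xmax, ymax; };

    // Pads each candidate box by eps so the test can only err toward
    // false positives; survivors go on to the exact geometry check.
    __global__ void overlap_filter(const BBox *cand, int n, BBox query,
                                   float eps, int *keep)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n)
            return;
        BBox b = cand[i];
        bool hit = b.xmin - eps <= query.xmax &&
                   b.xmax + eps >= query.xmin &&
                   b.ymin - eps <= query.ymax &&
                   b.ymax + eps >= query.ymin;
        keep[i] = hit ? 1 : 0;
    }

The shape matches Jeroen's description: the R-tree narrows the
candidate set, the GPU cheaply rejects the bulk of non-overlaps in
parallel, and only the survivors reach the precise (and expensive)
per-object test in software.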