Tips/advice for implementing integrated RESTful HTTP API - Mailing list pgsql-hackers

From Dobes Vandermeer
Subject Tips/advice for implementing integrated RESTful HTTP API
Date
Msg-id CADbG_jYNmkHN48ZX+wuwGpbr6xixAghh9FgTU5RDk_278UhDBw@mail.gmail.com
Responses Re: Tips/advice for implementing integrated RESTful HTTP API
Re: Tips/advice for implementing integrated RESTful HTTP API
List pgsql-hackers

A while back I was working on a little proposal to create a RESTful HTTP front-end for PostgreSQL and recently I had the inspiration to work on this.  So I successfully created a background worker for PostgreSQL 9.3 that can use the SPI to list off the databases in a JSON response.
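The response-building step described above (query rows in, JSON document out) can be sketched like this; it's a minimal illustration in Python, with a hard-coded row list standing in for the live SPI call, and the function name is mine:

```python
# Turn the rows of "SELECT datname FROM pg_database" into a JSON body.
# The rows argument is a list of 1-tuples, the shape a query result
# would yield; in the real background worker this comes from SPI.
import json

def databases_to_json(rows):
    """rows: list of (datname,) tuples."""
    return json.dumps({"databases": [name for (name,) in rows]})

# databases_to_json([("postgres",), ("template1",)])
# → '{"databases": ["postgres", "template1"]}'
```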

Now I'm getting into murkier waters, and I'm wondering if I can get some helpful tips to guide my R&D here.

1. Connecting to multiple databases

The background workers can apparently only connect to a single database at a time, but I want to expose all the databases via the API. 

I think I could use libpq to connect to PostgreSQL on localhost, but this might have weird side-effects in terms of authentication, PID use, and the like.

I could probably manage a pool of dynamic workers (as of 9.4), one per user/database combination or something along those lines.  Even one per request?  Is there some kind of IPC system in place to help shuttle the requests and responses between dynamic workers?  Or do I need to come up with my own?
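The "pool of dynamic workers, one per user/database combination" idea, with the request/response shuttling made explicit, might look like the sketch below. Python threads and queues stand in for what would really be C dynamic background workers (9.4+) communicating over shared-memory message queues; all names here are illustrative, and `run_query_stub` is a placeholder rather than a real SPI call:

```python
# Sketch: one long-lived worker per database, requests and responses
# shuttled over queues.  Threads/queues stand in for dynamic background
# workers and shared-memory IPC; run_query_stub is a placeholder.
import queue
import threading

def run_query_stub(database, sql):
    # Placeholder for "connect to <database> and run <sql>".
    return f"result of {sql!r} on {database!r}"

def worker_main(database, requests):
    # One worker per database: pull (sql, reply_queue) items forever.
    while True:
        item = requests.get()
        if item is None:                 # shutdown sentinel
            break
        sql, reply_queue = item
        reply_queue.put(run_query_stub(database, sql))

class WorkerPool:
    """One worker per database, created on first use."""
    def __init__(self):
        self.inboxes = {}                # database -> request queue

    def dispatch(self, database, sql):
        if database not in self.inboxes:
            inbox = queue.Queue()
            threading.Thread(target=worker_main,
                             args=(database, inbox), daemon=True).start()
            self.inboxes[database] = inbox
        reply_queue = queue.Queue()      # per-request response channel
        self.inboxes[database].put((sql, reply_queue))
        return reply_queue.get()

    def shutdown(self):
        for inbox in self.inboxes.values():
            inbox.put(None)
```

The key could just as easily be a (user, database) pair, and the per-request reply queue is what a shared-memory response queue would be doing in the C version.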

It seems like PostgreSQL itself has a way to shuttle requests out to workers; is it possible to tap into that system instead? Basically, is there some way to send requests from the background worker to a PostgreSQL backend?

Or perhaps I shouldn't do this as a worker but rather modify PostgreSQL itself and do it in a more integrated/destructive manner?

2. Authentication

I was trying to use the function md5_crypt_verify to authenticate the user with their password; I believe I am providing the right password, but it's not being accepted.

Any tips on authenticating users in a background worker? Where should I be looking for examples?
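For reference, the (pre-SCRAM) md5 scheme PostgreSQL stores is the string "md5" followed by hex(md5(password || username)). A possibility worth ruling out is comparing an already-hashed value as if it were cleartext, or vice versa. A small Python sketch of the stored format (the helper names are mine, and this deliberately ignores the salted challenge step of the wire protocol):

```python
# PostgreSQL md5 password scheme: "md5" + hex(md5(password + username)).
# This sketches only the stored/verified format, not the salted
# challenge exchange used on the wire.
import hashlib

def pg_md5_encrypt(password: str, username: str) -> str:
    """Return the md5-scheme value PostgreSQL stores for a role."""
    return "md5" + hashlib.md5((password + username).encode()).hexdigest()

def md5_matches(candidate: str, username: str, stored: str) -> bool:
    """Check a cleartext or already-hashed candidate against stored."""
    if candidate.startswith("md5"):
        return candidate == stored       # already hashed by the client
    return pg_md5_encrypt(candidate, username) == stored
```

Note that the password is hashed together with the username, so verifying against the wrong role name will also make a correct password appear to fail.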

3. Parallelism

The regular PostgreSQL server can run many queries in parallel, but it seems that with SPI I can only run one query at a time, as it's not an asynchronous API.

This seems related to the multiple-databases issue: either I could use libpq to translate/forward requests onto PostgreSQL's own worker system, or set up my own little worker pool to run the requests in parallel, with a way to send the request/response data to and from those workers.
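The fan-out half of that could be sketched as follows: each request is handed to a worker that owns its own connection, so several queries run concurrently even though any single connection (or SPI session) is serial. The stub stands in for "run this query on a dedicated connection"; in C, the analogous tool would be libpq's asynchronous interface (PQsendQuery/PQgetResult) or one backend per worker:

```python
# Sketch: fan many (id, sql) requests out to a fixed-size pool, one
# logical connection per worker; run_on_own_connection is a stub.
from concurrent.futures import ThreadPoolExecutor

def run_on_own_connection(request_id, sql):
    # Placeholder for per-connection query execution.
    return (request_id, f"rows for {sql!r}")

def handle_requests(requests, max_workers=4):
    """Run (id, sql) requests concurrently; return {id: result}."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(run_on_own_connection, rid, sql)
                   for rid, sql in requests]
        return dict(f.result() for f in futures)
```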



Any help, sage advice, tips, and suggestions on how to move forward in these areas would be much appreciated!

Regards,

Dobes

