Re: JSON for PG 9.2 - Mailing list pgsql-hackers

From Merlin Moncure
Subject Re: JSON for PG 9.2
Msg-id CAHyXU0yoH=6xj58CRN2_TGkaP-fBiYksjvDaPhnESCNz2JdQkA@mail.gmail.com
In response to Re: JSON for PG 9.2  (Hannu Krosing <hannu@2ndQuadrant.com>)
List pgsql-hackers
On Mon, Apr 16, 2012 at 11:19 AM, Hannu Krosing <hannu@2ndquadrant.com> wrote:
> If doing something in 9.3 then what I would like is some way to express
> multiple queries. Basically a variant of
>
> query_to_json(query text[])
>
> where queries would be evaluated in order and then all the results
> aggregated into one json object.

I personally don't like variants of to_json that push the query in as
text. They defeat parameterization and have other issues.  Another
point for client side processing is the new row level processing in
libpq, so I'd argue that if the result is big enough to warrant
worrying about buffering (and it'd have to be a mighty big json doc),
the best bet is to extract it as rows.  I'm playing around with
node.js for the json serving and the sending code looks like this:
  var first = true;
  query.on('row', function(row) {
    if (first) {
      first = false;
      response.write('[');
    } else {
      response.write(',');
    }
    response.write(row.jsondata);
  });
  query.on('end', function() {
    response.write(']');
    response.end();
  });

-- not too bad
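
For what it's worth, the framing logic can be pulled out into a tiny standalone helper. This is just a sketch with no pg dependency, assuming each row arrives as a pre-serialized JSON string the way row.jsondata does; the helper name is made up. It also covers the zero-row case, which the snippet above would emit as a bare ']':

```javascript
// Sketch only: 'makeJsonArrayStream' is a hypothetical helper, not part of
// node-postgres. 'write' is any sink, e.g. response.write.
function makeJsonArrayStream(write) {
  var first = true;
  return {
    row: function(jsondata) {
      write(first ? '[' : ','); // open the array on the first row, comma after
      first = false;
      write(jsondata);          // row is already serialized JSON text
    },
    end: function() {
      write(first ? '[]' : ']'); // an empty result still yields valid JSON
    }
  };
}
```

Wiring it up would then be query.on('row', function(row) { s.row(row.jsondata); }) and query.on('end', s.end) for some s = makeJsonArrayStream(function(c) { response.write(c); }).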

merlin

