Processing database query-results piecemeal - Mailing list pgsql-hackers

From Stephen R. van den Berg
Subject Processing database query-results piecemeal
Msg-id 20080630111742.GA19746@cuci.nl
Responses Re: Processing database query-results piecemeal  (Abhijit Menon-Sen <ams@oryx.com>)
List pgsql-hackers
I'm looking for the leanest, lowest-overhead way to interface with the
DB when processing larger amounts of binary data.

For simplicity, I want to avoid using the Large-Object facility.

It seems that the most efficient way to communicate with the DB would
be through PQexecParams(), which sidesteps the bytea-encoding issue
entirely.
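A minimal sketch of the kind of binary-parameter call I have in mind
(connection string, table, and column names are made up):

```c
#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("dbname=test");
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connect: %s", PQerrorMessage(conn));
        return 1;
    }

    /* Raw binary payload -- no bytea escaping needed in binary format,
       embedded NUL bytes are fine because the length is passed explicitly. */
    const char blob[] = { 0x00, 0x01, 0x02, 0x03 };
    const char *values[1]  = { blob };
    int         lengths[1] = { sizeof blob };
    int         formats[1] = { 1 };   /* 1 = binary parameter */

    /* $1 refers to the first entry in the parameter arrays */
    PGresult *res = PQexecParams(conn,
                                 "INSERT INTO blobs (data) VALUES ($1)",
                                 1,     /* nParams */
                                 NULL,  /* let the server infer the type
                                           from the target column */
                                 values, lengths, formats,
                                 1);    /* request binary results too */
    if (PQresultStatus(res) != PGRES_COMMAND_OK)
        fprintf(stderr, "insert: %s", PQerrorMessage(conn));

    PQclear(res);
    PQfinish(conn);
    return 0;
}
```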

However, two questions spring to mind:

- The docs say that you can use $1, $2, etc. to reference parameters. What happens if you have more than nine
parameters? Does it become $10, ${10}, or $(10), or is it simply not possible to reference more than nine
parameters this way?
 

- Say the SELECT returns 1000 rows of 100 MB each: is there a way to keep PQexecParams() from allocating
1000 * 100 MB = 100 GB at once, and somehow extract the rows in smaller chunks? (Incidentally, MySQL has
such a facility.) I.e., we call libpq several times and get a few rows at a time, read from the DB stream
as needed.
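The only way I currently see to get that kind of chunked retrieval through libpq is an explicit cursor,
fetching a bounded number of rows per round trip, so that only one batch is ever materialized client-side
at a time (cursor, table, and column names here are made up):

```c
#include <stdio.h>
#include <libpq-fe.h>

/* Fetch query results in batches via an explicit cursor so that at most
 * batch_size rows are held client-side at any one time. */
static void fetch_in_chunks(PGconn *conn, int batch_size)
{
    /* DECLARE CURSOR must run inside a transaction block */
    PQclear(PQexec(conn, "BEGIN"));
    PQclear(PQexec(conn,
        "DECLARE c NO SCROLL CURSOR FOR SELECT data FROM blobs"));

    char fetch[64];
    snprintf(fetch, sizeof fetch, "FETCH %d FROM c", batch_size);

    for (;;) {
        PGresult *res = PQexec(conn, fetch);
        if (PQresultStatus(res) != PGRES_TUPLES_OK) {
            fprintf(stderr, "fetch: %s", PQerrorMessage(conn));
            PQclear(res);
            break;
        }
        int n = PQntuples(res);
        for (int i = 0; i < n; i++) {
            /* process PQgetvalue(res, i, 0) with PQgetlength(res, i, 0) */
        }
        PQclear(res);
        if (n < batch_size)   /* short batch: cursor exhausted */
            break;
    }

    PQclear(PQexec(conn, "CLOSE c"));
    PQclear(PQexec(conn, "COMMIT"));
}
```

That still costs one round trip per batch, though, which is exactly the overhead I was hoping to avoid.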
 
-- 
Sincerely,          Stephen R. van den Berg.

