Re: Large Result and Memory Limit - Mailing list pgsql-general

From Scott Marlowe
Subject Re: Large Result and Memory Limit
Date
Msg-id dcc563d10710041347y4a38c943g58dea986d61917e5@mail.gmail.com
In response to Re: Large Result and Memory Limit  (Mike Ginsburg <mginsburg@collaborativefusion.com>)
List pgsql-general
On 10/4/07, Mike Ginsburg <mginsburg@collaborativefusion.com> wrote:

>  This is for the export only.  Since it is an export of ~50,000 registrants,
> it takes some time to process.  We also have load-balanced web servers, so
> unless I want to create identical processes on all web servers, or write some
> crazy script to scp it across the board, storing it as a text file is not an
> option.  I realize that my way of doing it is flawed, which is the reason I
> came here for advice.  The CSV contains data from approximately 15 tables,
> several of which are many-to-ones, making the joins a little tricky.  My
> thought was to do all of the processing in the background, store the results
> in the DB, and allow the requester to download it at their convenience.
>
>  Would it be a good idea to create a temporary table that stored all of the
> export data in it, broken out by rows and columns, and when download time
> comes, query from there?

Yeah, I tend to think that would be better.  Then you could use a
cursor to retrieve the rows and serve them one line at a time, and not
have to worry about overloading your PHP server.
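
Something like this is a minimal sketch of that approach in PHP with
the pg_* functions.  The staging table and its columns (export_results,
export_id, line_no, csv_line), and the connection string, are made up
for illustration, not from the thread: the background job is assumed
to have already written the finished CSV lines into that table, and
the download script just walks them with a cursor so only one batch
of rows is ever in PHP's memory.

<?php
// Hypothetical staging table, populated by the background job:
//   CREATE TABLE export_results (
//       export_id integer,
//       line_no   integer,
//       csv_line  text,
//       PRIMARY KEY (export_id, line_no)
//   );

// Cast to int so interpolating it into the SQL below is safe.
$exportId = (int) $_GET['export_id'];

// Connection parameters are placeholders.
$db = pg_connect('host=localhost dbname=app user=app');

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

// Cursors only exist inside a transaction.
pg_query($db, 'BEGIN');
pg_query($db, "DECLARE export_cur CURSOR FOR
                 SELECT csv_line FROM export_results
                  WHERE export_id = $exportId
                  ORDER BY line_no");

// Fetch in modest batches so memory use stays flat no matter
// how many rows the export contains.
while (true) {
    $res = pg_query($db, 'FETCH 1000 FROM export_cur');
    if (pg_num_rows($res) === 0) {
        break;
    }
    while ($row = pg_fetch_row($res)) {
        echo $row[0], "\n";
    }
    flush();
}

pg_query($db, 'CLOSE export_cur');
pg_query($db, 'COMMIT');
pg_close($db);
?>

Since each FETCH pulls only 1000 rows, the script stays well under
PHP's memory_limit however large the export gets, which is the whole
point versus building the entire file in one string.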
