Re: Large Result and Memory Limit - Mailing list pgsql-general

From: André Volpato
Subject: Re: Large Result and Memory Limit
Msg-id: 4705492B.7030807@ecomtecnologia.com.br
In response to: Re: Large Result and Memory Limit (Mike Ginsburg <mginsburg@collaborativefusion.com>)
List: pgsql-general
Mike Ginsburg wrote:
> André Volpato wrote:
>> Mike Ginsburg wrote:
>>> Hello,
>>>  I am working on a personnel registry that has upwards of 50,000
>>> registrants.  Currently I am working on an export module that will
>>> create a CSV from multiple tables.  I have managed to keep the
>>> script (PHP) under the memory limit
>> Okay... some info is needed here.
>> 1. memory on the DB server
>> 2. memory_limit on php.ini
> The PHP memory_limit is 16M.  We're running multiple installations on a
> single webserver, so memory is a concern.
> The DB server is separate from the webserver.
>>> when creating and inserting the CSV into the database.  The problem
>>> comes when I try to query for the data and export it.  The memory limit
>>> is a major concern, but the query for even one row returns a result set
>>> too large, and PHP fails.
>> a single row is enough to crash PHP?
> Well, the "data" field in the table (text) contains 50K lines.  It's
> over 30M in size for the full export.

>>
>>>
>>> I've thought about storing the data in multiple rows and then
>>> querying them one by one to output each, but I was hoping there was
>>> a better way.
>> if you can't raise memory_limit, I think it's the only way.
> I was afraid that would be the answer.


Well, you could try to retrieve the data in slices with substr(), say
10K lines in 5 queries, along the lines of the sketch below.
It's kinda ugly, but it should work in this case...
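
A minimal, untested sketch of what I mean, assuming a table called
"export" with a text column "data" and an integer key "id" (all
hypothetical names), slicing by character offset with PostgreSQL's
substr() and the pgsql functions on the PHP side:

<?php
// Fetch one big text column in fixed-size slices so PHP never holds
// more than one slice in memory.  Table/column names (export, data,
// id = 42) are placeholders for the real schema.
$conn = pg_connect('host=dbhost dbname=registry user=app');

$sliceSize = 1048576; // 1MB of text per query, well under memory_limit
$offset    = 1;       // substr() in PostgreSQL is 1-based

while (true) {
    $res = pg_query_params($conn,
        'SELECT substr(data, $1, $2) FROM export WHERE id = $3',
        array($offset, $sliceSize, 42));
    $slice = pg_fetch_result($res, 0, 0);
    pg_free_result($res);

    if ($slice === '' || $slice === null) {
        break; // past the end of the column
    }

    echo $slice; // stream the slice straight to the client
    flush();
    $offset += $sliceSize;
}

pg_close($conn);
?>

Each query moves only one slice across the wire, so peak PHP memory
stays around the slice size instead of the 30M of the full export.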

[]'s
ACV


