André Volpato wrote:
> Mike Ginsburg wrote:
>> Hello,
>> I am working on a personnel registry that has upwards of 50,000
>> registrants, and I am currently building an export module that will
>> create a CSV from multiple tables. I have managed to keep the
>> (PHP) script under the memory limit
> okay... some info needed here.
> 1. memory on the DB server
> 2. memory_limit on php.ini
The PHP memory_limit is 16M. We're running multiple installations on a
single webserver, so memory is a concern.
The DB server is separate from the webserver.
>> when creating and inserting the CSV into the database. The problem
>> comes when I try to query for the data and export it. The memory
>> limit is a major concern, but even the query for a single row
>> returns a result set too large, and PHP fails.
> a single row is enough to crash PHP ?
Well, the "data" field in the table (a text column) contains 50K lines;
the full export is over 30 MB in size.
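One way to avoid pulling the whole 30 MB value into PHP at once is to let the database slice it: fetch the text column in fixed-size pieces with SQL substr() and write each piece straight to the output. A minimal sketch of the idea (in Python with SQLite as a stand-in for the actual PHP/DB setup; the table and column names "export"/"data" are hypothetical):

```python
# Fetch a large text column in fixed-size chunks via SQL substr(), so the
# client never holds the whole value in memory at once.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE export (id INTEGER PRIMARY KEY, data TEXT)")
# Stand-in payload: 100,000 characters (the real field is ~30 MB).
conn.execute("INSERT INTO export (id, data) VALUES (1, ?)", ("x" * 100_000,))

CHUNK = 8192
offset = 1          # SQL substr() is 1-based
out_len = 0
while True:
    (piece,) = conn.execute(
        "SELECT substr(data, ?, ?) FROM export WHERE id = 1",
        (offset, CHUNK),
    ).fetchone()
    if not piece:   # past the end of the value
        break
    out_len += len(piece)   # in the real script: echo/fwrite the piece
    offset += CHUNK

print(out_len)
```

Each round trip only transfers one chunk, so peak client memory stays near CHUNK bytes rather than the full field size, at the cost of extra queries.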
>
>>
>> I've thought about storing the data in multiple rows and then
>> querying one-by-one and outputting, but was hoping there was a better
>> way.
> if you can't raise memory_limit, I think it's the only way.
I was afraid that would be the answer.
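For what it's worth, the multi-row approach can keep peak memory near one chunk on both the insert and the export side. A sketch of that pattern (again in Python/SQLite as a stand-in for PHP; the "export_parts" table and chunk size are made up for illustration):

```python
# Store a large blob as many small rows, then stream them out one at a
# time, so only one chunk is resident in memory during the export.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE export_parts (seq INTEGER PRIMARY KEY, part TEXT)")

blob = "line\n" * 20_000          # ~100 KB stand-in for the 30 MB export
CHUNK = 4096
for i, start in enumerate(range(0, len(blob), CHUNK)):
    conn.execute("INSERT INTO export_parts VALUES (?, ?)",
                 (i, blob[start:start + CHUNK]))

# Export: iterate row by row; the real script would fwrite each part to
# php://output instead of just counting bytes.
total = 0
for (part,) in conn.execute("SELECT part FROM export_parts ORDER BY seq"):
    total += len(part)            # only one chunk held at a time

print(total == len(blob))
```

The ORDER BY on the sequence column is what guarantees the chunks reassemble in the original order.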
>
> Regards,
> ACV
Mike Ginsburg
Collaborative Fusion, Inc.
mginsburg@collaborativefusion.com
412-422-3463 x4015