Alvaro Herrera wrote: <blockquote cite="mid:20071004200537.GB28896@alvh.no-ip.org" type="cite"><pre wrap="">Mike
Ginsburg wrote: </pre><blockquote type="cite"><pre wrap="">Hello, I am working on a personnel registry that has upwards
of 50,000
registrants. Currently I am working on an export module that will create a
CSV from multiple tables. I have managed to keep the script (PHP) under
the memory limit when creating and inserting the CSV into the database.
The problem comes when I try to query for the data and export it. Memory
limit is a major concern, but the query for one row returns a result set
too large and PHP fails. </pre></blockquote><pre wrap="">
One row? Wow, I didn't know PHP was that broken.
Try declaring a cursor and fetching a few rows at a time.</pre></blockquote> PHP is just respecting memory_limit when
retrieving data. <br /> In this case, a single row is about 30 MB, well over the 16 MB limit.<br /> I think cursors
wouldn't help anyway.<br /><br /> []'s,<br /> ACV<br /><br />