"Creager, Robert S" <CreagRS@LOUISVILLE.STORTEK.COM> writes:
> What I've done is copy the original table into a file, and am now attempting
> to copy from stdin, using Perl/Pg to break out the data into the 6 tables.
> I'm working with 2.5 million records, btw. I've narrowed the problem down to
> copying into any one of the 5 referring tables (COPY u FROM stdin).
> The backend process which handles the db connection decides that it needs a
> whole lot of memory, although in a nice controlled manner. The backend
> starts with using 6.5Mb, and at 25000 records copied, it's taken 10Mb and
> has slowed down substantially. Needless to say, this COPY will not finish
> before running out of memory (estimated 300Mb).
Ah, another memory leak that's as yet unplugged. Can you gin up a
self-contained example that reproduces the leak? Should be fixable
if we can figure out exactly where the leak is occurring.
regards, tom lane
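A self-contained reproduction along the lines Tom asks for could be generated with a small script. This is only a sketch under assumed names: the original schema was never posted, so the parent table `t`, referring table `u`, and their columns here are illustrative, and the row count is scaled down from the reported 2.5 million.

```python
# Sketch: emit a psql script that creates a parent table, a referring
# (foreign-key) table, fills the parent, and bulk-loads the referring
# table via COPY ... FROM stdin -- the operation that showed the leak.
# Table/column names are hypothetical; the real schema wasn't posted.

def make_repro_script(nrows):
    """Return a psql script reproducing the COPY-into-referring-table case."""
    lines = [
        "CREATE TABLE t (id integer PRIMARY KEY);",
        "CREATE TABLE u (id integer REFERENCES t(id), val text);",
        # Populate the parent so every FK check in the COPY succeeds.
        "INSERT INTO t SELECT generate_series(1, %d);" % nrows,
        "COPY u FROM stdin;",
    ]
    # One tab-separated data row per parent key; the original report
    # used ~2.5 million rows, at which point the backend grew past 10Mb.
    lines += ["%d\t%s" % (i, "x" * 20) for i in range(1, nrows + 1)]
    lines.append("\\.")  # end-of-COPY marker
    return "\n".join(lines)

if __name__ == "__main__":
    # Pipe this into psql and watch the backend's memory while it runs.
    print(make_repro_script(5))
```

Feeding the output to `psql` while watching the backend's resident size should show whether per-row memory grows during the COPY into the referring table.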