Re: Copying large tables with DBLink - Mailing list pgsql-admin

From Michael Fuhr
Subject Re: Copying large tables with DBLink
Date
Msg-id 20050324195059.GA13146@winnie.fuhr.org
In response to Copying large tables with DBLink  ("Chris Hoover" <revoohc@sermonaudio.com>)
List pgsql-admin
On Thu, Mar 24, 2005 at 01:59:44PM -0500, Chris Hoover wrote:
>
> Has anyone had problems with memory exhaustion and dblink?  We were
> trying to use dblink to convert our databases to our new layout, and had
> our test server lock up several times when trying to copy a table that
> was significantly larger than our memory and swap.

Hmmm...doesn't dblink use libpq, and doesn't libpq fetch the entire
result set before doing anything with it?  If so, then that could
explain the memory exhaustion.

> Basically we were doing an insert into <table> select * from
> dblink('dbname=olddb','select * from large_table') as t_large_table(table
> column listing);
>
> Does anyone know of a way around this?

How about using pg_dump to dump the original table and restore it
into the new table?  If you just want the table's contents without
the table definition then you could use the -a (--data-only) option.
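For example, something like this might do it (untested; olddb, newdb, and
large_table are just placeholders for your actual database and table names):

    pg_dump -a -t large_table olddb | psql newdb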

Another possibility would be to write a function that uses a cursor:
dblink_open() followed by a loop that calls dblink_fetch() until it
returns no more rows, something like the sketch below.  I think that
wouldn't have a memory exhaustion problem (but test it to be sure).
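Untested sketch of the cursor approach; the connection string, target
table, and column list are placeholders you'd replace with your own
(and it assumes 8.0-style dollar quoting):

    CREATE OR REPLACE FUNCTION copy_large_table() RETURNS void AS $$
    DECLARE
        n integer;
    BEGIN
        PERFORM dblink_connect('dbname=olddb');
        PERFORM dblink_open('cur', 'SELECT * FROM large_table');
        LOOP
            -- fetch and insert 1000 rows at a time instead of the whole table
            INSERT INTO new_large_table
                SELECT * FROM dblink_fetch('cur', 1000)
                    AS t(id integer, data text);  -- your column list here
            GET DIAGNOSTICS n = ROW_COUNT;
            EXIT WHEN n = 0;
        END LOOP;
        PERFORM dblink_close('cur');
        PERFORM dblink_disconnect();
    END;
    $$ LANGUAGE plpgsql;

Then SELECT copy_large_table(); should move the data in 1000-row batches,
so libpq only ever has to hold 1000 rows at a time.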

--
Michael Fuhr
http://www.fuhr.org/~mfuhr/
