Re: Copying large tables with DBLink - Mailing list pgsql-admin

From Joe Conway
Subject Re: Copying large tables with DBLink
Msg-id 42431326.2010908@joeconway.com
In response to Copying large tables with DBLink  ("Chris Hoover" <revoohc@sermonaudio.com>)
List pgsql-admin
Chris Hoover wrote:
> Has anyone had problems with memory exhaustion and dblink?  We were
> trying to use dblink to convert our databases to our new layout, and had
> our test server lock up several times when trying to copy a table that
> was significantly larger than our memory and swap.
> Basically we were doing an insert into <table> select * from
> dblink('dbname=olddb','select * from large_table') as t_large_table(table
> column listing);
>
> Does anyone know of a way around this?


dblink just uses libpq, and libpq reads the entire result set into
memory. There is no direct way around that, as far as I know. You
could, however, open a cursor on the remote side and fetch/manipulate
the rows in more reasonably sized batches.
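
For example, using the cursor functions that ship with dblink
(dblink_connect, dblink_open, dblink_fetch, dblink_close,
dblink_disconnect), something along these lines should work. This is
just a sketch; the target table name (new_table) and the column list
(id, val) are made up and would need to match your actual schema:

  SELECT dblink_connect('conn', 'dbname=olddb');
  SELECT dblink_open('conn', 'cur', 'SELECT * FROM large_table');

  -- repeat until dblink_fetch returns zero rows
  INSERT INTO new_table
    SELECT * FROM dblink_fetch('conn', 'cur', 10000)
      AS t(id integer, val text);

  SELECT dblink_close('conn', 'cur');
  SELECT dblink_disconnect('conn');

Each call to dblink_fetch pulls at most 10000 rows across the
connection, so memory use stays bounded regardless of the table size.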

HTH,

Joe
