Re: cursor interface to libpq - Mailing list pgsql-interfaces

From: Tom Lane
Subject: Re: cursor interface to libpq
Date:
Msg-id: 29420.969640317@sss.pgh.pa.us
In response to: Re: cursor interface to libpq (Thomas Lockhart <lockhart@alumni.caltech.edu>)
List: pgsql-interfaces
Thomas Lockhart <lockhart@alumni.caltech.edu> writes:
> afaik this should all work. You can run pg_dump and pipe the output to a
> tape drive or to gzip. You *know* that a real backup will take something
> like the size of the database (maybe a factor of two or so less) since
> the data has to go somewhere.

pg_dump in default mode (i.e., dump data as COPY commands) doesn't have a
problem with huge tables, because the COPY data is just dumped out in a
streaming fashion.
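For example, a default-mode dump can be piped straight into gzip, along the
lines Thomas suggested (the database name "mydb" below is just a placeholder):

    pg_dump mydb | gzip > mydb.dump.gz

Since the table data is written out as it is read, pg_dump's memory usage
stays small no matter how big the tables are.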

If you insist on using the "dump data as insert commands" option, then
huge tables do cause a memory problem in pg_dump; on the other hand, you
are going to get pretty tired of waiting for such a script to reload,
too.  I recommend just using the default behavior ...
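Just for comparison, an insert-style dump would look something like this
(the exact switch depends on the pg_dump version: it has been spelled -d
or -D in older releases and --inserts in later ones, so check
pg_dump --help):

    pg_dump -d mydb > mydb_inserts.sql

Reloading that script means executing one INSERT per row, which is far
slower than replaying a COPY.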
        regards, tom lane

