Pg_dump very large database

From: "Nikolay Mihaylov"
Date:
Hi all.
I have a database with very large tables - about 2+ GB per table.

When I use the pg_dump tool, it consumes all available memory, and then
Linux crashes (or the kernel kills most processes, including pg_dump).

For this reason I wrote a small PHP script which dumps the data using
'fetch next' on a cursor (PHP is clumsy for this, but it works well for me).
I tried to patch pg_dump instead, but I can't understand most of the code
(I have never worked with PG blobs from C).
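The cursor trick above can be sketched as a short SQL sequence. This is not the attached PHP code; the cursor name "dump_cur" and the batch size are illustrative (the original script fetches one row at a time with FETCH NEXT):

```python
# A minimal sketch of the SQL a batched, cursor-based dump runs, so each
# FETCH pulls only `batch` rows into client memory instead of the whole table.
def cursor_dump_sql(table, batch=1000):
    """Yield the statements for dumping `table` in fixed-size batches."""
    yield "BEGIN"  # in PostgreSQL, cursors only live inside a transaction
    yield f"DECLARE dump_cur CURSOR FOR SELECT * FROM {table}"
    # In a real dump loop, this FETCH is repeated until it returns zero rows.
    yield f"FETCH {batch} FROM dump_cur"
    yield "CLOSE dump_cur"
    yield "COMMIT"

for stmt in cursor_dump_sql("big_table"):
    print(stmt)
```

Because only one batch of rows is ever held by the client, memory use stays flat no matter how large the table is.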

I'm attaching the files I use, to share them with you all.

Nikolay.

P.S.
The shell scripts are wrappers that call the PHP scripts:
dump1.sh - creates the list of shell commands that need to be executed
to produce the backup.
dump.sh - backs up a single table.
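As a rough sketch of how the two scripts could fit together (the table names here are placeholders, and this is a guess at the structure, not the attached code):

```shell
# Hypothetical stand-in for dump1.sh: turn a table list into one dump.sh
# invocation per table. A real version would likely get the names from:
#   psql -At -c "SELECT tablename FROM pg_tables WHERE schemaname = 'public'"
printf '%s\n' big_table1 big_table2 |
while read -r tbl; do
    echo "./dump.sh $tbl > $tbl.dump"
done
```

Dumping one table per command keeps each run's memory footprint small and makes it easy to resume after a failure.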

-----------------------------------------------------------
The Reboots are for hardware upgrades,
Found more here: http://www.nmmm.nu
Nikolay Mihaylov nmmm@nmmm.nu


Re: Pg_dump very large database

From: Tom Lane
Date:

"Nikolay Mihaylov" <nmmm@nmmm.nu> writes:
> I have a database with very large tables - about 2+ GB per table.

> When I use the pg_dump tool, it consumes all available memory, and then
> Linux crashes (or the kernel kills most processes, including pg_dump).

This is fixed in 7.2's pg_dump, which as far as I know can be used
safely with 7.1 installations, should you not want to update the whole
installation just yet.

            regards, tom lane