Hi all,
We're using pg_dump to back up our databases. The dump itself
appears to work fine. On smaller data sets (less than roughly 100 MB) the
restore also works, but on larger data sets the restore process
consistently fails.
Other facts that may be of interest:
* We're running Postgres 7.2.3 on a more-or-less stock Red Hat 7.3 platform.
* Backup is done with "pg_dump -c -U postgres", then gzip.
* Restore is via "cat <archive_file> | gunzip | psql" (the full command lines are sketched below).
The particular file I'm wrestling with at the moment is ~2.2 GB
unzipped. If I try to restore the unzipped file with pg_restore, the process
immediately fails with the following:
pg_restore: [archiver] could not open input file: File too large
When the data file is gzip'd, you can at least get the restore process
started with the following:
cat archive_file.gz | gunzip | psql dbname
The above command line starts OK, but eventually fails with:
server closed the connection unexpectedly
        This probably means the server terminated abnormally
        before or while processing the request.
connection to server was lost