If your backup file still hits that limit even after being compressed, you may want to use the split command to break it into chunks of a size that suits your OS.
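For example, a minimal sketch of the split-and-reassemble round trip (the file names here are illustrative; on a real dump you would use a chunk size like -b 1000m rather than 100k):

```shell
# Stand-in for a real dump file, just to demonstrate the round trip.
seq 1 100000 > dump.sql
# Split into fixed-size chunks named dump.part.aa, dump.part.ab, ...
split -b 100k dump.sql dump.part.
# Reassemble: the shell glob expands in sorted order, matching split's naming.
cat dump.part.* > rejoined.sql
# Verify the reassembled file is byte-identical to the original.
cmp dump.sql rejoined.sql && echo "round-trip OK"
```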
----
Husam
Jyry Kuukkanen wrote:
On Thu, 4 May 2006, Rodrigo Sakai wrote:
Hi, I'm trying to back up a database that is larger than 4 GB, but I get an error when the file size reaches 1.2 GB! I think it's an operating system problem (Linux)! So, I want to know if there is some solution for backing up my database.
The command that I used was:
pg_dump -U postgres -d dbdeveloper -a -v -D -f 'backup.sql'
The operating system is Linux with an ext filesystem, and the version of Postgres is 7.4!!
You can try:
pg_dump -U postgres -d dbdeveloper -a -v -D | bzip2 -c > backup.sql.bz2
bzip2 compresses better than gzip.
Restoration:
bzcat backup.sql.bz2 | psql -U postgres -d dbdeveloper
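You can also combine the two suggestions and split inside the pipeline, so no single file on disk ever reaches the filesystem limit. A sketch, with the chunk size and file names as placeholders; the demo below substitutes seq for pg_dump so it runs standalone:

```shell
# Real usage would be:
#   pg_dump -U postgres -d dbdeveloper -a -v -D | bzip2 -c | split -b 1000m - backup.bz2.
# and to restore, concatenate the chunks back into one bzip2 stream:
#   cat backup.bz2.* | bunzip2 | psql -U postgres -d dbdeveloper
# Demonstration of the same pattern with a stand-in data source:
seq 1 50000 | bzip2 -c | split -b 20k - part.
cat part.* | bunzip2 | tail -n 1   # prints 50000
```

Because cat reassembles the chunks into the exact original bzip2 stream, bunzip2 decompresses it as if it had never been split.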
Cheers,
I use the same approach, but with bzip2's '--best' flag for maximum compression (or is that the default?). Also, for the OP Rodrigo Sakai: have you tried Solaris 10 x86? It is a rock-solid OS from Sun that I have used for 12 years, 6 of them with Postgres, and the two have coexisted without incident. It can be downloaded for free at the Sun site.