Hi!
I use two scheduled cron jobs: the first vacuums the databases and the second
dumps them all. Then a simple shell script tarballs the dump and moves it to
another machine with rsync. I am not sure whether this handles blobs, though.
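Something along these lines in the postgres user's crontab does the trick;
the times, paths and script name below are only an example, so adjust them
to your own setup:

# vacuum first, then dump all databases to the file the script picks up
30 2 * * * vacuumdb -a -q
0 3 * * * pg_dumpall > /usr/share/pgdumps/all
# a bit later, run the backup script below
30 3 * * * /work/backup/backup.sh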
BR,
aarni
#!/bin/bash
#
# Backup script: tar up the nightly dump and rsync it to the remote machine
#
cd /work/backup || exit 1        # directory where the tarball is built
tar czf pg_daily_dumps.gz /usr/share/pgdumps/all    # compress the pg_dumpall output
export RSYNC_PASSWORD="xxxxxxxxxxxxxxxxxxxxxxxxxx"
rsync -azR . psw8326@212.94.64.10::my.server.com    # push it to the remote rsync module
#eof
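P.S. About the blobs: if the plain pg_dumpall output turns out not to include
them, a per-database pg_dump in the custom format with the -b switch is meant
to include large objects. I have not tried this myself, so treat it only as a
sketch (mydb stands for your own database name):

pg_dump -Fc -b mydb > /usr/share/pgdumps/mydb.dump
# restore later with:
# pg_restore -d mydb /usr/share/pgdumps/mydb.dump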
On Sunday 23 June 2002 08:18 pm, you wrote:
> I am looking to set up a reliable and automatic way to back up my PostgreSQL
> database to a remote server.
>
> What I have tried is to use pg_dumpall to dump the contents of my databases
> and then have a remote machine back up the resulting files over SMB with our
> backup system. My problem is that pg_dumpall wants a password for the
> account I want to use to run the backup.
>
> Is there a better way to do this? If my understanding is correct, it would
> be a bad idea for me to just tarball the /pgsql folder while the database is
> running. I cannot readily take the database offline to run the backup, as it
> is a key part of running a web server and some network utilities.
>
> Does anyone have any suggestions on how to do this well? I also need to be
> able to back up the BLOBs that are in the database.
>
> Thanks in advance.
>
>
>
> --------------------------------------------------------------------------------
> Eric Naujock CCNA, CCDA, A+, Network +, I-Net +
> Abacus II
> 5610 Monroe St.
> Sylvania, Ohio 43560
> <http://www.abacusii.com>
> E-mail - naujocke@abacusii.com
> Phone - 419-885-0082 X 241
> Fax : 419-885-2717
> AOL IM: erlic