--- btober@seaworthysys.com wrote:
>
> > I have cron execute a Python script as the
> > database administrator to vacuum and back up
> > all databases. Rather than dump all databases
> > at once, however, the script performs a
> > 'psql -l' to get a current list of databases.
> > Each database is dumped and piped into gzip
> > for compression into its own backup file.
> >
> > I should also mention that the script renames
> > all previous backup files, all ending in *.gz,
> > to *.gz.old, so that they survive the current
> > pg_dump. Of course, you could change the
> > script to put the date in the file name so as
> > to keep unlimited backup versions.
>
> FWIW, another good way to handle the last
> paragraph would be to use logrotate. It would
> handle renaming files as *.1, *.2, and so on;
> you could specify the number of days you wanted
> it to retain, and you wouldn't have to go in
> periodically and delete ancient backups to keep
> your drive from filling up.
>
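
In case it helps anyone on the list, the core logic is
roughly the following (a simplified sketch, not the
script verbatim: the backup directory, the vacuumdb
step, and the template-database exclusions are my
assumptions here):

#!/usr/bin/env python3
"""Nightly vacuum/backup sketch -- simplified; paths and
options are site-specific assumptions."""

import os
import subprocess

BACKUP_DIR = "/var/backups/pgsql"   # hypothetical location

def list_databases():
    # 'psql -l' with -t/-A gives one unaligned row per
    # database, fields separated by '|'; the first field
    # is the database name.
    out = subprocess.run(
        ["psql", "-ltA"], check=True, capture_output=True, text=True
    ).stdout
    names = [line.split("|")[0] for line in out.splitlines() if line]
    # Skipping the templates is an assumption (template0
    # refuses connections anyway).
    return [n for n in names if n not in ("template0", "template1")]

def main():
    # Keep one previous generation: rename *.gz to *.gz.old first.
    for fname in os.listdir(BACKUP_DIR):
        if fname.endswith(".gz"):
            path = os.path.join(BACKUP_DIR, fname)
            os.replace(path, path + ".old")

    for db in list_databases():
        subprocess.run(["vacuumdb", db], check=True)
        target = os.path.join(BACKUP_DIR, db + ".dump.gz")
        # pg_dump <db> | gzip -c > <db>.dump.gz
        with open(target, "wb") as out_file:
            dump = subprocess.Popen(["pg_dump", db],
                                    stdout=subprocess.PIPE)
            subprocess.run(["gzip", "-c"], stdin=dump.stdout,
                           stdout=out_file, check=True)
            dump.stdout.close()
            dump.wait()

if __name__ == "__main__":
    main()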
Thanks, I think I'll modify the script to manage
a declared number of backups as described above.
Logrotate sounds like FreeBSD's newsyslog. The
reason I don't use it is that I would have to
configure each database's backup file separately,
whereas the Python script adds new databases and
backup files to the process automatically. This
is one of those "if I get hit by a bus" features.
As my databases do not have IS support, my boss
insists on contingency planning.
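
A minimal sketch of that rotation, mimicking
logrotate's numbered scheme (KEEP and the rotate
helper are hypothetical names, not yet in my script):

import os

KEEP = 7  # declared number of generations to retain (assumption)

def rotate(path):
    """Shift path.1 -> path.2, ..., then path -> path.1,
    dropping the oldest copy -- logrotate's numbered scheme
    for a single file."""
    oldest = f"{path}.{KEEP}"
    if os.path.exists(oldest):
        os.remove(oldest)
    for i in range(KEEP - 1, 0, -1):
        src = f"{path}.{i}"
        if os.path.exists(src):
            os.replace(src, f"{path}.{i + 1}")
    if os.path.exists(path):
        os.replace(path, path + ".1")

The script would call rotate() on each backup file just
before writing the new dump, replacing the current
*.gz -> *.gz.old renaming.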
Best regards,
Andrew Gould