Hi
Here at work, the PostgreSQL databases are being backed up the following way:
There is a daily copy of nearly the whole hard disk (excluding /tmp, /proc, /dev and so on) of the machine the databases are on,
and besides this there is also a script that runs pg_dump on each of the databases on the server.
This daily copy of the disk is made with the postmaster active (without stopping the daemon), so the data
under /usr/local/pgsql/data would not be 100% consistent, I guess.
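For reference, the per-database dump script is roughly like this (a minimal sketch; the backup directory, file naming and use of template1 are my assumptions, not the actual script we run):

```shell
#!/bin/sh
# Sketch of a per-database pg_dump script. Paths are assumptions.
BACKUP_DIR=/var/backups/pgsql
STAMP=$(date +%Y%m%d)

# Ask the server for every non-template database name, one per line
# (psql -At prints bare, unaligned tuples).
for db in $(psql -At -c \
    "SELECT datname FROM pg_database WHERE NOT datistemplate;" template1)
do
    pg_dump "$db" > "$BACKUP_DIR/$db.$STAMP.dump"
done
```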
Supposing there was a failure and the whole thing needed to be restored, I think the recovery
procedure would be the following:
1) Copy data from the backup hd to a new hd
2) Once this was done, delete the postmaster.pid file and start the postmaster service
3) Drop all databases and recreate them from those pg_dump files
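As a shell sketch, the steps above would look something like this (assuming the dumps sit under /var/backups/pgsql and are named database.date.dump; those names are my assumptions):

```shell
#!/bin/sh
# Sketch of the recovery sequence described above. Paths are assumptions.
PGDATA=/usr/local/pgsql/data

# 1) Data has already been copied from the backup disk to the new disk.

# 2) Remove the stale lock file left behind by the old postmaster,
#    then start the daemon again.
rm -f "$PGDATA/postmaster.pid"
pg_ctl -D "$PGDATA" start

# 3) Drop and recreate each database from its pg_dump file.
for dump in /var/backups/pgsql/*.dump; do
    db=$(basename "$dump" | cut -d. -f1)   # strip ".date.dump" suffix
    dropdb "$db"
    createdb "$db"
    psql "$db" < "$dump"
done
```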
There are some questions I have about this backup routine:
If I recover data from that "inconsistent" backup disk, I know that the binaries (psql, pg_dump and so on)
will remain ok, but the data files may have inconsistencies. Would these inconsistencies still let the
postmaster start and work properly? That is, even with possibly inconsistent data present, would it start,
work normally, and keep the information about users and groups? I mention users and groups
because pg_dump does not include them in its dumps. I was thinking about using
"pg_dumpall -g" to generate this information (pg_dump itself has no -g option; the globals are dumped by pg_dumpall).
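One note here: the -g switch belongs to pg_dumpall, not pg_dump; it dumps only the global objects (users and groups) that per-database dumps skip. A sketch, with the output path being my assumption:

```shell
# Dump only global objects (users, groups) -- this is pg_dumpall's job,
# since pg_dump skips them. Output path is an assumption.
GLOBALS=/var/backups/pgsql/globals.sql
pg_dumpall -g > "$GLOBALS"

# On restore, replay the globals before recreating the databases,
# so that the users the dumps reference already exist:
psql template1 < "$GLOBALS"
```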
I was also thinking about excluding /usr/local/pgsql/data from the disk backup, since the data is
already covered by the files pg_dump generates. The problem is that this directory holds not only the
database data but also some config files, like postgresql.conf.
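If the data directory were excluded, the config files could still be saved with a small script like this (a sketch; the destination path is my assumption, and the file list covers the usual config files kept in the data directory):

```shell
#!/bin/sh
# Sketch: skip the data directory in the disk backup, but still keep
# copies of its config files. Destination path is an assumption.
PGDATA=/usr/local/pgsql/data
CONF_BACKUP=/var/backups/pgsql/conf

mkdir -p "$CONF_BACKUP"
for f in postgresql.conf pg_hba.conf pg_ident.conf; do
    if [ -f "$PGDATA/$f" ]; then
        cp "$PGDATA/$f" "$CONF_BACKUP/"
    fi
done
```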
Opinions are welcome.