Hello!
I have a 24/7 production server under high load.
I need to run VACUUM FULL on several tables frequently to reclaim
disk space / memory usage, and the server must stay online while
the vacuum runs.
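For reference, the direct approach would be something like this
(the database and table names are placeholders for my real ones;
as I understand it, VACUUM FULL takes an exclusive lock on the
table for its whole duration, which is exactly my problem):

    # run periodically against the live database;
    # "production_db" and "my_big_table" are placeholders
    psql production_db -c 'VACUUM FULL VERBOSE my_big_table;'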
The one trick I can see is to vacuum a duplicate of the
production database (or just some of its tables).
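Roughly, I was imagining something at the file level, along these
lines (all paths, OIDs, and names here are illustrative
placeholders, not real values):

    # hypothetical sketch: copy one table's data files into a
    # scratch cluster, VACUUM FULL the copy there, copy files back.
    # 16384 = database OID, 16385 = table relfilenode (placeholders)
    cp $PGDATA/base/16384/16385* /scratch/pgdata/base/16384/
    psql -p 5433 scratch_db -c 'VACUUM FULL my_big_table;'
    cp /scratch/pgdata/base/16384/16385* $PGDATA/base/16384/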
But there are some pitfalls:
http://www.postgresql.org/docs/7.4/interactive/backup-file.html
"If you have dug into the details of the file system layout of
the data you may be tempted to try to back up or restore only
certain individual tables or databases from their respective
files or directories. This will not work because the
information contained in these files contains only half the
truth. The other half is in the commit log files pg_clog/*,
which contain the commit status of all transactions. A table
file is only usable with this information. Of course it is also
impossible to restore only a table and the associated pg_clog
data because that would render all other tables in the database
cluster useless."
Any thoughts?
Thanks in advance,
Aleksey