On 24 Feb 2003 at 15:53, Ericson Smith wrote:
> Currently our database is at about 15gigs. Over the period of a month,
> it grows to about 25gigs (10Gb wasted space). Every month we have a
> cleanup routine which involves copying the most actively updated tables
> to text, and importing the data again. We vacuum every night and analyze
> 4 times per day, but we're loath to do a VACUUM FULL because of the
> table locking issues (locking some of the tables would break the
> operation of some of our 24/7 systems), hence we prefer to stop the db
> about once per month, eat the downtime as scheduled (about 1.5 hours),
> and get back to business for the next 30 days again.
We had a discussion on this a few days back, and the solution might work for
you as well (apart from the suggestions you have already received).
Instead of a vacuum full on the table, back up the table to a dump file, drop
it, and recreate it from the dump. It takes more effort than a simple vacuum
full, but it may run much faster if you have a large amount of space to recover.
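As a rough sketch of the idea (the table and database names, mytable and mydb,
are just placeholders for your own):

    # dump just the one table (schema + data) to a file
    pg_dump -t mytable mydb > mytable.sql

    # drop the bloated table, then reload it from the dump
    psql -c "DROP TABLE mytable" mydb
    psql -f mytable.sql mydb

Note that a single-table dump does not bring along foreign keys in other tables
that reference it, so you may need to recreate those afterwards, and the table
is of course unavailable while the reload runs.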
Try it.
Bye
Shridhar
--
"I'd crawl over an acre of 'Visual This++' and 'Integrated DevelopmentThat' to
get to gcc, Emacs, and gdb. Thank you."(By Vance Petree, Virginia Power)