"Peter Kovacs" <maxottovonstirlitz@gmail.com> writes:
> We have a number of automated performance tests (to test our own code)
> involving PostgreSQL. Test cases are supposed to drop and recreate
> tables each time they run.
> The problem is that some of the tests show a linear performance
> degradation over time. (We have data going back three months.) We
> past.) We have established that some element(s) of our test
> environment must be the culprit for the degradation. As rebooting the
> test machine didn't revert speeds to baselines recorded three months
> ago, we have turned our attention to the database as the only element
> of the environment which is persistent across reboots. Recreating the
> entire PGSQL cluster did cause speeds to revert to baselines.
What it sounds like to me is that you're not vacuuming the system
catalogs, which are getting bloated with dead rows about all those
dropped tables.
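A minimal sketch of how one might confirm and fix this (the catalog and view names are standard PostgreSQL; the particular catalogs listed and the LIMIT are illustrative):

```sql
-- Look for dead-tuple buildup on the system catalogs most affected
-- by repeated CREATE/DROP TABLE.
SELECT relname, n_live_tup, n_dead_tup
FROM pg_stat_sys_tables
WHERE schemaname = 'pg_catalog'
ORDER BY n_dead_tup DESC
LIMIT 10;

-- Plain VACUUM reclaims the dead rows without exclusive locks;
-- ANALYZE refreshes planner statistics at the same time.
VACUUM ANALYZE pg_catalog.pg_class;
VACUUM ANALYZE pg_catalog.pg_attribute;
VACUUM ANALYZE pg_catalog.pg_type;
VACUUM ANALYZE pg_catalog.pg_depend;
```

Running a database-wide VACUUM as a superuser (or relying on autovacuum, where available and enabled) covers the catalogs as well, so per-table commands like the above are only needed for targeted cleanup.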
regards, tom lane