Hi,
I have a system that has a moderate amount of activity on it, nothing
strenuous. The activity is a real mixture of operations: selects,
updates, inserts and deletes. One thing strange about our database is
that we have a lot of stored procedures that use temporary tables. Most
of the time the system runs fine, and a nightly vacuum does the job.

However, occasionally we need to import data, and this involves
inserting several million rows into a table, but this just *cripples*
postgres. After the import has been running for a while, simple selects
take a long time, and strangely, the query planner starts using a
sequential scan on a table with millions of items in it. Surely inserts
shouldn't be doing this?

Should I be running a vacuum during the import? After how many
operations should I be calling vacuum? We tried pg_autovacuum, but it
didn't seem to do a very good job: we would still often find the query
planner choosing bad plans, and a manual vacuum analyze would fix them.
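
For reference, the manual fix when the planner goes wrong is nothing
more than a plain vacuum analyze on the affected table, along these
lines (the table name here is only an example):

    VACUUM ANALYZE big_items;   -- reclaims dead rows and refreshes planner statistics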

Part of the import updates rows in a table to keep track of how far the
import has got. The update just sets an integer column, so does this
create a dead tuple, or will postgres alter the column in place?
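
To make that concrete, the tracking update is of roughly this shape
(the table and column names are made up, but the real statement is just
as simple):

    UPDATE import_progress
       SET rows_imported = 1500000   -- a single integer column, nothing else touched
     WHERE import_id = 1;
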
Thanks for your help
--
David Mitchell
Software Engineer
Telogis