Hi guys,
we have a database containing the following tables:
1. some static lookup tables with about 300 records
2. vbv
3. vbv_denorm
4. summary
here is how it works:
1. one process receives data from our sensor network and inserts it into vbv
2. a trigger on vbv adds each new record, together with some other information, into vbv_denorm --> we use this table for our reporting software
3. a trigger on vbv_denorm aggregates the data and inserts the result into summary (a rough sketch of this trigger chain is below)
4. another process periodically sends the summary records to another company. it sends them one by one: it opens a cursor, sends one record, waits for confirmation, and then sends the next (also sketched below)
this operation runs 24/7.
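In case the exact setup matters, here is a rough sketch of the trigger chain and the sending process. The column names, the sensors lookup table, and the per-hour aggregation are invented for illustration; only the table names vbv, vbv_denorm and summary match our real schema.

-- simplified stand-ins for our real tables
CREATE TABLE sensors (            -- stands in for one of the static lookup tables
    id   integer PRIMARY KEY,
    name text
);

CREATE TABLE vbv (
    sensor_id   integer,
    recorded_at timestamptz,
    value       numeric
);

CREATE TABLE vbv_denorm (
    sensor_id   integer,
    recorded_at timestamptz,
    value       numeric,
    sensor_name text              -- the "other information" used for reporting
);

CREATE TABLE summary (
    sensor_id integer,
    period    timestamptz,
    total     numeric
);

-- step 2: copy each new vbv row, plus lookup data, into vbv_denorm
CREATE FUNCTION vbv_to_denorm() RETURNS trigger AS $$
BEGIN
    INSERT INTO vbv_denorm (sensor_id, recorded_at, value, sensor_name)
    SELECT NEW.sensor_id, NEW.recorded_at, NEW.value, s.name
    FROM sensors s
    WHERE s.id = NEW.sensor_id;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_vbv_denorm
AFTER INSERT ON vbv
FOR EACH ROW EXECUTE PROCEDURE vbv_to_denorm();

-- step 3: maintain one aggregated row per sensor and hour in summary
CREATE FUNCTION denorm_to_summary() RETURNS trigger AS $$
BEGIN
    UPDATE summary
       SET total = total + NEW.value
     WHERE sensor_id = NEW.sensor_id
       AND period = date_trunc('hour', NEW.recorded_at);
    IF NOT FOUND THEN
        INSERT INTO summary (sensor_id, period, total)
        VALUES (NEW.sensor_id, date_trunc('hour', NEW.recorded_at), NEW.value);
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_denorm_summary
AFTER INSERT ON vbv_denorm
FOR EACH ROW EXECUTE PROCEDURE denorm_to_summary();

The sending process does roughly this in its own session (cursor name is made up):

-- step 4: the sender walks summary with a cursor, one row per confirmation
BEGIN;
DECLARE summary_cur CURSOR FOR SELECT * FROM summary ORDER BY period;
FETCH NEXT FROM summary_cur;   -- send this row, wait for the confirmation
FETCH NEXT FROM summary_cur;   -- then fetch and send the next one, and so on
CLOSE summary_cur;
COMMIT;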
because the database is growing very fast (beyond a terabyte in less than 6 months), we need to delete the oldest data. each delete removes about 80 million records from vbv (a trigger deletes the corresponding records from vbv_denorm and summary). after the delete, autovacuum starts on vbv. at the same time, because summary is small (at most 2 million records), I ran VACUUM ANALYZE on it (not VACUUM FULL); the purge step is sketched below.

but this causes the database to block completely: the process inserting into vbv blocks, and the process sending summary records blocks. why is this happening? autovacuum and VACUUM ANALYZE should not lock tables.
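For reference, the purge step looks roughly like this; the timestamp column and the retention window are placeholders, only the table names and the VACUUM ANALYZE command are what we really run:

-- delete the oldest data; a delete trigger on vbv removes the matching
-- rows from vbv_denorm, and a trigger there cleans up summary
DELETE FROM vbv
WHERE recorded_at < now() - interval '6 months';   -- roughly 80 million rows

-- autovacuum then kicks in on vbv; meanwhile, by hand:
VACUUM ANALYZE summary;   -- plain VACUUM ANALYZE, not VACUUM FULL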