Hi Tomas,
Tomas wrote:
We've decided to remove unneeded 'old' data, which means removing about
99.999% of the rows from tables A, C and D (about 2 GB of data). At the
beginning, table B (containing data aggregated from A, C and D) was emptied
(dropped and recreated) and filled with current data. Then, before the
deletion, the data from tables A, C and D were backed up into other tables
(say A_old, C_old, D_old) filled using
.....
1) drop, create and fill table B (aggregated data from A, C, D)
2) copy 'old' data from A, C and D to A_old, C_old and D_old
3) delete old data from A, C, D
4) dump data from A_old, C_old and D_old
5) truncate tables A, C, D
6) vacuum full analyze tables A, C, D, A_old, C_old and D_old
----
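For reference, the quoted procedure might look roughly like this in SQL (the table names and the "ts" cutoff column are assumptions here, since the actual schemas are not shown in the post):

```sql
-- 2) back up the 'old' rows (cutoff date is illustrative)
CREATE TABLE a_old AS
    SELECT * FROM a WHERE ts < '2005-01-01';

-- 3) delete them from the live table
DELETE FROM a WHERE ts < '2005-01-01';

-- 4) dump the backup table from the shell:
--    pg_dump -t a_old mydb > a_old.sql

-- 6) reclaim the dead space and refresh statistics
VACUUM FULL ANALYZE a;
VACUUM FULL ANALYZE a_old;
```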
I think you are doing some difficult database maintenance. Why do that, if
you just want to keep a small piece of the data in your tables? Why don't you
try something like:
1. create table A with no indexes (don't fill this table with data),
2. create a table A_week_year that inherits from table A, with the indexes
you want and a condition describing which rows it should hold (e.g. table A1
is used for one week of data in a given year, and so on),
3. do the same for tables B, C and D,
4. if you have related tables, make them inherit from parent tables too (optional).
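A minimal sketch of that layout, assuming a parent table A with a timestamp column ts (all names and dates here are illustrative):

```sql
-- 1) parent table: no indexes, never filled directly
CREATE TABLE a (
    id  integer,
    ts  timestamp,
    val numeric
);

-- 2) one child table per week, with its own index and a CHECK
--    condition describing which rows belong in it
CREATE TABLE a_w01_2005 (
    CHECK (ts >= '2005-01-03' AND ts < '2005-01-10')
) INHERITS (a);

CREATE INDEX a_w01_2005_ts_idx ON a_w01_2005 (ts);

-- Inserts go into the child; queries against the parent
-- automatically see all children:
INSERT INTO a_w01_2005 VALUES (1, '2005-01-04', 42);
SELECT * FROM a WHERE ts >= '2005-01-03';
```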
I think you should read the PostgreSQL documentation for more information
about table inheritance.
The drawback is that you might end up with many tables. But each table will
only hold a small piece of the data, e.g. just one week. And you won't have
to do difficult database maintenance like you have done. You just need to
create a table for each week of data, run vacuum/analyze, and do regular
backups.
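The weekly routine then reduces to a few statements, e.g. (a sketch only, assuming weekly child tables named like a_wNN_YYYY with a timestamp column ts):

```sql
-- create next week's child table with its own CHECK range and index
CREATE TABLE a_w02_2005 (
    CHECK (ts >= '2005-01-10' AND ts < '2005-01-17')
) INHERITS (a);
CREATE INDEX a_w02_2005_ts_idx ON a_w02_2005 (ts);

-- routine maintenance and per-table backup
VACUUM ANALYZE a_w02_2005;
--    pg_dump -t a_w02_2005 mydb > a_w02_2005.sql

-- retiring a week of old data is then just:
--    DROP TABLE a_w01_2005;
```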
Best regards,
ahmad fajar,