Hi, we have a weird situation here. We have a table of approx. 10k rows
representing accumulated activity by specific customers. As information
is gathered, those customers' rows are updated. The number of rows does not
increase unless we get a new customer, so that is not a factor. The table
is defined as follows:
Table "account_summary_02" Attribute | Type | Modifier
-------------+-------------+----------bill_br_id | bigint | not nullcust_id | varchar(15) | not nullbtn_id
| varchar(15) | not nullln_id | varchar(15) | not nullct_key | float8 | not nullas_quantity | float8
| not nullas_charges | float8 | not nullas_count | float8 | not null
Index: account_summary_02_unq_idx
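For reference, a rough DDL equivalent of that definition (reconstructed from
the listing above; the unique index is assumed to cover the first five columns
in the order shown, as noted below):

    CREATE TABLE account_summary_02 (
        bill_br_id   bigint      NOT NULL,
        cust_id      varchar(15) NOT NULL,
        btn_id       varchar(15) NOT NULL,
        ln_id        varchar(15) NOT NULL,
        ct_key       float8      NOT NULL,
        as_quantity  float8      NOT NULL,
        as_charges   float8      NOT NULL,
        as_count     float8      NOT NULL
    );

    -- unique index on the first five (key) columns
    CREATE UNIQUE INDEX account_summary_02_unq_idx
        ON account_summary_02 (bill_br_id, cust_id, btn_id, ln_id, ct_key);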
The index is on the first 5 columns. Here's the situation: after about 50,000
updates, which fly right along, the process begins to really bog down. We
perform a VACUUM ANALYZE and it speeds right up again. My question is: is
there a way to perform these updates, potentially 500k to 1 million in a day,
without having to vacuum so frequently? Maybe some setting or parameter to be
changed? The update query is doing an index scan.
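For illustration, the kind of per-customer update described above might look
something like this (a simplified sketch keyed on the unique index; the actual
query and values aren't shown in this post), along with a vacuum targeted at
just this table, which is cheaper than a database-wide one:

    -- sketch of one accumulation update against the unique key
    -- (key and increment values are placeholders)
    UPDATE account_summary_02
       SET as_quantity = as_quantity + 1.0,
           as_charges  = as_charges  + 12.34,
           as_count    = as_count    + 1
     WHERE bill_br_id = 1001
       AND cust_id    = 'CUST001'
       AND btn_id     = 'BTN001'
       AND ln_id      = 'LN001'
       AND ct_key     = 1.0;

    -- vacuum/analyze only this table between batches of updates
    VACUUM ANALYZE account_summary_02;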
mikeo