2013/7/14 Haiming Zhang <Haiming.Zhang@redflex.com.au>:
> Hi All,
>
>
>
> I am using postgres 9.1 and I have a question about updating a big table. Here
> is the basic information of this table.
>
> 1. This table has one primary key and 11 other columns.
>
> 2. It also has a trigger that, before records are updated,
> updates another table first.
>
> 3. The table has millions of records now.
> 4. I am not allowed to delete records from this table when
> updating.
>
> The problem is that when I run an UPDATE query it takes a long time to execute.
> E.g. when I run a query like " update TABLE set column1 = true where
> EVENT_ID in (select EVENT_ID from TABLE2 );", it took hours to update the
> whole table. In order to optimize the update speed, I tried the following
> strategies:
>
> 1. Create indexes on the primary key, on column1, and on the
> combination of the primary key and column1.
>
> 2. Alter the table to FILLFACTOR = 70, vacuum it, and then reindex.
>
> 3. Drop the trigger before the update.
>
> Then I use "EXPLAIN" to estimate query plan, all of the above strategies do
> not improve the UPDATE speed dramatically.
>
>
>
> Please comment on my three strategies (e.g., do I index too many columns in
> 1?) and please advise me on how to improve the update speed. Any advice is
> welcome. I appreciate all your help.
>
>
>
> Thanks,
>
>
>
> Regards,
>
> Haiming
AFAIK the IN() predicate is the cause; rewriting the UPDATE as a JOIN would
solve your speed problem.
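
For example (a rough, untested sketch that reuses the placeholder names TABLE,
TABLE2, EVENT_ID and column1 from your message), the statement can be rewritten
with UPDATE ... FROM so that TABLE2 is joined to the target table directly
instead of going through the IN() subquery:

    -- TABLE and TABLE2 stand for your real table names, as in the original query
    UPDATE TABLE t
       SET column1 = true
      FROM TABLE2 t2
     WHERE t.EVENT_ID = t2.EVENT_ID;

Compare the EXPLAIN output of this form against your original IN() version to
see whether the planner chooses a better plan.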