Re: Very long deletion time on a 200 GB database - Mailing list pgsql-performance

From Reuven M. Lerner
Subject Re: Very long deletion time on a 200 GB database
Date
Msg-id 4F465A7A.4000905@lerner.co.il
In response to Re: Very long deletion time on a 200 GB database  (Andrew Dunstan <andrew@dunslane.net>)
Responses Re: Very long deletion time on a 200 GB database  ("ktm@rice.edu" <ktm@rice.edu>)
Re: Very long deletion time on a 200 GB database  (Greg Spiegelberg <gspiegelberg@gmail.com>)
List pgsql-performance
Hi, everyone.  Thanks for all of the help and suggestions so far; I'll
try to respond to some of them soon.  Andrew wrote:

>> How about:
>>
>> DELETE FROM B
>> WHERE r_id IN (SELECT DISTINCT R.id
>>                FROM R
>>                WHERE R.end_date < (NOW() - (interval '1 day' * 30)))
>>
>> ?
>>
>
> Or possibly without the DISTINCT. But I agree that the original query
> shouldn't have B in the subquery - that alone could well make it crawl.

I put B in the subquery so as to reduce the number of rows that would be
returned, but maybe that was indeed backfiring on me.  Now that I think
about it, B is a huge table, and R is a less-huge one, so including B in
the subselect was probably a mistake.
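
For what it's worth, here's the same delete written as a join with DELETE ...
USING, which skips the IN subquery entirely.  This is just a sketch using the
table and column names already in this thread (B.r_id, R.id, R.end_date); I
haven't benchmarked it on this data:

DELETE FROM B
 USING R
 WHERE B.r_id = R.id
   AND R.end_date < NOW() - interval '30 days';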

>
> What is the distribution of end_dates? It might be worth running this in
> several steps, deleting records older than, say, 90 days, 60 days, 30 days.

I've suggested something similar, but was told that we have limited time
to execute the DELETE, and that doing it in stages might not be possible.
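
If we do get a window where staged deletes are allowed, here's a sketch of
what that might look like, with purely illustrative cutoffs and the same
assumed column names as above:

DELETE FROM B USING R
 WHERE B.r_id = R.id AND R.end_date < NOW() - interval '90 days';

DELETE FROM B USING R
 WHERE B.r_id = R.id AND R.end_date < NOW() - interval '60 days';

DELETE FROM B USING R
 WHERE B.r_id = R.id AND R.end_date < NOW() - interval '30 days';

Run separately, each statement is its own transaction, so each pass removes a
smaller slice of B and holds its locks for less time.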

Reuven


--
Reuven M. Lerner -- Web development, consulting, and training
Mobile: +972-54-496-8405 * US phone: 847-230-9795
Skype/AIM: reuvenlerner
