Re: time taking deletion on large tables - Mailing list pgsql-admin

From Rui DeSousa
Subject Re: time taking deletion on large tables
Msg-id 35E33498-8CD9-4D81-A641-42343342D337@crazybean.net
In response to time taking deletion on large tables  (Atul Kumar <akumar14871@gmail.com>)
List pgsql-admin


On Dec 3, 2020, at 9:45 AM, Atul Kumar <akumar14871@gmail.com> wrote:

The feed_posts table has over 50 million rows.

I'm deleting all rows of a certain type that are over 60 days old.

When I try to do a delete like this, it hangs for an entire day, so I
need to kill it with pg_terminate_backend(pid).

Delete the records in batches. I have used this approach successfully many times on large, highly active tables in live production systems.

You’ll have to find the right batch size for your dataset while keeping each run short, e.g. 30 seconds. Then call the function repeatedly from a script; I’ve used a Perl script with the DBI module for this. (A SQL-only driver is sketched after the function below.)

For example:

create or replace function purge_feed_post (_purge_date date, _limit int default 5000)
  returns int 
as 
$$
  declare 
    _rowcnt int;
  begin
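    /* Session-scoped scratch table for the batch of ids;
       "if not exists" allows repeated calls in the same session */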
    create temp table if not exists purge_feed_post_set (
      feed_post_id int
      )
    ;

    /* Identify records to be purged */
    insert into purge_feed_post_set (
      feed_post_id
      )
      select feed_post_id
      from feed_posts 
      where created_at < _purge_date
      order by created_at
      limit _limit
    ;
    
    /* Remove old records */
    delete from feed_posts using purge_feed_post_set 
      where feed_posts.feed_post_id = purge_feed_post_set.feed_post_id
    ;

    get diagnostics _rowcnt = ROW_COUNT;

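    /* Clear the scratch table so the next call starts empty */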
    delete from purge_feed_post_set;

    return _rowcnt;
  end;
$$ language plpgsql
  set search_path = public
;
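
If you’d rather not use an external script, a procedure (PostgreSQL 11 or later) can drive the loop and commit between batches, much as the script would. This is just a minimal sketch; the procedure name and the pg_sleep pacing are illustrative, not from the original post.

create or replace procedure purge_feed_post_driver (_purge_date date, _limit int default 5000)
as
$$
  declare
    _rowcnt int;
  begin
    loop
      /* Purge one batch using the function defined above */
      _rowcnt := purge_feed_post(_purge_date, _limit);

      /* Commit each batch so locks and dead tuples are released incrementally */
      commit;

      exit when _rowcnt = 0;

      /* Optional pause to limit the load on a busy system; tune as needed */
      perform pg_sleep(1);
    end loop;
  end;
$$ language plpgsql
;

Invoke it outside an explicit transaction block (e.g. with psql autocommit), otherwise the commit inside the procedure will fail:

call purge_feed_post_driver (current_date - 60);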
