Re: Attempting to delete excess rows from table with BATCH DELETE - Mailing list pgsql-general

From David G. Johnston
Subject Re: Attempting to delete excess rows from table with BATCH DELETE
Date
Msg-id CAKFQuwa96OCxiW5yLtnUVvN1U4208o9c3NvLxnL1MkojLhQcVg@mail.gmail.com
In response to Attempting to delete excess rows from table with BATCH DELETE  (Gus Spier <gus.spier@gmail.com>)
Responses Re: Attempting to delete excess rows from table with BATCH DELETE
List pgsql-general
On Tuesday, January 27, 2026, Gus Spier <gus.spier@gmail.com> wrote:
Environment: AWS RDS Aurora for PostgreSQL 15 hosting tables that
support scientific research. The development environment predominantly
uses JPA with Hibernate.

Years of neglect have allowed mission tables to accumulate hundreds of
millions of rows of excess data. The developers and the customer
decided we must delete all rows older than 75 days. Table partitioning
was briefly considered but discarded because of the effort needed to
refactor the codebase.

I proposed the straight-forward course of action: delete by batches
from the victim tables.
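The batch-delete approach could be sketched roughly as below. This is a hypothetical illustration only: the table name (`measurements`), column names (`id`, `recorded_at`), and batch size are assumptions, not details from the thread.

```sql
-- Delete in bounded batches so each transaction stays short and
-- lock contention and WAL bursts are limited.
DELETE FROM measurements
WHERE id IN (
    SELECT id
    FROM measurements
    WHERE recorded_at < now() - interval '75 days'
    LIMIT 10000
);
-- Repeat from a driver script until 0 rows are affected, pausing
-- between batches so autovacuum and replication can keep up.
```

At hundreds of millions of rows, every deleted row still becomes a dead tuple that vacuum must later reclaim, which is the cost David's reply pushes back on.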


Strongly encourage you to try to accomplish your goal without delete commands at that scale, which cause vacuuming.  Can you just create an empty copy, load the data you want to keep into it, and then point at the newly filled table?  Truncate is OK.
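The copy-and-swap idea might look roughly like this sketch. All names are hypothetical, and it assumes writers can be quiesced (or the table locked) for the duration of the copy:

```sql
BEGIN;
-- Build an empty structural copy, then load only the rows to keep.
CREATE TABLE measurements_new (LIKE measurements INCLUDING ALL);
INSERT INTO measurements_new
SELECT * FROM measurements
WHERE recorded_at >= now() - interval '75 days';
-- Swap the tables under the original name.
ALTER TABLE measurements RENAME TO measurements_old;
ALTER TABLE measurements_new RENAME TO measurements;
COMMIT;
-- TRUNCATE (or DROP) reclaims the space immediately, leaving no
-- dead tuples for vacuum to process.
TRUNCATE measurements_old;
```

The advantage over batch deletes is that the old storage is dropped wholesale rather than marked dead row by row.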

Daily trimming going forward would be less problematic at least.
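Going forward, a small daily trim keeps each run cheap. One possible sketch uses pg_cron, which RDS/Aurora for PostgreSQL supports when the extension is enabled; the job name, schedule, and table/column names here are assumptions:

```sql
CREATE EXTENSION IF NOT EXISTS pg_cron;
-- Run a small retention delete every day at 03:00.
SELECT cron.schedule(
    'trim-measurements',
    '0 3 * * *',
    $$DELETE FROM measurements
      WHERE recorded_at < now() - interval '75 days'$$
);
```

Deleting one day's worth of expired rows at a time produces a bounded amount of dead tuples that routine autovacuum can absorb.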

David J.
