Thread: "stack depth limit exceeded" executing DELETE FROM
The following fails with a "stack depth limit exceeded":
DELETE FROM some-table WHERE pk IN (pk1, pk2, ..., pkN)
where pk = primary key and N is large, may be > 50000.
Is there some limit as to the number of rows that can
be deleted in a single DELETE?
Thanks,
Brian
"Cox, Brian" <Brian.Cox@ca.com> writes:
> The following fails with a "stack depth limit exceeded":
> DELETE FROM some-table WHERE pk IN (pk1, pk2, ..., pkN)
> where pk = primary key and N is large, may be > 50000.
> Is there some limit as to the number of rows that can
> be deleted in a single DELETE?

Not as such, but there's a limit on the complexity of WHERE clauses.
Raising max_stack_depth might help you, but you'll run out of RAM
eventually. (FWIW, 8.2 will cope a lot better with this particular
construct.)

			regards, tom lane
Not a limit to a single DELETE per se (a single delete could remove
millions of rows), but you can't specify that many individual entries
in an IN (...) section. This isn't a PG-specific issue, by the way:
Oracle limits you to about 1000 entries IIRC, and SQL Server limits it
to 512 or thereabouts. It's just a bad approach.

You want to refactor this to use a transaction and JDBC update batches,
to gain the benefits of batching without the stack overrun problems. If
you're concerned about speed you could try benchmarking to find the
optimum number of items in the "IN (...)" clause, but in our testing the
optimum number for most databases was 1. For PG I think it was 2. Moral
of the story: it's hard to beat individual batched deletes.

An alternative approach would be to add these doomed PKs to a temporary
table and then do:

DELETE FROM myTable WHERE pk IN (SELECT victimId FROM temp_doomed_table)

-- Mark

On Tue, 2006-09-05 at 14:18 -0400, Cox, Brian wrote:
> The following fails with a "stack depth limit exceeded":
>
> DELETE FROM some-table WHERE pk IN (pk1, pk2, ..., pkN)
>
> where pk = primary key and N is large, may be > 50000.
>
> Is there some limit as to the number of rows that can
> be deleted in a single DELETE?
>
> Thanks,
> Brian
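[Editor's note: the chunked-IN variant Mark describes can be sketched in JDBC. This is a minimal illustration, not code from the thread; the table name some_table, the chunk size, and all connection handling are assumptions. The SQL-building and chunking helpers are pure so they can be exercised without a database; deleteInChunks shows how they would be wired to a Connection in one transaction.]

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class ChunkedDelete {
    // Build "DELETE FROM some_table WHERE pk IN (?,?,...,?)" with n placeholders.
    static String deleteSql(int n) {
        StringBuilder sb = new StringBuilder("DELETE FROM some_table WHERE pk IN (");
        for (int i = 0; i < n; i++) sb.append(i == 0 ? "?" : ",?");
        return sb.append(")").toString();
    }

    // Split the full pk list into sublists of at most chunkSize entries,
    // so each statement stays far below the parser's stack limits.
    static List<List<Long>> chunks(List<Long> ids, int chunkSize) {
        List<List<Long>> out = new ArrayList<>();
        for (int i = 0; i < ids.size(); i += chunkSize)
            out.add(ids.subList(i, Math.min(i + chunkSize, ids.size())));
        return out;
    }

    // Delete all ids inside a single transaction, one statement per chunk.
    static void deleteInChunks(Connection con, List<Long> ids, int chunkSize)
            throws SQLException {
        con.setAutoCommit(false);
        try {
            for (List<Long> chunk : chunks(ids, chunkSize)) {
                try (PreparedStatement ps = con.prepareStatement(deleteSql(chunk.size()))) {
                    for (int i = 0; i < chunk.size(); i++)
                        ps.setLong(i + 1, chunk.get(i));
                    ps.executeUpdate();
                }
            }
            con.commit();
        } catch (SQLException e) {
            con.rollback();
            throw e;
        }
    }
}
```

Per Mark's benchmarking note, the sweet spot for chunkSize may be as low as 1 or 2; the structure above lets you tune it without changing the calling code.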
On Tue, Sep 05, 2006 at 02:18:05PM -0400, Cox, Brian wrote:
> The following fails with a "stack depth limit exceeded":
>
> DELETE FROM some-table WHERE pk IN (pk1, pk2, ..., pkN)
>
> where pk = primary key and N is large, may be > 50000.
>
> Is there some limit as to the number of rows that can
> be deleted in a single DELETE?

The error isn't due to the number of rows being deleted but rather
from parsing the large number of items in the IN list. See the
following thread:

http://archives.postgresql.org/pgsql-general/2006-01/msg00959.php

How are you obtaining the N values? A subquery that returns the
equivalent list should work. You could also try increasing
max_stack_depth (or max_expr_depth in 7.4 and earlier), but be sure
to stay within your system's stack depth resource limit.

--
Michael Fuhr
On Sep 5, 2006, at 14:44, Michael Fuhr wrote:
> On Tue, Sep 05, 2006 at 02:18:05PM -0400, Cox, Brian wrote:
>> The following fails with a "stack depth limit exceeded":
>>
>> DELETE FROM some-table WHERE pk IN (pk1, pk2, ..., pkN)
>>
>> where pk = primary key and N is large, may be > 50000.
>>
>> Is there some limit as to the number of rows that can
>> be deleted in a single DELETE?
>
> The error isn't due to the number of rows being deleted but rather
> from parsing the large number of items in the IN list. See the
> following thread:
>
> http://archives.postgresql.org/pgsql-general/2006-01/msg00959.php

You have been receiving great advice from the other guys, but another
thing to try is to use some middleware to determine which contiguous
ranges your delete set covers and use WHERE pk BETWEEN x AND y,
potentially improving index usage performance.

-M
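[Editor's note: the "middleware" range detection -M suggests can be sketched as below. This is an illustration, not code from the thread; it assumes integer primary keys, sorted and de-duplicated in memory. Because each range is built only from consecutive integers, a DELETE ... WHERE pk BETWEEN lo AND hi over that range removes exactly the doomed keys and nothing else.]

```java
import java.util.ArrayList;
import java.util.List;

public class PkRanges {
    // Collapse a sorted, de-duplicated pk array into contiguous [lo, hi]
    // pairs, each usable as "DELETE FROM some_table WHERE pk BETWEEN lo AND hi".
    static List<long[]> ranges(long[] sortedPks) {
        List<long[]> out = new ArrayList<>();
        if (sortedPks.length == 0) return out;
        long lo = sortedPks[0], hi = sortedPks[0];
        for (int i = 1; i < sortedPks.length; i++) {
            if (sortedPks[i] == hi + 1) {
                hi = sortedPks[i];           // extends the current run
            } else {
                out.add(new long[]{lo, hi}); // close the run, start a new one
                lo = hi = sortedPks[i];
            }
        }
        out.add(new long[]{lo, hi});         // close the final run
        return out;
    }
}
```

For the pks {1, 2, 3, 7, 8, 10}, this yields three ranges, (1, 3), (7, 8), and (10, 10), so three BETWEEN deletes replace a six-item IN list. The win depends on how contiguous the doomed set is; a sparse set degenerates to one range per key.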