process hang during drop schema - Mailing list pgsql-general
From:    David Parker
Subject: process hang during drop schema
Msg-id:  07FDEE0ED7455A48AC42AC2070EDFF7C14912B@corpsrv2.tazznetworks.com
List:    pgsql-general
I have a process that hangs doing a "drop schema ... cascade". This happens in a slonik command, which runs fine if I run it directly from the command line, but hangs if run from inside my process. I'm pretty sure I'm doing something silly and it's not a slony-specific thing, and I'm hoping that somebody can point me to a server debugging technique I can use to track this down. I'm using 7.4.5.

My server, run with "-d 5", stops logging at this point:

[...]
NOTICE:  drop cascades to cast from _tzreplic.xxid to xid
NOTICE:  drop cascades to cast from xid to _tzreplic.xxid
NOTICE:  drop cascades to function _tzreplic.xxidout(_tzreplic.xxid)
NOTICE:  drop cascades to function _tzreplic.xxidin(cstring)
DEBUG:  CommitTransactionCommand

There's nothing after that (until I log in with psql or something).

This is what things look like in pg_stat_activity and pg_locks (pid 19472 is the current psql session, and 19467 is the hung process):

policy=# select * from pg_stat_activity;
 datid | datname | procpid | usesysid |   usename   | current_query | query_start
-------+---------+---------+----------+-------------+---------------+-------------
 17142 | policy  |   19143 |      101 | tazz:pool   |               |
 17142 | policy  |   19467 |      102 | tazz:dbmgr  |               |
 17142 | policy  |   19472 |      103 | tazz:engmon |               |
(3 rows)

policy=# select * from pg_locks order by pid,relation;
 relation | database | transaction |  pid  |        mode         | granted
----------+----------+-------------+-------+---------------------+---------
    20367 |    17142 |             | 19467 | AccessExclusiveLock | t
    20369 |    17142 |             | 19467 | AccessExclusiveLock | t
    20371 |    17142 |             | 19467 | AccessExclusiveLock | t
    20372 |    17142 |             | 19467 | AccessExclusiveLock | t
    20374 |    17142 |             | 19467 | AccessExclusiveLock | t
    20376 |    17142 |             | 19467 | AccessExclusiveLock | t
    20378 |    17142 |             | 19467 | AccessExclusiveLock | t
    20379 |    17142 |             | 19467 | AccessExclusiveLock | t
    20385 |    17142 |             | 19467 | AccessExclusiveLock | t
    20387 |    17142 |             | 19467 | AccessExclusiveLock | t
    20389 |    17142 |             | 19467 | AccessExclusiveLock | t
    20390 |    17142 |             | 19467 | AccessExclusiveLock | t
    20400 |    17142 |             | 19467 | AccessExclusiveLock | t
    20402 |    17142 |             | 19467 | AccessExclusiveLock | t
    20404 |    17142 |             | 19467 | AccessExclusiveLock | t
    20405 |    17142 |             | 19467 | AccessExclusiveLock | t
    20407 |    17142 |             | 19467 | AccessExclusiveLock | t
    20413 |    17142 |             | 19467 | AccessExclusiveLock | t
    20415 |    17142 |             | 19467 | AccessExclusiveLock | t
    20421 |    17142 |             | 19467 | AccessExclusiveLock | t
    20423 |    17142 |             | 19467 | AccessExclusiveLock | t
    20425 |    17142 |             | 19467 | AccessExclusiveLock | t
    20426 |    17142 |             | 19467 | AccessExclusiveLock | t
    20428 |    17142 |             | 19467 | AccessExclusiveLock | t
    20434 |    17142 |             | 19467 | AccessExclusiveLock | t
    20436 |    17142 |             | 19467 | AccessExclusiveLock | t
    20438 |    17142 |             | 19467 | AccessExclusiveLock | t
    20439 |    17142 |             | 19467 | AccessExclusiveLock | t
    20449 |    17142 |             | 19467 | AccessExclusiveLock | t
    20451 |    17142 |             | 19467 | AccessExclusiveLock | t
    20461 |    17142 |             | 19467 | AccessExclusiveLock | t
    20463 |    17142 |             | 19467 | AccessExclusiveLock | t
    20473 |    17142 |             | 19467 | AccessExclusiveLock | t
    20475 |    17142 |             | 19467 | AccessExclusiveLock | t
    20477 |    17142 |             | 19467 | AccessExclusiveLock | t
    20478 |    17142 |             | 19467 | AccessExclusiveLock | t
    20480 |    17142 |             | 19467 | AccessExclusiveLock | t
    20483 |    17142 |             | 19467 | AccessExclusiveLock | t
    20484 |    17142 |             | 19467 | AccessExclusiveLock | t
    20485 |    17142 |             | 19467 | AccessExclusiveLock | t
    20487 |    17142 |             | 19467 | AccessExclusiveLock | t
    20489 |    17142 |             | 19467 | AccessExclusiveLock | t
    20492 |    17142 |             | 19467 | AccessExclusiveLock | t
    20494 |    17142 |             | 19467 | AccessExclusiveLock | t
    20496 |    17142 |             | 19467 | AccessExclusiveLock | t
    20497 |    17142 |             | 19467 | AccessExclusiveLock | t
    20498 |    17142 |             | 19467 | AccessExclusiveLock | t
    20500 |    17142 |             | 19467 | AccessExclusiveLock | t
    20502 |    17142 |             | 19467 | AccessExclusiveLock | t
    20503 |    17142 |             | 19467 | AccessExclusiveLock | t
    20504 |    17142 |             | 19467 | AccessExclusiveLock | t
    20506 |    17142 |             | 19467 | AccessExclusiveLock | t
    20508 |    17142 |             | 19467 | AccessExclusiveLock | t
    20510 |    17142 |             | 19467 | AccessExclusiveLock | t
    20512 |    17142 |             | 19467 | AccessExclusiveLock | t
    20514 |    17142 |             | 19467 | AccessExclusiveLock | t
          |          |        1301 | 19467 | ExclusiveLock       | t
    16759 |    17142 |             | 19472 | AccessShareLock     | t
          |          |        1304 | 19472 | ExclusiveLock       | t
(59 rows)

To my untrained eye, it doesn't look as though there is any lock contention here, but I haven't dealt with Postgres locking before, so.... Is there another table I should be looking at, or another debug switch I should be setting?

TIA for any suggestions.

- DAP
----------------------------------------------------------------------------------
David Parker
Tazz Networks
(401) 709-5130
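P.S. If it would help narrow things down, this is the kind of query I was planning to run next, to show only locks that have NOT been granted along with the session holding/waiting on them. It's just a sketch, and I'm assuming the 7.4-era column names (procpid and current_query in pg_stat_activity):

-- list ungranted locks together with the waiting session
SELECT l.pid,
       a.usename,
       l.relation,
       l.mode,
       a.current_query
FROM pg_locks l
JOIN pg_stat_activity a ON a.procpid = l.pid
WHERE NOT l.granted;

If that comes back empty while 19467 is hung, I suppose that would confirm it isn't waiting on a lock at all.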