unable to avoid a deadlock at the end of a long transaction - Mailing list pgsql-general

From Ivan Sergio Borgonovo
Subject unable to avoid a deadlock at the end of a long transaction
Date
Msg-id 20100507103019.1fa2c329@dawn.webthatworks.it
Responses Re: unable to avoid a deadlock at the end of a long transaction  (Tom Lane <tgl@sss.pgh.pa.us>)
List pgsql-general
I've been having this:

psql:include/custom/import_update.custom.sql:63: ERROR:  deadlock detected
DETAIL:  Process 13349 waits for AccessExclusiveLock on relation 250510 of database 248569; blocked by process 14153.
         Process 14153 waits for ShareLock on transaction 59160779; blocked by process 13349.
CONTEXT:  SQL statement "drop trigger if exists FT1IDX_catalog_brands_update_trigger on catalog_brands"
          PL/pgSQL function "ft1idx_trigger_drop" line 2 at SQL statement
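The cycle above is the trigger drop asking for an AccessExclusiveLock on a table that another session already holds row locks on, while that session in turn waits on this transaction. A common way to break such a cycle is to take the strongest lock the transaction will eventually need up front, so the acquisition order is deterministic. A minimal sketch (table and trigger names taken from the error above; where the drop happens inside the import is an assumption):

```sql
begin;
-- Take the strongest lock first, before any row-level work, so no other
-- session can slip in between the bulk updates and the trigger drop.
lock table catalog_brands in access exclusive mode;

-- ... bulk updates/inserts on catalog_brands and related tables ...

drop trigger if exists FT1IDX_catalog_brands_update_trigger
  on catalog_brands;
commit;
```

The cost is that readers of catalog_brands block for the whole transaction, so this trades concurrency for a deterministic lock order.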

I reshuffled the update process and started to have other lock
problems.
The only process that I'm aware of that should be *writing* to the
tables involved is the update process; I'd expect other processes to
be reading, but only the update should be writing.

The lock problem happens near the end of the overall update process,
which is one big transaction. Reshuffling the steps a bit doesn't
make it go away; it just deadlocks on other statements, but still at
the *end* of the process, after a bunch of updates and inserts have
been made on a bunch of other tables. The largest of these is an
update of roughly 85,834 rows on a table containing 1M rows.

The only thing that looks constant is: I get a deadlock at the end of
a long process, on a random statement.

Where should I start in order to solve this?

DETAIL:  Process 3662 waits for ShareLock on transaction 59301028; blocked by process 4303.
         Process 4303 waits for ShareLock on transaction 59299342; blocked by process 3662.
CONTEXT:  SQL statement "update catalog_items set Authors= $1  where ItemID= $2 "
          PL/pgSQL function "updateauthorsall" line 19 at SQL statement
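A ShareLock wait on another *transaction* (rather than on a relation) is the classic signature of two sessions updating the same rows in opposite orders. The usual remedy is to make every writer touch rows in the same deterministic order, for example by locking them up front sorted by primary key. A sketch, assuming ItemID is the primary key of catalog_items and that the other writer can be changed to do the same:

```sql
begin;
-- Lock every row this transaction will touch, in one deterministic
-- (ItemID) order, before doing any updates. A concurrent writer doing
-- the same simply queues behind us instead of forming a lock cycle.
select ItemID
  from catalog_items
  where ItemID in (select ItemID from catalog_itemauthor)
  order by ItemID
  for update;

-- ... the per-row updates, in any order, now cannot deadlock
--     against another writer that takes the same locks first ...
commit;
```

This only helps if *all* writers to catalog_items follow the same convention; a single writer that updates in a different order can still form a cycle.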

create or replace function UpdateAuthorsAll()
  returns void
  as
  $$
  declare
    _row record;
    _ItemID bigint;
    _Authors varchar(1024);
    _AuthorsOLD varchar(1024);
    _Name varchar(50);
  begin
    _Authors := '';
    _ItemID := null;
    for _row in select a.Name, ia.ItemID from catalog_itemauthor ia
      join catalog_author a on a.AuthorID = ia.AuthorID
      order by ia.ItemID
      loop
      -- On a change of ItemID, flush the accumulated author list
      -- for the previous item.
      if(_row.ItemID <> _ItemID) then
        if(length(_Authors) > 2) then
          _Authors := substring(_Authors from 3); -- strip leading ', '
          select into _AuthorsOLD Authors from catalog_items
            where ItemID = _ItemID;
          if(coalesce(_Authors, '') <> coalesce(_AuthorsOLD, '')) then
            update catalog_items set Authors = _Authors
              where ItemID = _ItemID;
          end if;
        end if;
        _Authors := '';
      end if;
      _ItemID := _row.ItemID;
      _Name := trim(E' \t' from _row.Name);
      if(length(_Name) > 0) then
        _Authors := _Authors || ', ' || _Name;
      end if;
    end loop;
    -- Flush the last item, which the change-detection above never
    -- reaches.
    if(_ItemID is not null and length(_Authors) > 2) then
      _Authors := substring(_Authors from 3);
      select into _AuthorsOLD Authors from catalog_items
        where ItemID = _ItemID;
      if(coalesce(_Authors, '') <> coalesce(_AuthorsOLD, '')) then
        update catalog_items set Authors = _Authors
          where ItemID = _ItemID;
      end if;
    end if;
  return;
  end;
$$ language plpgsql volatile;
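Since the updates happen one row at a time inside a loop, the window in which another writer can interleave with them is long. A set-based equivalent touches catalog_items in a single statement, which both shortens that window and lets the planner lock rows in one pass. A sketch, assuming PostgreSQL 9.0+ (for string_agg) and alphabetical author order within each item, which the loop above leaves unspecified:

```sql
-- Set-based equivalent of UpdateAuthorsAll(): build each item's author
-- list with string_agg, then update only the rows whose list changed.
update catalog_items ci
set Authors = agg.Authors
from (
  select ia.ItemID,
         string_agg(trim(E' \t' from a.Name), ', '
                    order by a.Name) as Authors
  from catalog_itemauthor ia
  join catalog_author a on a.AuthorID = ia.AuthorID
  where length(trim(E' \t' from a.Name)) > 0
  group by ia.ItemID
) agg
where ci.ItemID = agg.ItemID
  and coalesce(ci.Authors, '') <> coalesce(agg.Authors, '');
```

The `coalesce`-guarded comparison mirrors the function's "only update when changed" check, which keeps the number of row locks (and the deadlock surface) as small as possible.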

--
Ivan Sergio Borgonovo
http://www.webthatworks.it

