Re: Re: Connection hanging on INSERT apparently due to large batch size and 4 CPU cores - Mailing list pgsql-jdbc

From Kris Jurka
Subject Re: Re: Connection hanging on INSERT apparently due to large batch size and 4 CPU cores
Date
Msg-id Pine.BSO.4.64.0810291326250.14951@leary.csoft.net
In response to Re: Connection hanging on INSERT apparently due to large batch size and 4 CPU cores  (John <jgassner@gmail.com>)
List pgsql-jdbc

On Sat, 25 Oct 2008, John wrote:

> Could it be related to the INSERT being done on a table that has 330
> columns?
>

Yes, that's it (or at least part of it).  In the simple case of using the
same, known parameter types for each batch entry, the data returned from
the server has no dependence on the parameter count.  The issue arises when
binding a parameter with an unspecified type.  In the attached
SimpleDeadLockDemo, you can see the driver issuing a repeated
Describe(statement=S_1), each of which results in a ParameterDescription
message whose size depends on the number of parameters.  With a large
parameter count, the backend will fill its network buffer with this data
while the driver is still busy writing batch entries; since neither side
is reading, both writes block and we get the deadlock you're seeing.
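To see why 330 columns matters, here is a back-of-the-envelope sketch of the
ParameterDescription message size per the v3 frontend/backend protocol (one
type byte, an Int32 length, an Int16 parameter count, then an Int32 type OID
per parameter).  The 64KB buffer figure is only an illustrative assumption
for a typical socket buffer, not a measured value:

```java
public class ParamDescSize {
    // Size in bytes of one ParameterDescription ('t') message:
    // 1 type byte + Int32 length + Int16 count + one Int32 OID per parameter.
    static int parameterDescriptionSize(int paramCount) {
        return 1 + 4 + 2 + 4 * paramCount;
    }

    public static void main(String[] args) {
        int perMessage = parameterDescriptionSize(330);
        System.out.println("per Describe: " + perMessage + " bytes");
        // With one Describe per batch entry, how many entries until an
        // assumed 64KB send buffer on the backend side is full?
        int buffer = 64 * 1024;
        int entries = (buffer + perMessage - 1) / perMessage;
        System.out.println("entries to fill 64KB: " + entries);
    }
}
```

So with 330 parameters each response is 1327 bytes, and on the order of
fifty batch entries is already enough to fill a 64KB buffer if nothing is
draining it.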

In this case we've supplied an explicit parameter type at parse time, so
describing the statement isn't going to tell us anything new.  Even if it
were going to provide us information, we only need to issue the Describe
once, not once per batch entry.  I'm not sure how complicated a solution
might be, but at least I understand what's going on now.
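Just to make the describe-once idea concrete, here is a toy model (not the
actual driver code, and the class and method names are invented for
illustration) of how many Describe messages each strategy would put on the
wire for a batch:

```java
// Toy model of Describe traffic for one batched statement.
public class DescribeModel {
    private final boolean typesKnown;   // parameter types supplied at Parse time?
    private boolean described = false;  // has a Describe already been issued?
    private int describesSent = 0;

    DescribeModel(boolean typesKnown) { this.typesKnown = typesKnown; }

    // Current behavior seen in the log: a Describe for every batch entry
    // whenever the parameter types are unspecified.
    void addBatchNaive() {
        if (!typesKnown) describesSent++;
    }

    // Proposed behavior: one Describe answers the question for all entries.
    void addBatchDescribeOnce() {
        if (!typesKnown && !described) {
            described = true;
            describesSent++;
        }
    }

    int describesSent() { return describesSent; }
}
```

With unspecified types, the naive strategy sends one Describe per entry
(and so one large ParameterDescription back per entry), while the
describe-once strategy caps the response traffic at a single message
regardless of batch size.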

Also attached (BatchDeadLock) is a test case that reliably locks up in the
way you've described.

Kris Jurka

Attachment
