Re: Understanding Transactions - Mailing list pgsql-general

From Stephan Szabo
Subject Re: Understanding Transactions
Date
Msg-id 20011212102303.G90917-100000@megazone23.bigpanda.com
Whole thread Raw
In response to Understanding Transactions  (Joe Koenig <joe@jwebmedia.com>)
List pgsql-general
On Wed, 12 Dec 2001, Joe Koenig wrote:

> I've been reading through the archive and I see that when doing a large
> amount of inserts it is much faster to wrap a bunch of them in a
> transaction. But here's my question. Say I need to do about 100,000
> inserts and using COPY isn't an option. Is postgres going to do the
> inserts faster in groups of 1,000 or 5,000? I know that letting each
> insert be in its own transaction creates a lot of overhead, but I didn't
> know if putting 5,000 inserts into a transaction created overhead for
> that transaction. Hopefully my question makes sense. Thanks,

Well, it depends on the schema to some extent.  If the table
has foreign keys, there was a problem (it's been fixed, but I don't
know in which version) with the deferred trigger manager on long
transactions.  1k or 5k rows is probably okay in either case.
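The grouping pattern being discussed can be sketched as follows. This is an
illustrative example, not from the original thread: it uses Python's stdlib
sqlite3 so it runs standalone, but the same structure applies with a
PostgreSQL driver such as psycopg2 (swap the connection and the `?`
placeholders for `%s`). The table name, column, and batch size are all
made up for the demonstration.

```python
import sqlite3

def batched_insert(conn, rows, batch_size=1000):
    """Insert rows in batches, committing once per batch
    rather than once per row, to avoid per-insert
    transaction overhead."""
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        # each executemany runs inside the current implicit
        # transaction; the commit below ends it
        cur.executemany("INSERT INTO items (val) VALUES (?)", batch)
        conn.commit()  # one commit per batch of 1000, not per row

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (val INTEGER)")
rows = [(i,) for i in range(10_000)]
batched_insert(conn, rows, batch_size=1000)
count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 10000
```

Whether 1,000 or 5,000 per batch is faster is workload-dependent, as the
reply notes; the main win is simply not committing after every single row.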


