Re: bulk inserts - Mailing list pgsql-general

From Sam Mason
Subject Re: bulk inserts
Date
Msg-id 20090928205236.GD5407@samason.me.uk
In response to bulk inserts  (Dave Huber <DHuber@letourneautechnologies.com>)
Responses Re: bulk inserts  (Dave Huber <DHuber@letourneautechnologies.com>)
Re: bulk inserts  (Martin Gainty <mgainty@hotmail.com>)
List pgsql-general
On Mon, Sep 28, 2009 at 10:38:05AM -0500, Dave Huber wrote:
> Using COPY is out of the question as the file is not formatted for
> that and since other operations need to occur, the file needs to be
> read sequentially anyway.

Just to expand on what Martin said: if you can generate a set of EXECUTE
commands, you can certainly generate a COPY command to insert the same
data.  The advantage is a large drop in parse time when inserting larger
numbers of rows.  Since you say you want to insert 500 rows, I'd
suggest at least trying to get COPY working.
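
As a rough illustration (a minimal sketch, not your code: it assumes a
Python client using psycopg2 and a hypothetical table readings(ts, value);
the file name and field layout are made up too), you can still read the
file sequentially and do whatever per-row work you need, then stream the
results through one COPY instead of 500 separate INSERTs:

  # Minimal sketch; table/column names and input format are assumptions.
  import io
  import psycopg2

  conn = psycopg2.connect("dbname=mydb")  # adjust connection parameters

  buf = io.StringIO()
  with open("data.log") as f:
      for line in f:
          # Read the file sequentially and do any other per-row
          # processing here, then append the row to the COPY buffer
          # in tab-separated form.
          ts, value = line.strip().split(",")[:2]
          buf.write(f"{ts}\t{value}\n")

  buf.seek(0)
  with conn.cursor() as cur:
      # One COPY carries all 500 rows in a single statement, so the
      # server parses one command rather than 500 INSERTs.
      cur.copy_from(buf, "readings", sep="\t", columns=("ts", "value"))
  conn.commit()

copy_from there is just psycopg2's wrapper around COPY ... FROM STDIN;
any client library with an equivalent interface would do the same job.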

--
  Sam  http://samason.me.uk/
