Recommended technique for large imports? - Mailing list pgsql-general

From: Stephen Bacon
Subject: Recommended technique for large imports?
Msg-id: 1032038564.4025.37.camel@babylon.13x.com
Responses: Re: Recommended technique for large imports? (Jeff Davis <list-pgsql-general@empires.org>)
List: pgsql-general
Hello,

I'm running a Tomcat-based web app (Tomcat 3.3 under Linux 7.3) with
PostgreSQL (7.2.1) as the back end, and I need to add new import
functionality. From previous importer experience with this site, I'm
worried that the import can take so long that the user's browser times
out waiting for the process to complete. That only ever happens when
they're importing a lot of records while the system is under heavy
load: the main set of tables has a lot of indexes, so the loop/insert
method can take a while.

Of course the data gets in there, but the user can still end up with a
404-type error, and no one likes to see that.

Now I know the COPY command is much faster because it avoids the
per-row INSERT overhead, but building that command and passing it via
JDBC seems iffy (or from C, PHP, etc., for that matter).
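
For reference, later versions of the PostgreSQL JDBC driver expose COPY
directly through org.postgresql.copy.CopyManager, so the rows can be
streamed from Java without assembling one giant SQL string. A rough
sketch of how that looks; the connection URL, credentials, and the
import_target table are invented for illustration:

  import java.io.StringReader;
  import java.sql.Connection;
  import java.sql.DriverManager;

  import org.postgresql.PGConnection;
  import org.postgresql.copy.CopyManager;

  public class CopyImport {
      public static void main(String[] args) throws Exception {
          // Connection details are placeholders.
          Connection conn = DriverManager.getConnection(
                  "jdbc:postgresql://localhost/mydb", "user", "password");

          // CopyManager streams rows over the COPY protocol instead of
          // issuing one INSERT per row.
          CopyManager copy = conn.unwrap(PGConnection.class).getCopyAPI();

          // Default COPY text format: tab-separated columns, one row per line.
          String rows = "1\tAlice\n2\tBob\n";
          long count = copy.copyIn(
                  "COPY import_target (id, name) FROM STDIN",
                  new StringReader(rows));
          System.out.println("Copied " + count + " rows");

          conn.close();
      }
  }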

Can anyone give a recommended technique for this sort of process?

Basically (I think) I need to do something like:

 Start transaction
 Turn off indexing for this transaction
 loop 1..n
   insert record X
 end loop
 Turn indexing back on
 Commit / End transaction
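
As far as I know, PostgreSQL has no per-transaction switch that turns
indexes off, so the closest approximation seems to be: drop the
indexes, insert the rows in one batched transaction, and recreate the
indexes before committing. DDL is transactional in PostgreSQL, so a
failed import rolls the whole thing back, dropped index included. A
JDBC sketch along those lines; the table name, index name, and
connection details are made up:

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.PreparedStatement;
  import java.sql.Statement;

  public class BulkImport {
      public static void main(String[] args) throws Exception {
          // Connection details, table and index names are placeholders.
          Connection conn = DriverManager.getConnection(
                  "jdbc:postgresql://localhost/mydb", "user", "password");
          conn.setAutoCommit(false); // one transaction for the whole import

          // Drop the index up front; since DDL is transactional, it comes
          // back automatically if the import is rolled back.
          Statement ddl = conn.createStatement();
          ddl.execute("DROP INDEX import_target_name_idx");

          PreparedStatement ps = conn.prepareStatement(
                  "INSERT INTO import_target (id, name) VALUES (?, ?)");
          for (int i = 1; i <= 10000; i++) {
              ps.setInt(1, i);
              ps.setString(2, "row " + i);
              ps.addBatch();
              if (i % 1000 == 0) {
                  ps.executeBatch(); // send rows to the server in chunks
              }
          }
          ps.executeBatch(); // flush any remaining rows
          ps.close();

          // Rebuild the index once, instead of updating it per row.
          ddl.execute("CREATE INDEX import_target_name_idx" +
                  " ON import_target (name)");
          ddl.close();

          conn.commit(); // data and index become visible together
          conn.close();
      }
  }

Note that DROP INDEX holds an exclusive lock on the table for the rest
of the transaction, so this trades concurrency for load speed.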

thanks,
  -Steve

(apologies for the cross-post, but I figured it's not specifically
JDBC-related)


