Re: How import big amounts of data? - Mailing list pgsql-performance

From Ron
Subject Re: How import big amounts of data?
Msg-id 6.2.5.6.0.20051229085429.01db84d8@earthlink.net
In response to How import big amounts of data?  (Arnau <arnaulist@andromeiberica.com>)
List pgsql-performance
At 04:48 AM 12/29/2005, Arnau wrote:
>Hi all,
>
>   Which is the best way to import data into tables? I have to import
> 90000 rows into a table, and doing it as inserts takes ages. Would it
> be faster with COPY? Is there any other alternative to INSERT/COPY?
Compared to some imports, 90K rows is not that large.

Assuming you want the table(s) to be in some sorted order when you
are done, the fastest way to import a sufficiently large amount of data
is (a rough SQL sketch follows the list):
-put the new data into a temp table (works best if temp table fits into RAM)
-merge the rows from the original table and the temp table into a new table
-create the indexes you want on the new table
-DROP the old table and its indexes
-rename the new table and its indexes to replace the old ones.
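In SQL the sorted-merge version might look something like this; the
table name "orders", sort column "order_id", index names, and the file
path are hypothetical placeholders, so adjust them to your schema:

   CREATE TEMP TABLE staging (LIKE orders);
   COPY staging FROM '/path/to/new_rows.csv' WITH CSV;   -- load the new data (or use \copy from psql)

   CREATE TABLE orders_new AS
       SELECT * FROM orders
       UNION ALL
       SELECT * FROM staging
       ORDER BY order_id;                                 -- merge in sorted order

   CREATE INDEX orders_new_order_id_idx ON orders_new (order_id);

   BEGIN;                                                 -- swap in one transaction
   DROP TABLE orders;
   ALTER TABLE orders_new RENAME TO orders;
   ALTER INDEX orders_new_order_id_idx RENAME TO orders_order_id_idx;
   COMMIT;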

If you _don't_ care about having the table in some sorted order (only
the differing steps are sketched after the list):
-put the new data into a new table
-COPY the old data to the new table
-create the indexes you want on the new table
-DROP the old table and its indexes
-rename the new table and its indexes to replace the old ones
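Only the first two steps differ from the sketch above (same hypothetical
names; inside the database the old rows are appended with INSERT ... SELECT
rather than COPY):

   CREATE TABLE orders_new (LIKE orders);
   COPY orders_new FROM '/path/to/new_rows.csv' WITH CSV;  -- new data first
   INSERT INTO orders_new SELECT * FROM orders;             -- then append the old rows
   -- then create indexes, DROP the old table, and rename, as above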

Either of these procedures will also minimize your downtime while you
are importing.

If you don't want to go to all the trouble of either of the above, at
least DROP your indexes, do your INSERTs in batches (many rows per
transaction), and rebuild your indexes afterwards.
Doing 90K individual INSERTs should usually be avoided.
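A minimal sketch of that fallback, with the same hypothetical names (a
single COPY is shown; grouping a few thousand INSERTs per transaction
is the next best thing):

   DROP INDEX orders_order_id_idx;

   BEGIN;
   COPY orders FROM '/path/to/new_rows.csv' WITH CSV;   -- or one batch of INSERTs per transaction
   COMMIT;

   CREATE INDEX orders_order_id_idx ON orders (order_id);
   ANALYZE orders;                                       -- refresh planner statistics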

cheers,
Ron


