On Apr 13, 2005, at 4:12 AM, Dawid Kuroczko wrote:
> On 4/12/05, Matt Van Mater <matt.vanmater@gmail.com> wrote:
>> I've been experimenting with loading a large amount of data into a
>> fairly simple database using both psql and perl prepared statements.
>> Unfortunately I'm seeing no appreciable difference between the two
>> methods, whereas I was under the impression that prepared statements
>> should be much faster (in my case, they are slightly slower).
>
> I've been playing with a similar issue, and in my case the best
> solution for bulk inserts was using perl to format the data in a form
> suitable for the COPY command.
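That perl-to-COPY formatting might be sketched like this (a minimal
sketch; the field layout and values are invented for illustration).
COPY's default text format expects one row per line, tab-separated
columns, \N for NULL, and backslash-escaped tabs, newlines, and
backslashes:

```perl
#!/usr/bin/perl
# Sketch: emit rows in COPY's text format (hypothetical field layout).
use strict;
use warnings;

sub copy_escape {
    my ($v) = @_;
    return '\N' unless defined $v;   # COPY's NULL marker
    $v =~ s/\\/\\\\/g;               # escape backslashes first
    $v =~ s/\t/\\t/g;
    $v =~ s/\n/\\n/g;
    $v =~ s/\r/\\r/g;
    return $v;
}

# Example rows; in practice these would come from parsing your input.
my @rows = (
    [ 1, "alpha", "has\ttab" ],
    [ 2, undef,   "plain" ],
);

# Prints tab-separated lines: the embedded tab becomes \t, undef becomes \N.
for my $row (@rows) {
    print join("\t", map { copy_escape($_) } @$row), "\n";
}
```

The output can then be piped straight into something like
`psql -c "COPY mytable FROM STDIN"` (table name made up here).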
I second this approach. Generally, getting the data into the database
can be done VERY quickly (for the 18k rows you have, the COPY itself
would likely be near-instantaneous). I often create a separate "loader"
schema into which I load text files. Then I can use SQL, triggers, or
functions to "clean up" the data, enforce referential integrity, etc.
within the database. If you have perl code that does this cleanup, you
can probably modify it slightly into a pl/perl function that does the
same thing, but now it runs on the server side and will probably be
significantly faster.
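The loader-schema pattern might look roughly like this (every schema,
table, column, and function name below is invented for illustration;
adjust the cleanup logic to your data):

```sql
-- Staging area for raw text-file loads.
CREATE SCHEMA loader;

CREATE TABLE loader.raw_events (
    id      text,
    name    text,
    payload text
);

-- Server-side COPY; use psql's \copy instead if the file is on the client.
COPY loader.raw_events FROM '/tmp/events.txt';

-- A hypothetical pl/perl cleanup function, as mentioned above.
CREATE OR REPLACE FUNCTION loader.clean_name(raw text) RETURNS text AS $$
    my ($raw) = @_;
    return undef unless defined $raw;
    $raw =~ s/^\s+|\s+$//g;   # trim leading/trailing whitespace
    return lc $raw;
$$ LANGUAGE plperl;

-- Clean up and enforce integrity while moving rows into the real table.
INSERT INTO public.events (id, name, payload)
SELECT id::integer, loader.clean_name(name), payload
FROM loader.raw_events
WHERE id ~ '^[0-9]+$';
```

The point of the staging table is that the fast COPY is unconstrained,
and all validation happens in one set-based SQL pass afterward.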
Sean