Re: Better way to bulk-load millions of CSV records into postgres? - Mailing list pgsql-novice

From: Joel Burton
Subject: Re: Better way to bulk-load millions of CSV records into postgres?
Date:
Msg-id: JGEPJNMCKODMDHGOBKDNGEPMCOAA.joel@joelburton.com
In response to: Better way to bulk-load millions of CSV records into postgres?  (Ron Johnson <ron.l.johnson@cox.net>)
List: pgsql-novice
> -----Original Message-----
> From: pgsql-novice-owner@postgresql.org
> [mailto:pgsql-novice-owner@postgresql.org]On Behalf Of Ron Johnson
> Sent: Tuesday, May 21, 2002 4:40 PM
> To: PgSQL Novice ML
> Subject: [NOVICE] Better way to bulk-load millions of CSV records into
> postgres?
>
>
> Hi,
>
> Currently, I've got a python script using pyPgSQL that
> parses the CSV record, creates a string that is a big
> "INSERT INTO VALUES (...)" command, then, execute() it.
>
> top shows that this method uses postmaster with ~70% CPU
> utilization, and python with ~15% utilization.
>
> Still, it's only inserting ~190 recs/second.  Is there a
> better way to do this, or am I constrained by the hardware?
>
> Instead of python and postmaster having to do a ton of data
> xfer over sockets, I'm wondering if there's a way to send a
> large number of csv records (4000, for example) in one big
> chunk to a stored procedure and have the engine process it
> all.

You could change your Python script to output a COPY command, which is
*much* faster than individual INSERT commands.
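For instance, something along these lines (an untested sketch -- the table name, column list, and script name are placeholders you'd adapt to your schema) reads the CSV and writes a single COPY statement in PostgreSQL's text format, which you can pipe straight into psql:

    #!/usr/bin/env python
    """Convert a CSV file into one COPY statement that psql can run.

    'mytable' and 'col1'..'col3' are placeholders; adjust to your schema.
    Usage:  python csv2copy.py input.csv | psql mydb
    """
    import csv
    import sys

    def escape(field):
        # COPY's text format treats backslash, tab, and newline specially,
        # so escape them before writing the field out.
        return field.replace('\\', '\\\\').replace('\t', '\\t').replace('\n', '\\n')

    def main(path):
        out = sys.stdout
        out.write("COPY mytable (col1, col2, col3) FROM stdin;\n")
        with open(path, newline='') as fh:
            for row in csv.reader(fh):
                out.write('\t'.join(escape(f) for f in row) + '\n')
        out.write('\\.\n')  # end-of-data marker for COPY ... FROM stdin

    if __name__ == '__main__':
        main(sys.argv[1])

Because COPY streams all the rows through one command, the server skips the per-row parse/plan/network round trip that makes millions of individual INSERTs slow. You could also write the converted data to a file and load it with psql's \copy meta-command.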

- J.

Joel BURTON | joel@joelburton.com | joelburton.com | aim: wjoelburton
Knowledge Management & Technology Consultant

