Re: Import csv file into multiple tables in Postgres - Mailing list pgsql-novice

From Sean Davis
Subject Re: Import csv file into multiple tables in Postgres
Date
Msg-id 008401c51d89$0868fb30$1f6df345@WATSON
In response to Import csv file into multiple tables in Postgres  (Deepblues <deepblues@gmail.com>)
Responses Re: Import csv file into multiple tables in Postgres
List pgsql-novice
----- Original Message -----
From: "Andrew Hammond" <ahammond@ca.afilias.info>
To: "Deepblues" <deepblues@gmail.com>
Cc: <pgsql-novice@postgresql.org>
Sent: Sunday, February 27, 2005 9:28 PM
Subject: Re: [NOVICE] Import csv file into multiple tables in Postgres


> The brief answer is no: you cannot import from a single csv file into
> multiple tables.
>
> If the csv file consists of two distinct sections of data, then you could
> of course split it into two csv files. If what you want to do is normalize
> existing data, then you should first import the existing data into a
> working table. Then you can manipulate it within the database.
>
> It is unlikely that you will need perl to do any of this.
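
For example, a minimal (untested) sketch of that first step; the table
layout and file path here are invented, and the real columns would mirror
the csv file:

    -- one working table whose columns match the csv
    CREATE TABLE raw_orders (
        customer_name  text,
        customer_email text,
        order_date     date,
        order_total    numeric
    );

    -- server-side COPY reads a file on the database server; from psql,
    --   \copy raw_orders from 'orders.csv' with csv header
    -- reads a client-side file instead
    COPY raw_orders FROM '/tmp/orders.csv' WITH CSV HEADER;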

I use perl a lot for stuff like this, but I have found that in most cases
the easiest thing to do is to load the data into a single PostgreSQL table
and then write SQL that selects from that table and inserts into the
multiple target tables.  This has the added advantage that you keep a copy
of the original data in case you don't carry every column over into the
"working" tables.  If you end up doing this a lot, you can create a
separate "loader" schema that holds all of these raw csv tables in one
place, hidden from most users so it doesn't clutter the "working" schema.
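
As an untested sketch of that pattern (the schema, table, and column names
are all invented here), the raw table goes into its own schema and plain
INSERT ... SELECT statements do the normalization:

    -- keep raw csv loads in their own schema, out of everyday view
    CREATE SCHEMA loader;

    CREATE TABLE loader.raw_orders (
        customer_name  text,
        customer_email text,
        order_date     date,
        order_total    numeric
    );

    COPY loader.raw_orders FROM '/tmp/orders.csv' WITH CSV HEADER;

    -- split the flat data into the normalized "working" tables, assumed
    -- here to be customers(customer_id, name, email) and
    -- orders(customer_id, order_date, total)
    INSERT INTO customers (name, email)
    SELECT DISTINCT customer_name, customer_email
    FROM loader.raw_orders;

    INSERT INTO orders (customer_id, order_date, total)
    SELECT c.customer_id, r.order_date, r.order_total
    FROM loader.raw_orders r
    JOIN customers c ON c.email = r.customer_email;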

Sean


