Re: Import csv file into multiple tables in Postgres - Mailing list pgsql-novice

From Deepblues
Subject Re: Import csv file into multiple tables in Postgres
Msg-id bd6cfb9e05030312106a92d646@mail.gmail.com
In response to Re: Import csv file into multiple tables in Postgres  (Keith Worthington <KeithW@NarrowPathInc.com>)
Responses Re: Import csv file into multiple tables in Postgres
Re: Import csv file into multiple tables in Postgres
List pgsql-novice
Thanks for all that information. My scenario looks like this:

I need to import an Excel spreadsheet into a PostgreSQL database.
I converted the spreadsheet into a CSV file, and I now have three
tables in my database into which this data needs to go. I'm new to
both Perl and PostgreSQL. Do you have a sample script I could look
at that loads a CSV file into an interface table?

Deepblues


On Tue, 01 Mar 2005 12:06:34 -0500, Keith Worthington
<KeithW@narrowpathinc.com> wrote:
> Sean Davis wrote:
>
> >
> > ----- Original Message -----
> > From: "Andrew Hammond" <ahammond@ca.afilias.info>
> > To: "Deepblues" <deepblues@gmail.com>
> > Cc: <pgsql-novice@postgresql.org>
> > Sent: Sunday, February 27, 2005 9:28 PM
> > Subject: Re: [NOVICE] Import csv file into multiple tables in Postgres
> >
> >
> >> The brief answer is no, you can not import from a single csv file
> >> into multiple tables.
> >>
> >> If the csv file consists of two distinct sections of data, then you
> >> could of course split it into two csv files. If what you want to do
> >> is normalize existing data, then you should first import the existing
> >> data into a working table. Then you can manipulate it within the
> >> database.
> >>
> >> It is unlikely that you will need perl to do any of this.
> >
> >
> > I use perl a lot for stuff like this, but have found that in most
> > cases the easiest thing to do is to load the data into a single
> > postgresql table and then write SQL that does the selects and
> > inserts to build the multiple tables.  This has the added advantage
> > that you get to keep a copy of the original data available in case
> > you don't put every column into the "working" database.  If you end
> > up doing this a lot, you can create a separate "loader" schema that
> > contains all of these raw csv tables in one place, not visible to
> > most users so as not to confuse users of the "working" schema.
> >
> > Sean
>
> I do nearly exactly as Sean has suggested all the time.
>
> We have a schema called data_transfer that contains all of the tables
> for accepting data.  (As an aside, the data is written into these
> tables by a shell script using the COPY command.)  Each of the tables
> has a trigger that fires and executes a function.  The function is
> responsible for taking the data out of the data_transfer table and
> putting it into one, two, or three target tables.  In addition to the
> advantage noted by Sean, you can also manipulate the data during this
> process, check for duplicates, dynamically change between INSERT and
> UPDATE commands, etc.
>
> --
> Kind Regards,
> Keith
>
>
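For illustration, the staging-table approach Andrew and Sean describe
above might look like the sketch below. The schema and the table and
column names (loader, csv_staging, customers, orders) are hypothetical,
since the actual spreadsheet layout isn't shown in this thread.

    -- A staging table whose columns mirror the CSV file verbatim.
    -- Assumes customers(id, name, email) and orders(customer_id,
    -- order_date, total) already exist as the target tables.
    CREATE SCHEMA loader;
    CREATE TABLE loader.csv_staging (
        customer_name  text,
        customer_email text,
        order_date     date,
        order_total    numeric
    );

    -- From psql, \copy reads the file on the client side.
    \copy loader.csv_staging FROM 'data.csv' WITH CSV HEADER

    -- Normalize into the target tables with plain INSERT ... SELECT.
    INSERT INTO customers (name, email)
    SELECT DISTINCT customer_name, customer_email
      FROM loader.csv_staging;

    INSERT INTO orders (customer_id, order_date, total)
    SELECT c.id, s.order_date, s.order_total
      FROM loader.csv_staging s
      JOIN customers c ON c.email = s.customer_email;

Keeping the staging table around afterwards gives you the copy of the
original data that Sean mentions.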
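Keith's trigger-based scheme might be sketched like this, again with
hypothetical names (data_transfer.incoming as the receiving table,
customers as the target); the IF EXISTS test is what lets the function
switch between INSERT and UPDATE as he describes.

    -- Fires once per row loaded into the data_transfer table and
    -- routes the data into the target table, updating rather than
    -- inserting when the key already exists.
    CREATE FUNCTION data_transfer_route() RETURNS trigger AS $$
    BEGIN
        IF EXISTS (SELECT 1 FROM customers
                    WHERE email = NEW.customer_email) THEN
            UPDATE customers
               SET name = NEW.customer_name
             WHERE email = NEW.customer_email;
        ELSE
            INSERT INTO customers (name, email)
            VALUES (NEW.customer_name, NEW.customer_email);
        END IF;
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER incoming_route
        AFTER INSERT ON data_transfer.incoming
        FOR EACH ROW EXECUTE PROCEDURE data_transfer_route();

The shell script Keith mentions would then just run COPY (or psql's
\copy) against data_transfer.incoming, and the trigger does the rest.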


--
Deepti Mandava
Graduate Teaching Assistant,
The University of Kansas
Lawrence, KS  66045-7546
