Some CSV file-import questions - Mailing list pgsql-novice

From Ron Johnson
Subject Some CSV file-import questions
Date
Msg-id 1021806087.9171.18.camel@rebel
Responses Re: Some CSV file-import questions
List pgsql-novice
Hello,

If the CSV file generated by the application I am importing
from contains quotes around each field, must I write a program
to strip these "field-level" quotes before sending the file
to COPY?
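
(A minimal sketch of what such a pre-processing step might look
like, assuming comma-delimited input with double-quoted fields;
the file names foo.csv and foo.tab are only placeholders for
illustration, and COPY's default tab-delimited text format is
the target.)

    #!/usr/bin/env python
    # Sketch: strip per-field quotes from foo.csv and emit
    # tab-delimited output that COPY's default text format accepts.
    # File names and column handling are assumptions, not a tested tool.
    import csv

    with open("foo.csv", newline="") as src, open("foo.tab", "w") as dst:
        for row in csv.reader(src):          # csv module removes the quotes
            # Escape characters special to COPY's text format.
            cleaned = [f.replace("\\", "\\\\").replace("\t", "\\t") for f in row]
            dst.write("\t".join(cleaned) + "\n")

The result could then be loaded with something like
COPY foo FROM '/path/to/foo.tab';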

As we know, COPY is a single transaction.  Therefore, it
would be "unpleasant" if, say, the process that is doing the
importing dies 90% of the way through a 10,000,000 row table.
Is there a checkpoint mechanism that would do a COMMIT, for
example, every 10,000 rows?  Then, if the process that is doing
the importing dies 90% of the way through that 10,000,000 row
table, restarting the COPY would skip over the already-inserted
rows.
Here is an example from the RDBMS that I currently use:
$ bulkload -load -log -commit=10000 -tran=exclusive -db=test \
     -table=foo ~/foo.csv
Then, if something happens after inserting 9,000,000 rows,
it can be restarted by:
$ bulkload -load -log -commit=10000 -skip=9000000 -db=test \
     -tran=exclusive -table=foo ~/foo.csv
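
(For comparison, a rough sketch of what rolling your own loader
with a commit interval and a skip count might look like, using
the psycopg2 client library; the table name foo, the three-column
INSERT, and the file name are assumptions for illustration only.)

    #!/usr/bin/env python
    # Sketch of a restartable bulk loader: COMMIT every N rows and
    # accept a skip count so a failed run can be resumed.
    import csv
    import psycopg2

    COMMIT_EVERY = 10000
    SKIP = 0            # e.g. 9000000 when restarting after a failure

    conn = psycopg2.connect("dbname=test")
    cur = conn.cursor()

    with open("foo.csv", newline="") as src:
        for lineno, row in enumerate(csv.reader(src), start=1):
            if lineno <= SKIP:
                continue                  # already loaded in a previous run
            cur.execute("INSERT INTO foo VALUES (%s, %s, %s)", row)
            if lineno % COMMIT_EVERY == 0:
                conn.commit()             # checkpoint: at most N rows redone

    conn.commit()
    cur.close()
    conn.close()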

From what I've seen in the documentation and the mailing
list archives, the solution to both of these questions is
to roll my own bulk loader.

Ron
--
+---------------------------------------------------------+
| Ron Johnson, Jr.        Home: ron.l.johnson@cox.net     |
| Jefferson, LA  USA      http://ronandheather.dhs.org:81 |
|                                                         |
| "I have created a government of whirled peas..."        |
|   Maharishi Mahesh Yogi, 12-May-2002,                   |
|   CNN, Larry King Live                                  |
+---------------------------------------------------------+

