Re: Better way to bulk-load millions of CSV records into - Mailing list pgsql-novice

From: Ron Johnson
Subject: Re: Better way to bulk-load millions of CSV records into
Date: Wed, 2002-05-22
Msg-id: 1022093505.19121.48.camel@rebel
In response to: Re: Better way to bulk-load millions of CSV records into (Marc Spitzer <marc@oscar.eng.cv.net>)
List: pgsql-novice
On Wed, 2002-05-22 at 13:11, Marc Spitzer wrote:
> On Wed, May 22, 2002 at 12:48:58PM -0500, Ron Johnson wrote:
> > On Wed, 2002-05-22 at 11:18, Marc Spitzer wrote:
> > > On Wed, May 22, 2002 at 09:19:31AM -0500, Tom Sheehan wrote:
[snip]
> for i in load_data/* ;do
> echo "datafile $i"
> awk -F, 'BEGIN{OFS=","}{if ($15~/[.]/){$15="-1"; $0=$0} print $0}' $i >$i.tmp
> mv $i.tmp $i
> grep -E "[0-9]+([.][0-9]+)+" $i
> grep -vE "[0-9]+([.][0-9]+)+" $i >$i.tmp
> mv $i.tmp $i
> echo "copy call_me_bob from '/home/marc/projects/bobs_house/$i' using Delimiters ',' with null $
> done
[snip]

I'm not an awk programmer.  What does that command do?
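
My best guess, after squinting at the awk man page (correct me if
I'm mis-reading it):

    awk -F, '                # split each input line on commas
      BEGIN { OFS = "," }    # and rejoin fields with commas on output
      {
        if ($15 ~ /[.]/) {   # if field 15 contains a literal dot,
          $15 = "-1"         # overwrite the whole field with -1
          $0 = $0            # (force the record to be rebuilt?)
        }
        print $0             # print every record, changed or not
      }' $i >$i.tmp

And then the first grep just *shows* any lines that still contain
dotted numbers, while the second one writes only the clean lines
back to the file?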

Also, all my fields have double-quotes around them.  Is there
a tool (or really clever use of sed) that will strip them
away from the fields that don't need them?  I actually have
_comma_ delimited files, and any fields with commas in them
need the double quotes...
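
The closest I've come up with on my own is something like this
(completely untested, and it assumes no field ever contains an
embedded double-quote):

    # strip the quotes from any quoted field with no comma inside it;
    # quoted fields that do contain commas are left alone
    sed 's/"\([^",]*\)"/\1/g' datafile > datafile.tmp

which should turn

    "abc","1,234","def"

into

    abc,"1,234",def

Would that be safe, or am I missing an obvious failure case?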

--
+---------------------------------------------------------+
| Ron Johnson, Jr.        Home: ron.l.johnson@cox.net     |
| Jefferson, LA  USA      http://ronandheather.dhs.org:81 |
|                                                         |
| "I have created a government of whirled peas..."        |
|   Maharishi Mahesh Yogi, 12-May-2002,                   |
|   CNN, Larry King Live                                  |
+---------------------------------------------------------+

