Yeah, but he doesn't want to have to create the table beforehand; he wants
PostgreSQL to do all the work. ;)
The only way I know of to do that is to write a program and feed it the
CSV file... but the program is still going to need to create the table.
Sometimes you just have to do the work yourself. :)
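That said, if he really wants the table built for him, a rough sketch of
that half of the job might look something like the script below. It just
reads the first line of the CSV as column names and declares every column
as text (the script name, table name, and the everything-is-text assumption
are mine, so adjust to taste); the data itself would still get loaded with
the script and the COPY shown in Lamar's mail below.
------
#!/usr/bin/perl
# Rough sketch: build a CREATE TABLE statement from the first line of a
# CSV file. Every column is declared as text; fix the types by hand.
use strict;

my ($table, $file) = @ARGV;
die "usage: $0 tablename file.csv\n" unless $table && $file;

open(my $fh, '<', $file) or die "can't open $file: $!\n";
my $header = <$fh>;
close($fh);

$header =~ s/\r?\n$//;          # strip the line ending
$header =~ s/"//g;              # drop quotes around column names
my @cols = split /,/, $header;  # assumes no commas inside column names

print "CREATE TABLE \"$table\" (\n"
    . join(",\n", map { "    \"$_\" text" } @cols)
    . "\n);\n";
------
Then something like "perl makecreate.pl mytable data.csv | psql mydb" would
create the table, and steps 4-6 below would fill it.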
Adam Lang
Systems Engineer
Rutgers Casualty Insurance Company
http://www.rutgersinsurance.com
----- Original Message -----
From: "Lamar Owen" <lamar.owen@wgcr.org>
To: "mike" <mike@cyborg-group.com>
Cc: <pgsql-interfaces@postgresql.org>
Sent: Thursday, January 18, 2001 3:48 PM
Subject: Re: [INTERFACES] importing csv files into postgres
> mike wrote:
>
> > Does anyone know if you can import csv text files into postgres
>
> A perl script to do that was just posted to the GENERAL List....
> Hmmmm....
>
> Tim Uckun just posted the following:
> -------
> Here are the steps.
> 1) Create your destination table in Postgres
> 2) Export your data to a comma-delimited format
> 3) Save this Perl script to a file
>
> ------
> #! /usr/bin/perl
> $inquotes = 0;
> while (<>){
>     # strip the trailing CR/LF (handles both DOS and Unix line endings)
>     s/\r?\n$//;
>
>     # this first bit goes through and replaces all the
>     # commas that are not in quotes with tildes
>     for ($i = 0; $i < length($_); $i++){
>         $char = substr($_, $i, 1);
>         if ($char eq '"'){
>             $inquotes = not($inquotes);
>         } else {
>             if ((!$inquotes) && ($char eq ",")){
>                 substr($_, $i, 1) = "~";
>             }
>         }
>     }
>     # this removes the quotes themselves
>     s/"//g;
>     print "$_\n";
> }
> -----
>
> 4) Type this: cat myfile.txt | perl myprog.pl > outfile.dat
> 5) Go into psql
> 6) COPY "tablename" FROM '/path/to/outfile.dat' USING DELIMITERS '~';
> -----
>
> See if that helps any.
> --
> Lamar Owen
> WGCR Internet Radio
> 1 Peter 4:11