Re: populate table with large csv file - Mailing list pgsql-general

From: Dave [Hawk-Systems]
Subject: Re: populate table with large csv file
Msg-id: DBEIKNMKGOBGNDHAAKGNIENDFBAC.dave@hawk-systems.com
In response to: Re: populate table with large csv file (P.J. "Josh" Rovero <rovero@sonalysts.com>)
Responses: Re: populate table with large csv file
List: pgsql-general
>> aside from parsing the csv file through a PHP interface, what is the
>> easiest way to get that csv data imported into the postgres database.
>> thoughts?
>
>Assuming the CSV file data is well formed, use psql and
>the COPY command.
>
>In psql, create the table.  Then issue the command:
>
>copy <tablename> from 'filename' using delimiters ',';
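
For the archives, here is roughly what that suggestion looks like end to
end in psql; the table layout and file path are hypothetical stand-ins
for our actual data:

    -- hypothetical table matching the CSV's three columns
    CREATE TABLE import_data (
        id     integer,
        name   text,
        amount numeric
    );

    -- bulk-load the file; USING DELIMITERS is the 7.x-era syntax
    COPY import_data FROM '/tmp/data.csv' USING DELIMITERS ',';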

A perfect solution that I had overlooked.

Unfortunately, processing the 143 MB file (which results in a database of
roughly 500 MB) takes an eternity.  As luck would have it, we can get away
with just dropping to an exec and doing a cat/grep for any data we need...
that takes 2-3 seconds.

The COPY command is definitely a keeper, as I am now looking at replacing
code elsewhere with a simpler model that uses it.
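
One caveat worth noting as I fold it in: COPY ... FROM 'filename' reads the
file on the database server and typically requires superuser rights, so if
the CSV lives on the client machine, psql's client-side \copy variant
should be the drop-in alternative (same hypothetical table as above):

    \copy import_data from '/local/path/data.csv' using delimiters ','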

Thanks

Dave


