Re: populate table with large csv file - Mailing list pgsql-general

From Ron Johnson
Subject Re: populate table with large csv file
Msg-id 1064509077.1441.57.camel@haggis
In response to populate table with large csv file  ("Dave [Hawk-Systems]" <dave@hawk-systems.com>)
List pgsql-general
On Thu, 2003-09-25 at 11:38, Dave [Hawk-Systems] wrote:
> have the table "numbercheck"
>      Attribute |    Type    | Modifier
>     -----------+------------+----------
>      svcnumber | integer    | not null
>      svcqual   | varchar(9) |
>      svcequip  | char(1)    |
>      svctroub  | varchar(6) |
>      svcrate   | varchar(4) |
>      svcclass  | char(1)    |
>      trailer   | varchar(3) |
>     Index: numbercheck_pkey
>
> also have a csv file
>     7057211380,Y,,,3,B
>     7057216800,Y,,,3,B
>     7057265038,Y,,,3,B
>     7057370261,Y,,,3,B
>     7057374613,Y,,,3,B
>     7057371832,Y,,,3,B
>     4166336554,Y,,,3,B
>     4166336863,Y,,,3,B
>     7057201148,Y,,,3,B
>
> aside from parsing the csv file through a PHP interface, what is the easiest way
> to get that csv data imported into the postgres database. thoughts?

No matter what you do, it's going to barf: svcnumber is a 32-bit
integer, and 7,057,211,380 is significantly out of range.
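A quick way to convince yourself from psql:

    SELECT 2147483647::integer;   -- the int4 maximum; this works
    SELECT 7057211380::integer;   -- ERROR:  integer out of range
    SELECT 7057211380::bigint;    -- fits fine as a 64-bit value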

Once you change svcnumber to bigint, the COPY command will easily
suck in the csv file.
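Roughly something like this (untested; the file path is made up, and I'm
assuming the six fields in the file map to the first six columns of the
table, in order):

    -- rebuild the table with svcnumber as bigint so ten-digit
    -- phone numbers fit
    CREATE TABLE numbercheck (
        svcnumber bigint NOT NULL PRIMARY KEY,
        svcqual   varchar(9),
        svcequip  char(1),
        svctroub  varchar(6),
        svcrate   varchar(4),
        svcclass  char(1),
        trailer   varchar(3)
    );

    -- server-side load; the file must be readable by the backend
    COPY numbercheck (svcnumber, svcqual, svcequip, svctroub, svcrate, svcclass)
        FROM '/tmp/numbercheck.csv' WITH DELIMITER ',';

    -- or, from psql, to read the file on the client side instead:
    -- \copy numbercheck (svcnumber, svcqual, svcequip, svctroub, svcrate, svcclass)
    --     from 'numbercheck.csv' with delimiter ','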

--
-----------------------------------------------------------------
Ron Johnson, Jr. ron.l.johnson@cox.net
Jefferson, LA USA

"Python is executable pseudocode; Perl is executable line noise"

