Re: Ragged CSV import - Mailing list pgsql-hackers

From Nikhil Sontakke
Subject Re: Ragged CSV import
Date
Msg-id a301bfd90909100138j628fcfb2oc2fb3a2bbe002fda@mail.gmail.com
In response to Re: Ragged CSV import  ("Kevin Grittner" <Kevin.Grittner@wicourts.gov>)
List pgsql-hackers
Hi,

> the two most useful are to read in only some of the defined columns,
> and to output to a separate disk file any rows which failed to match
> the expected format. The latter would not cause the copy to fail
> unless the count of such rows exceeded a user-specified threshold.
>

+1

The ability to divert rows that would otherwise be discarded due to
constraint violations, bad column input, etc. sounds like a big help
when doing large copy operations.
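To make the reject-file-with-threshold idea concrete, here is a minimal
client-side sketch in Python (the function name and exact threshold
semantics are my own illustration, not proposed COPY syntax): rows with
the wrong column count are diverted to a reject list, and the run only
fails once the reject count exceeds the user-specified threshold.

```python
import csv

def split_ragged(lines, expected_cols, max_rejects):
    """Separate well-formed CSV rows from ragged ones.

    Rows whose column count differs from expected_cols are collected
    separately (analogous to the proposed reject disk file).  Raises
    ValueError only once the reject count exceeds max_rejects,
    mirroring the user-specified threshold discussed above.
    """
    accepted, rejected = [], []
    for row in csv.reader(lines):
        if len(row) == expected_cols:
            accepted.append(row)
        else:
            rejected.append(row)
            if len(rejected) > max_rejects:
                raise ValueError("reject threshold exceeded")
    return accepted, rejected
```

With a threshold of 1, a single short row is quietly set aside and the
remaining rows still load; a second bad row would abort the run.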

Another useful capability would be transforming input columns via SQL
expressions before loading them into the table. Given the way UPDATE
works, this would avoid the unnecessary bloat that comes from
subsequently fine-tuning some of the columns.
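The transform-before-load idea, again sketched client-side in Python
rather than as hypothetical COPY syntax (column roles and transforms
below are illustrative, not from the thread): each column gets an
optional transform applied before the row ever reaches the table, so no
follow-up UPDATE, and hence no dead tuples, is needed.

```python
def transform_row(row, transforms):
    """Apply one transform per column before loading.

    transforms is a list parallel to the row; None means load the
    value as-is.  This stands in for per-column SQL expressions
    evaluated during COPY.
    """
    return [v if f is None else f(v) for f, v in zip(transforms, row)]

# Trim a text column, keep one unchanged, cast one to int:
row = transform_row(["  Alice ", "active", "42"], [str.strip, None, int])
# -> ["Alice", "active", 42]
```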

Regards,
Nikhils
-- 
http://www.enterprisedb.com

