On June 23, 2005 03:27 pm, David Bear wrote:
> I'm finding that \copy is very brittle. It seems to stop for every
> little reason. Is there a way to tell it to be more forgiving -- for
> example, to ignore extra data fields that might exist on a line?
>
> Or, to have it just skip the offending record but continue on to the
> next.
>
> I've got a tab-delimited file, but if \copy sees any extra tabs in the
> file it just stops at that record. I want to be able to control what
> pg does when it hits an exception.
>
> I'm curious what others do for bulk data migration. Since copy seems
> so brittle, there must be a better way...
>
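For reference, here is a minimal sketch of the kind of \copy call being described (the table and file names are hypothetical; tab is already the default delimiter for \copy's text format):

    -- inside psql: stops at the first malformed line
    \copy mytable from 'data.txt'
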
You may use the '-d' option of pg_dump (long form '--inserts'), in which case it dumps the data as INSERT statements.
When you load the dumped data it will handle the tabs properly; any invalid records will fail individually, but the load
itself will run to completion.
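A sketch of the round trip (the database, table, and file names are hypothetical):

    # dump one table's data as INSERT statements rather than COPY
    pg_dump -d -t mytable sourcedb > mytable.sql
    # reload; each bad INSERT fails on its own, the rest go through
    psql targetdb -f mytable.sql
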
If you redirect the error output into a separate file, you can analyze later how many records failed.
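For example (file names are again hypothetical):

    # psql writes error messages to stderr, so capture them separately
    psql targetdb -f mytable.sql 2> load_errors.log
    # count the failed statements afterwards
    grep -c ERROR load_errors.log
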
Maybe that's what you need in your case.
The only problem I know of with this method is that it takes longer to load the data, since every record is fully parsed
and validated individually.
--
Vladimir Yevdokimov <vladimir@givex.com>