Well, you could always create a staging table with no uniqueness
constraints and COPY into that one. Then run a query that inserts the
data into your production table, handling the duplicates as it goes.
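A rough sketch of what I mean (the table and column names here are made up;
assume "stuff" has a key column "id", and adjust the column list to match
your real schema):

```sql
-- Staging table: same columns as "stuff", but no primary key or
-- unique constraints, so COPY never rejects anything.
CREATE TABLE stuff_load (id integer, val text);

COPY stuff_load FROM 'stuff.txt' USING DELIMITERS '|';

-- First, stash the rows that would collide with existing keys,
-- so you keep a record of what wasn't processed.
SELECT sl.* INTO stuff_rejects
FROM stuff_load sl
WHERE EXISTS (SELECT 1 FROM stuff s WHERE s.id = sl.id);

-- Then move over only the rows that won't violate the key.
INSERT INTO stuff
SELECT * FROM stuff_load sl
WHERE NOT EXISTS (SELECT 1 FROM stuff s WHERE s.id = sl.id);
```

Note this only catches collisions against the production table; if
stuff.txt contains duplicates within itself you'd also want a
SELECT DISTINCT ON (id) (or similar) when pulling from the staging table.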
Adam Lang
Systems Engineer
Rutgers Casualty Insurance Company
----- Original Message -----
From: "Matthew Kennedy" <mkennedy@hssinc.com>
To: <pgsql-general@postgresql.org>
Sent: Tuesday, August 29, 2000 10:15 AM
Subject: [GENERAL] Ignore when using COPY FROM
> I have a ton of data in a text delimited file from an old legacy system.
> When uploading it into postgres, I'd do something like this:
>
> COPY stuff FROM 'stuff.txt' USING DELIMITERS '|';
>
> The problem is some of the rows in stuff.txt may not conform to the
> stuff table attributes (duplicate keys, for example). The command above
> terminates with no change to the table stuff when it encounters an error
> in stuff.txt. Is there a way to have postgres ignore erroneous rows
> but keep information about which rows weren't processed? I believe
> Oracle has some support for this through an IGNORE clause.