Re: skipping records with duplicate key during COPY - Mailing list pgsql-novice

From Ludwig Lim
Subject Re: skipping records with duplicate key during COPY
Msg-id 20021219022913.81319.qmail@web80310.mail.yahoo.com
In response to Re: skipping records with duplicate key during COPY  (Ron Johnson <ron.l.johnson@cox.net>)
Responses Re: skipping records with duplicate key during COPY
List pgsql-novice
--- Ron Johnson <ron.l.johnson@cox.net> wrote:
> On Wed, 2002-12-18 at 13:16, Devinder K Rajput wrote:
> > Hi,
> >         I am using the COPY command to load a data table.  When I try to
> > insert a record with a duplicate key, I get the error message "cannot
> > insert a duplicate key into unique index" and no data is loaded into
> > the table.  I know that this is the way the COPY command works.  Now,
> > is there a way in which I can load a table and, if duplicate records do
> > come up, write them to an error file but still load the other good
> > records into the table?  *I think* one way of accomplishing this is by
> > performing inserts of individual records, but that would be very slow.
> > Any suggestions?
>
> Any method of inserting records where there is a unique index will
> be slow, since the index must be checked and populated.
>
> Your idea of doing individual inserts (via C, Python, or Perl) is
> a valid one, for the exact reason you state, and because input
> data is not always in COPY format...
>
>

  What about creating a BEFORE INSERT trigger that checks
for a duplicate key? The trigger can insert the duplicate
record into another table and "RETURN NULL" so that the row
is never inserted into the table with the unique index.
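
  A minimal sketch of that idea, assuming a target table
"items" with a unique column "id" and a pre-created holding
table "items_dups" of the same shape (both names are
placeholders, not from the original post):

```sql
-- Sketch only: "items" and "items_dups" are assumed names.
CREATE FUNCTION skip_dups() RETURNS trigger AS $$
BEGIN
    -- If a row with this key already exists, divert the
    -- incoming row into the holding table instead.
    IF EXISTS (SELECT 1 FROM items WHERE id = NEW.id) THEN
        INSERT INTO items_dups VALUES (NEW.*);
        RETURN NULL;   -- suppress the insert into items
    END IF;
    RETURN NEW;        -- no duplicate: let the row through
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER items_skip_dups
    BEFORE INSERT ON items
    FOR EACH ROW EXECUTE PROCEDURE skip_dups();
```

Since COPY fires per-row BEFORE INSERT triggers, the load
can complete with the rejected rows collected in the holding
table; the per-row check does add overhead, as Ron notes.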


ludwig.

