Re: Bulkloading using COPY - ignore duplicates? - Mailing list pgsql-hackers

From Zeugswetter Andreas SB SD
Subject Re: Bulkloading using COPY - ignore duplicates?
Date
Msg-id 46C15C39FEB2C44BA555E356FBCD6FA41EB3A0@m0114.s-mxs.net
In response to Bulkloading using COPY - ignore duplicates?  (Lee Kindness <lkindness@csl.co.uk>)
Responses Re: Bulkloading using COPY - ignore duplicates?
List pgsql-hackers
> IMHO, you should copy into a temporary table and then do a select
> distinct from it into the table that you want.

Which would be way too slow for normal operation :-(
We are talking about an "as fast as possible" data load from a flat file
that may have duplicates (or even data errors, but that
is another issue).
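
For reference, the staging-table approach quoted above looks roughly like
this; a sketch only, where "target", "id" and the file path are made-up
names and the key-based variant relies on PostgreSQL's DISTINCT ON:

    -- staging table with the same columns as the target (no constraints)
    CREATE TEMP TABLE staging AS SELECT * FROM target WHERE false;

    -- fast bulk load; duplicates are tolerated here
    COPY staging FROM '/tmp/data.txt';

    INSERT INTO target
    SELECT DISTINCT * FROM staging;   -- collapses fully duplicate rows;
                                      -- DISTINCT ON (id) would collapse
                                      -- duplicate keys instead

    DROP TABLE staging;

The extra write and re-read of every row is exactly the overhead being
objected to here.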

Andreas

