Re: prevent duplicate entries - Mailing list pgsql-novice

From David G Johnston
Subject Re: prevent duplicate entries
Date
Msg-id 1401372219275-5805419.post@n5.nabble.com
Whole thread Raw
In response to Re: prevent duplicate entries  (amul sul <sul_amul@yahoo.co.in>)
List pgsql-novice
amulsul wrote
> On Thursday, 29 May 2014 3:20 PM, Thomas Drebert <drebert@> wrote:
>
>> Does PostgreSQL have a separate function to prevent duplicate records?
>> At the moment I filter records in PHP.
>
> You can load CSV file data directly into a Postgres database using
> pg_bulkload, which has functionality to avoid duplication.
>
> pg_bulkload: http://pgbulkload.projects.pgfoundry.org/pg_bulkload.html
>
> Does this answer your question?
>
> Regards,
> Amul Sul

You might find it better to just load the CSV data into a staging table then
perform the necessary "INSERT INTO live ... SELECT ... FROM staging" query
to migrate only the new data.
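
A minimal sketch of that staging approach (table and column names here are hypothetical, and the CSV path is a placeholder; this assumes "live" has a unique key column "id"):

```sql
-- Stage the raw CSV in a throwaway table with the same shape as "live".
CREATE TEMP TABLE staging (LIKE live INCLUDING DEFAULTS);

-- Server-side COPY needs filesystem access on the server;
-- from psql you could use \copy instead for a client-side file.
COPY staging FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);

-- Migrate only rows whose key is not already present in "live",
-- so duplicates are filtered in one set-based query instead of
-- triggering per-row duplicate-key errors.
INSERT INTO live
SELECT s.*
FROM staging s
WHERE NOT EXISTS (
    SELECT 1 FROM live l WHERE l.id = s.id
);
```

If the CSV itself can contain internal duplicates, add a DISTINCT ON (or GROUP BY) over the staging rows before inserting.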

It likely will not make much sense to insert rows one at a time and let (say)
90% of your data eat resources generating duplicate-key errors when the
duplicates can be filtered out in a single query.

David J.




