Re: how to make duplicate finding query faster? - Mailing list pgsql-admin

From Holger Jakobs
Subject Re: how to make duplicate finding query faster?
Date
Msg-id 5bf642e4-eb22-8eba-e96d-5b33b5010b03@jakobs.com
Whole thread Raw
In response to how to make duplicate finding query faster?  (Sachin Kumar <sachinkumaras@gmail.com>)
Responses Re: how to make duplicate finding query faster?  (Sachin Kumar <sachinkumaras@gmail.com>)
List pgsql-admin
On 30.12.20 at 08:36, Sachin Kumar wrote:
Hi All,

I am uploading data into PostgreSQL from a CSV file and checking whether any duplicate value already exists in the DB; if so, it should return a duplicate error. I am using the query mentioned below.

if Card_Bank.objects.filter(Q(ACCOUNT_NUMBER=card_number)).exists():
    flag = 2
else:
    flag = 1
It is taking too much time; I am uploading 600k cards from the CSV.

Kindly help me in making the query faster.

I am using Python, Django & PostgreSQL.
--

Best Regards,
Sachin Kumar

I think it would be easier not to check for duplicates beforehand, but to let the DB complain about duplicates.

That would roughly cut the round trips to the DB in half: instead of a check plus an insert, there would be only an insert, which might fail every now and then.
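A minimal sketch of that pattern, using Python's built-in sqlite3 module as a stand-in for PostgreSQL (the same idea applies via psycopg2 or the Django ORM; the table and column names are borrowed from the original post, and the 1/2 flag values mirror the original code):

```python
import sqlite3

# In-memory DB standing in for PostgreSQL. The UNIQUE/PRIMARY KEY
# constraint does the duplicate detection, so there is no
# SELECT-before-INSERT round trip.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE card_bank (account_number TEXT PRIMARY KEY)")

def insert_card(card_number):
    """Insert a card; return 1 on success, 2 if the row already exists."""
    try:
        with conn:  # commits on success, rolls back on error
            conn.execute("INSERT INTO card_bank VALUES (?)", (card_number,))
        return 1
    except sqlite3.IntegrityError:
        return 2

print(insert_card("4111111111111111"))  # 1 - new row inserted
print(insert_card("4111111111111111"))  # 2 - duplicate rejected by the DB
```

In Django specifically, bulk loading can avoid per-row round trips entirely: `Card_Bank.objects.bulk_create(objs, ignore_conflicts=True)` inserts in batches and lets the database silently skip rows that violate the unique constraint.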

Regards,

Holger

-- 
Holger Jakobs, Bergisch Gladbach, Tel. +49-178-9759012
