Re: duplicates - Mailing list pgsql-admin

From Andrei Bintintan
Subject Re: duplicates
Msg-id 008601c493e5$e0965940$0b00a8c0@forge
In response to duplicates  (Tsirkin Evgeny <tsurkin@mail.jct.ac.il>)
Responses Re: duplicates  (Tsirkin Evgeny <tsurkin@mail.jct.ac.il>)
List pgsql-admin
I am not sure I clearly understand your problem. Are you sure your
queries are written correctly?

To prevent duplicates you can create a unique index, which will reject any
duplicate rows in your table:
CREATE UNIQUE INDEX table_column_uniqueidx ON table(column);
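For example (table and column names here are placeholders; adjust them to your schema), once the unique index exists, a second insert of the same value fails with a duplicate-key error instead of silently creating a duplicate row:

```sql
-- Hypothetical example table; names are illustrative only.
CREATE TABLE orders (order_no integer, item text);
CREATE UNIQUE INDEX orders_order_no_uniqueidx ON orders(order_no);

INSERT INTO orders VALUES (1, 'first');   -- succeeds
INSERT INTO orders VALUES (1, 'second');  -- fails: duplicate key violates
                                          -- unique index "orders_order_no_uniqueidx"
```

Note that creating the unique index will fail if the table already contains duplicates, so you would have to clean those up first.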

If rows are disappearing, check your DELETE query, because that is the only
way all rows could be erased from the table.
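One likely cause of the duplicates you describe: under PostgreSQL's default READ COMMITTED isolation level, two concurrent transactions that each delete the old rows and insert replacements cannot see each other's uncommitted work, so both inserts can commit. Here is a sketch of the race and one possible workaround (table and column names are placeholders, not from your schema):

```sql
-- Session A                          -- Session B (concurrent)
BEGIN;                                -- BEGIN;
DELETE FROM t WHERE id = 1;           -- DELETE FROM t WHERE id = 1;
INSERT INTO t VALUES (1, 'new');      -- INSERT INTO t VALUES (1, 'new');
COMMIT;                               -- COMMIT;
-- B's DELETE cannot see the row A inserted (it was not yet committed),
-- so B's INSERT adds a second copy: the table now holds two id = 1 rows.

-- One workaround: serialize the writers on the affected rows.
BEGIN;
SELECT * FROM t WHERE id = 1 FOR UPDATE;  -- blocks a concurrent session
DELETE FROM t WHERE id = 1;
INSERT INTO t VALUES (1, 'new');
COMMIT;
```

A unique index as shown above also closes the hole, by making the second concurrent INSERT fail outright rather than succeed silently.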

If you still have problems, please post the actual queries and be more specific.

Best regards,
Andy.

----- Original Message -----
From: "Tsirkin Evgeny" <tsurkin@mail.jct.ac.il>
To: <pgsql-admin@postgresql.org>
Sent: Monday, September 06, 2004 9:33 AM
Subject: [ADMIN] duplicates


> Hello dear list!
> Here is the problem I have:
> I am using PostgreSQL 7.3.4. I have an application that updates 2 tables.
> When it needs to update something, it does not select the rows that
> are already in the table, find what it needs to update, and execute
> an UPDATE query. Instead, it deletes all the old rows and inserts the
> new ones. However, under heavy load we get duplicate rows in the
> table, although we use a transaction and both the delete and the new insert
> are in the same transaction. We are pretty sure there is no bug in
> the application that would insert the data more than once.
> Is this a known problem? What could be the cause?
>
> --
> Evgeny.
>

