Re: Avoiding duplicates (or at least marking them as such) in a "cumulative" transaction table. - Mailing list pgsql-general

From Scott Marlowe
Subject Re: Avoiding duplicates (or at least marking them as such) in a "cumulative" transaction table.
Date
Msg-id dcc563d11003071849r59fcf197p8b65e99202ad10f7@mail.gmail.com
In response to Avoiding duplicates (or at least marking them as such) in a "cumulative" transaction table.  (Allan Kamau <kamauallan@gmail.com>)
Responses Re: Avoiding duplicates (or at least marking them as such) in a "cumulative" transaction table.  (Allan Kamau <kamauallan@gmail.com>)
List pgsql-general
On Sun, Mar 7, 2010 at 1:45 AM, Allan Kamau <kamauallan@gmail.com> wrote:
> Hi,
> I am looking for an efficient and effective solution to eliminate
> duplicates in a continuously updated "cumulative" transaction table
> (no deletions are envisioned as all non-redundant records are
> important). Below is my situation.

Is there a reason you can't use a unique index and detect failed
inserts and reject them?
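
A minimal sketch of that approach, assuming a hypothetical transactions
table with a tx_ref natural key (table and column names are invented for
illustration, not taken from the original post): put a unique index on the
column(s) that define a duplicate, then trap the unique_violation error on
insert so duplicate rows are rejected while new rows go through.

-- Hypothetical cumulative transaction table and its natural key
CREATE TABLE transactions (tx_ref text, amount numeric);
CREATE UNIQUE INDEX transactions_tx_ref_key ON transactions (tx_ref);

-- Insert helper that rejects duplicates instead of aborting the caller
CREATE OR REPLACE FUNCTION insert_transaction(p_tx_ref text, p_amount numeric)
RETURNS boolean AS $$
BEGIN
    INSERT INTO transactions (tx_ref, amount) VALUES (p_tx_ref, p_amount);
    RETURN true;                  -- new, non-duplicate row stored
EXCEPTION WHEN unique_violation THEN
    RETURN false;                 -- duplicate detected and rejected
END;
$$ LANGUAGE plpgsql;

-- Example usage: returns true the first time, false on the duplicate
SELECT insert_transaction('abc-123', 10.00);
SELECT insert_transaction('abc-123', 10.00);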
