Another good way to handle this is to put a trigger on the table that
diverts rows that would fail into a holding table. While this will slow
down the inserts, it lets you load large lists of dubious quality and
worry about the bad rows later.
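A minimal sketch of that kind of trigger, assuming a hypothetical
`customers` table whose `email` column must be unique, and a same-shape
holding table `customers_bad` (all names made up for illustration):

```sql
-- Holding table with the same columns as the real one.
CREATE TABLE customers_bad (LIKE customers);

CREATE OR REPLACE FUNCTION divert_bad_rows() RETURNS trigger AS $$
BEGIN
    -- If the row would collide with an existing one, park it instead
    -- of letting the INSERT fail and abort the transaction.
    IF EXISTS (SELECT 1 FROM customers WHERE email = NEW.email) THEN
        INSERT INTO customers_bad VALUES (NEW.*);
        RETURN NULL;  -- suppress the original insert
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER customers_divert
    BEFORE INSERT ON customers
    FOR EACH ROW EXECUTE PROCEDURE divert_bad_rows();
```

The check here only covers one constraint; a real version would test for
whatever makes your rows "dubious", and you can review or re-insert the
contents of the holding table at your leisure.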
My preference is to fix the data feed, or pre-process it with PHP/Perl to
split it into two files ahead of time, but I'm more of a coder than a DBA.
I get a lot of data to import from other sources at work, and it's often
easier to make the sources fix their data feeds than to try to massage
them every time.
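That pre-processing split could look something like this (sketched in
Python rather than PHP/Perl; the file layout and the "dubious row" check
are stand-ins for whatever your feed actually requires):

```python
import csv

def split_feed(src, good_path, bad_path):
    """Split a CSV feed into rows safe to insert and rows that would
    fail (here: missing fields or duplicate first-column keys)."""
    seen = set()
    with open(src, newline="") as f, \
         open(good_path, "w", newline="") as good, \
         open(bad_path, "w", newline="") as bad:
        g, b = csv.writer(good), csv.writer(bad)
        for row in csv.reader(f):
            key = row[0] if row else None
            if not row or "" in row or key in seen:
                b.writerow(row)   # dubious: park it for later review
            else:
                seen.add(key)
                g.writerow(row)   # clean: safe to bulk-insert
```

You then COPY or insert the "good" file inside one transaction and deal
with the "bad" file by hand.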
On Wed, 7 Jan 2004, Chris Travers wrote:
> Transactions are atomic. What you are asking to do violates the whole
> concept of a transaction.
>
> You can, however, do these inserts outside of the transaction block.
>
> Best Wishes,
> Chris Travers
> ----- Original Message -----
> From: "Chris Ochs" <chris@paymentonline.com>
> To: <pgsql-general@postgresql.org>
> Sent: Wednesday, January 07, 2004 7:52 AM
> Subject: [GENERAL] problems with transaction blocks
>
>
> > I want to do a series of inserts within a single transaction block, but
> with
> > postgresql if one insert fails, the whole block is aborted. Is there any
> > way to get around this behavior so that postgresql won't abort the entire
> > transaction if a single insert returns an error?
> >
> > Chris
> >
> >
> > ---------------------------(end of broadcast)---------------------------
> > TIP 1: subscribe and unsubscribe commands go to majordomo@postgresql.org
> >
> >
>
>