Re: INSERT performance deteriorates quickly during a large import - Mailing list pgsql-general

From Krasimir Hristozov (InterMedia Ltd)
Subject Re: INSERT performance deteriorates quickly during a large import
Date
Msg-id 007801c822ec$e6f4faa0$0400000a@imediadev.com
In response to INSERT performance deteriorates quickly during a large import  ("Krasimir Hristozov (InterMedia Ltd)" <krasi@imedia-dev.com>)
List pgsql-general
Thanks to all who responded. Using COPY instead of INSERT really solved the problem - the whole process took about 1h 20min on an indexed table, with constraints (which is close to our initial expectations). We're performing some additional tests now. I'll post some more observations when finished.
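For anyone following along, the speedup comes from feeding the server a single COPY stream instead of one INSERT statement per row. A minimal sketch of preparing such a stream in Python (the table and column names here are made up for illustration; the escaping rules follow PostgreSQL's COPY text format):

```python
import io

def encode_copy_text(rows):
    """Encode rows of Python values into PostgreSQL COPY text format:
    tab-separated columns, one row per line, None rendered as \\N,
    with backslash, tab, newline and carriage return escaped."""
    def field(v):
        if v is None:
            return r"\N"
        s = str(v)
        return (s.replace("\\", "\\\\")
                 .replace("\t", "\\t")
                 .replace("\n", "\\n")
                 .replace("\r", "\\r"))
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(field(v) for v in row) + "\n")
    buf.seek(0)
    return buf
```

With psycopg2 (one common PHP-free way to drive this; the table name is hypothetical) the buffer could then be sent in one round trip with `cur.copy_expert("COPY items (id, label, note) FROM STDIN", encode_copy_text(rows))`, which avoids per-row parse/plan overhead entirely.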
----- Original Message -----
Sent: Friday, November 09, 2007 1:52 PM
Subject: Re: INSERT performance deteriorates quickly during a large import

Hello Krasimir,

You got a lot of good advice above, and I would like to add one more point:

d) Make sure your PHP code is not recursive. Since you said memory usage is stable, I assume your method is iterative.
A recursive method would add a little time to each insert and use more memory.
But even iterative code must be written carefully so that it runs just once per row; perhaps your code is running much more often than needed.

Pay attention to Tomas's advice, and after that (I agree with Cris) "there should be no reason for loading data to get more costly as
the size of the table increases" - please check your code.

I ran some experiments a long time ago with 40,000 rows containing a lot of BLOBs, using PHP code doing SELECT/INSERT from Postgres to Postgres. The time wasn't constant, but it wasn't as bad as in your case (and I didn't follow Tomas's advice a, b, and c).

Good luck
--
Márcio Geovani Jasinski
