Thread: Re: How to optimize the insertion of data

Re: How to optimize the insertion of data

From
"Erwan DUROSELLE"
Date:
Did you batch your inserts in a single transaction? I.e.:
BEGIN;
insert ...;
insert ...;
...
COMMIT;

This is much faster (17x in my simple test) than running without the
BEGIN/COMMIT, which results in one transaction per insert.
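The idea above can be sketched in a few lines of Python. This is a minimal illustration only: it uses the stdlib sqlite3 module as a stand-in for PostgreSQL, since the principle (one explicit transaction instead of one implicit transaction per INSERT) is the same; the table and column names are made up for the example.

```python
import sqlite3

# isolation_level=None puts the connection in autocommit mode, so the
# explicit BEGIN/COMMIT below controls transaction boundaries ourselves.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")

# Wrap all the inserts in a single transaction, as in the mail above.
conn.execute("BEGIN")
for i in range(1000):
    conn.execute("INSERT INTO t (id, val) VALUES (?, ?)", (i, "row %d" % i))
conn.execute("COMMIT")

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 1000
```

Without the BEGIN/COMMIT pair, each INSERT would be committed on its own, which is where the per-insert overhead comes from.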

Erwan


>>> Juan Francisco Diaz <j-diaz@publicar.com> 09/09/2003 23:05:54 >>>
Hi, I have tried by all means to optimize the insertion of data in my db,
but it has been impossible.
Right now, inserting around 300 thousand records takes something like 50
to 65 minutes (too long).
I'm using a Mac PowerPC G4 533 MHz with 256 MB RAM.
I would really appreciate getting the insertion process down to 30 or 35
minutes TOPS. So far it has been impossible.
My db right now has no FKs and no indexes, and the insertion is being
done in batches (19 thousand records each).
Is it possible with my current machine to achieve the level of
performance I've mentioned?
Any help would be greatly appreciated. By the way, the same insertion
takes 25 minutes in MS SQL Server 2000 on a P3 1.4 GHz with 1 GB RAM.
Thanks

JuanF



Re: How to optimize the insertion of data

From
Stuart Woodward
Date:
On Wed, 10 Sep 2003 09:50:02 +0200
"Erwan DUROSELLE" <EDuroselle@seafrance.fr> wrote:

> This is much faster (x17 in my simple test) than without the begin/end,
> which will result in 1 transaction per insert.

I have to agree that this is the way to go. One of my colleagues went as
far as writing a Perl script that inserted the transaction code around
every X operations. (Sorry, I don't have that script now.)

What I would like to know is, how can you determine the optimum number
of lines to surround with a transaction? I guess at some point Postgres
will run out of memory or disk space if too many operations are done
within a single transaction.

It would be nice to have a rule of thumb for roughly how many operations
to group together, but I guess that this is almost impossible to
calculate.
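Short of a formula, the practical compromise is to make the batch size a tuning knob and commit every N rows, so a huge load never lives in one giant transaction. A hedged sketch, again using stdlib sqlite3 in place of PostgreSQL; the function name, table, and the batch_size default are all assumptions for illustration:

```python
import sqlite3

def chunked_insert(conn, rows, batch_size=1000):
    """Insert rows one by one, committing once every batch_size rows."""
    cur = conn.cursor()
    for i, row in enumerate(rows, start=1):
        cur.execute("INSERT INTO t (id) VALUES (?)", (row,))
        if i % batch_size == 0:
            conn.commit()  # end the current transaction, start a fresh one
    conn.commit()  # flush the final, possibly partial, batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER)")
chunked_insert(conn, range(2500), batch_size=1000)
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 2500
```

The right batch_size would still have to be found by measurement on the target machine, as the mail says.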

Also, the bigger the dump, the harder it is to edit by hand, so you may
need to develop a script to do this at some point.
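Such a script (like the Perl one mentioned above) could look roughly like this sketch in Python. It is purely illustrative and assumes one SQL statement per line, which a real dump does not guarantee:

```python
def wrap_in_transactions(lines, n):
    """Wrap every n statements of a plain SQL dump in BEGIN/COMMIT."""
    out = []
    count = 0
    for line in lines:
        if count == 0:
            out.append("BEGIN;")   # open a transaction before each batch
        out.append(line)
        count += 1
        if count == n:
            out.append("COMMIT;")  # close the batch
            count = 0
    if count:
        out.append("COMMIT;")      # close a trailing partial batch
    return out

# Hypothetical five-statement dump, batched two at a time:
dump = ["INSERT INTO t VALUES (%d);" % i for i in range(5)]
print("\n".join(wrap_in_transactions(dump, 2)))
```

Running the dump through such a filter before feeding it to psql spares you from editing it by hand.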