Hello Joanne,
On Thu, 1 May 2003 06:02:38 -0700 (PDT),
Joanne Formoso <joanneformoso@yahoo.com> wrote:
> I'm new to PostgreSQL, but my friend and I are trying
> to benchmark the database by populating one table with
> 100,000 records, touching around 5 columns in each
> row. The problem is that our processing time is very
> slow: around 30 minutes to 1 hour. At times it
> just hangs and stops executing. I was just
> wondering if this was normal.
>
> We deliberately structured our PHP code in an
> unoptimized way in order to isolate possible problems
> with the database. We also used indices to speed up
> things on the database side. Any suggestions on
> how to optimize PostgreSQL through its config files?
> Thanks in advance!
Perhaps I haven't understood your question well, but
I have sometimes (as a test) populated a table of
46 columns and 1,348,215 tuples in 6 MINUTES using \copy
from a sequential file (on an AMD 900 MHz).
It would take more than 50 HOURS to do the same with a "C" program
using libpq ... !
Also, don't create secondary indexes before the table is populated;
depending on the number of indexes, that can take ... some days.
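As a sketch of what I mean (the table name, columns, and file path here are made up for the example), the load order in psql looks like this:

```
-- Create the table first, with NO secondary indexes yet.
CREATE TABLE bench (id integer, c1 text, c2 text, c3 text, c4 text);

-- Bulk-load from a client-side sequential file
-- (tab-delimited by default; \copy is a psql command,
--  so the file is read on the client machine).
\copy bench FROM '/tmp/bench.dat'

-- Only after the data is in, create the indexes.
CREATE INDEX bench_id_idx ON bench (id);
```

One \copy is a single operation instead of 100,000 separate INSERT statements, and each index is built once over the finished table instead of being updated on every row.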
Hope this helps,
--
Alain Lucari (Eurlix)