Good afternoon,
I created a database with Postgres 7.3.4 under Red Hat Linux 7.3 on a
Dell PowerEdge server.
One of the tables is
resultats(numbil, numpara, mesure, deviation)
with an index on numbil.
Each select on numbil returns up to 60 rows (that is, 60 rows for one
numbil value, each with a different numpara), for example:
(200000,1,500,3.5)
(200000,2,852,4.2)
(200000,12,325,2.8)
(200001,1,750,1.5)
(200001,2,325,-1.5)
(200001,8,328,1.2)
etc.
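
For reference, the table and index were created along these lines (the
column types shown are only indicative, and the index name is
illustrative):

    CREATE TABLE resultats (
        numbil    integer,        -- up to 60 rows share one numbil
        numpara   integer,        -- distinguishes the rows of one numbil
        mesure    integer,        -- e.g. 500, 852, 325 above
        deviation numeric(6,2)    -- e.g. 3.5, 4.2, -1.5 above
    );
    -- index used by the selects on numbil
    CREATE INDEX resultats_numbil_idx ON resultats (numbil);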
This table now contains more than 6,500,000 rows and grows by about
6,000 rows a day. I get approximately 1,250,000 rows a year, so I
have 5 years of data online.
Now, inserting 6,000 rows takes very long, up to one hour...
I tried to insert 100,000 rows yesterday evening and it had not
finished after 8 hours.
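
Roughly speaking, the daily load is a batch of statements of this form
(a simplified sketch assuming plain INSERTs; the actual values come
from our application):

    INSERT INTO resultats (numbil, numpara, mesure, deviation)
        VALUES (200002, 1, 512, 2.1);
    INSERT INTO resultats (numbil, numpara, mesure, deviation)
        VALUES (200002, 2, 830, 3.9);
    -- ... about 6,000 such rows per day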
Do you have any idea how I can improve the speed, apart from splitting
the table every 2 or 3 years, which would defeat the whole point of a
database!
I thank you for your suggestions.
Regards.
Alain Reymond
CEIA
Bd Saint-Michel 119
1040 Bruxelles
Tel: +32 2 736 04 58
Fax: +32 2 736 58 02
alain.reymond@ceia.com
PGP key at http://pgpkeys.mit.edu:11371