Ah, I didn't realize that you could just do an ANALYZE. I thought there was
only VACUUM ANALYZE, which can't run inside a transaction.
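So just to make sure I've got it, something like this should be fine inside
the transaction block (rough sketch; the table and column names are made up):

    BEGIN;
    -- load the rows here (COPY or a batch of INSERTs)
    INSERT INTO mytable (id, val) VALUES (1, 'a');
    -- plain ANALYZE works inside a transaction block; VACUUM ANALYZE does not
    ANALYZE mytable;
    COMMIT;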
Thanks,
rg
----- Original Message -----
From: "Alvaro Herrera Munoz" <alvherre@dcc.uchile.cl>
To: "Rick Gigger" <rick@alpinenetworking.com>
Cc: "Mike Mascari" <mascarm@mascari.com>; "PgSQL General ML"
<pgsql-general@postgresql.org>
Sent: Thursday, November 20, 2003 2:06 PM
Subject: Re: [GENERAL] performance problem
On Thu, Nov 20, 2003 at 01:52:10PM -0700, Rick Gigger wrote:
> I worked around this by starting the transaction, inserting the 45,000
> rows, and then killing it. Then I removed the index and re-added it, which
> apparently gathered some stats, and since all of the dead tuples from the
> failed transaction were still in there, it now decided that it should use
> the index. I reran the script and this time it took 5 minutes again
> instead of 1 1/2 hours.
Stats are not collected automatically. You should run ANALYZE after
importing your data. And it's probably faster to create the index after
the data is loaded, too.
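Roughly like this (the table, column, and path names below are just
placeholders):

    -- load the data first, with no index in place
    COPY big_table FROM '/path/to/data';
    -- build the index once, after the bulk load
    CREATE INDEX big_table_col_idx ON big_table (col);
    -- then collect statistics so the planner will consider the index
    ANALYZE big_table;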
--
Alvaro Herrera (<alvherre[@]dcc.uchile.cl>)
And a voice from the chaos spoke to me and said,
"Smile and be happy; it could be worse."
And I smiled. And I was happy.
And it got worse.