I have a table that I populate with a stored procedure. When the
stored procedure runs, it deletes everything in the table and
rebuilds it from scratch. Initially it performs terribly, but when I
play around with it for a while (I will describe this in a moment) it
runs very, very fast.
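To make that concrete, the function is shaped roughly like this (the
schema and names here are invented, just to illustrate the
delete-and-rebuild pattern):

    -- Hypothetical schema and function, only to show the pattern:
    CREATE TABLE detail  (id int, amount numeric);
    CREATE TABLE summary (id int, total numeric);

    CREATE OR REPLACE FUNCTION rebuild_summary() RETURNS void AS $$
    BEGIN
        DELETE FROM summary;         -- wipe everything currently there
        INSERT INTO summary          -- rebuild the table from scratch
            SELECT id, sum(amount) FROM detail GROUP BY id;
    END;
    $$ LANGUAGE plpgsql;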
If I understand correctly, when the stored procedure is first
created, query plans are created and cached for each of the queries
inside it. I don't know in what order loading the dump does things,
but I'm guessing that when the stored proc is created, the stats (or
lack of stats) present at that time cause it to create bad query
plans.
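As a sanity check on that guess, I assume I can see whether any stats
exist for the table at that point by querying the pg_stats view,
something like:

    -- Should return no rows if the planner has no statistics yet:
    SELECT attname, n_distinct, null_frac
    FROM pg_stats
    WHERE schemaname = 'public' AND tablename = 'summary';

If that comes back empty right after the restore, that would fit my
theory.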
So I have two questions:
1) How can I verify this? Is there any way to run EXPLAIN on
something running inside a stored procedure? (I am using pl/pgsql;
the closest idea I have is sketched below.)
2) What can I do about it? What actually fixes it is to run it for a
while, make sure there is a bunch of data in the table (even if it
has been deleted and is not visible to anything), and run VACUUM
ANALYZE. Then I recreate the stored procedure and run it again, and
it goes very, very fast. So my only option now is to create the
database from the dump and then fix the stored proc manually; I guess
I could run a script to do this (a sketch is below). Is there a way
to delay the creation of query plans until the database has some
actual data available, so that it can create good plans?
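On question 1, the closest thing I've come up with is to run the
query dynamically with EXECUTE inside a throwaway function and RAISE
each line of the EXPLAIN output as a notice, roughly:

    -- Sketch: print a plan from inside pl/pgsql. EXPLAIN returns rows
    -- of text, so loop over them and emit each one as a NOTICE.
    CREATE OR REPLACE FUNCTION explain_rebuild_query() RETURNS void AS $$
    DECLARE
        plan_line text;
    BEGIN
        FOR plan_line IN EXECUTE
            'EXPLAIN ANALYZE SELECT id, sum(amount) FROM detail GROUP BY id'
        LOOP
            RAISE NOTICE '%', plan_line;
        END LOOP;
    END;
    $$ LANGUAGE plpgsql;

My understanding is that EXECUTE replans its query every time,
though, so this shows a fresh plan rather than the cached plan the
static version of the query is actually using.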
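On question 2, the script I have in mind for fixing the proc after a
restore would be something along these lines (again using the
invented names from above):

    -- fix_plans.sql: the steps I currently do by hand.
    -- 1. Run the procedure once so the table has seen real data:
    SELECT rebuild_summary();

    -- 2. Refresh the statistics the planner uses:
    VACUUM ANALYZE detail;
    VACUUM ANALYZE summary;

    -- 3. Recreate the function (the step that actually fixes it for
    --    me); the next run after this is fast:
    CREATE OR REPLACE FUNCTION rebuild_summary() RETURNS void AS $$
    BEGIN
        DELETE FROM summary;
        INSERT INTO summary
            SELECT id, sum(amount) FROM detail GROUP BY id;
    END;
    $$ LANGUAGE plpgsql;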