Thread: Query plan issues - volatile tables

Query plan issues - volatile tables

From:
Brian Herlihy
Date:
Hi,

We have a problem with some of our query plans.  One of our tables is quite volatile, but postgres always uses the last
statistics snapshot from the last time it was analyzed for query planning.  Is there a way to tell postgres that it
should not trust the statistics for this table?  Basically we want it to assume that there may be 0, 1 or 100,000
entries coming out from a query on that table at any time, and that it should not make any assumptions.

Thanks,
Brian
 ========================
Brian Herlihy
Trellian Pty Ltd
+65 67534396 (Office)
+65 92720492 (Handphone)
========================


Re: Query plan issues - volatile tables

From:
Craig James
Date:
Brian Herlihy wrote:
> We have a problem with some of our query plans.  One of our
> tables is quite volatile, but postgres always uses the last
> statistics snapshot from the last time it was analyzed for query
> planning.  Is there a way to tell postgres that it should not
> trust the statistics for this table?  Basically we want it to
> assume that there may be 0, 1 or 100,000 entries coming out from
> a query on that table at any time, and that it should not make
> any assumptions.

I had a similar problem, and just changed my application to run an ANALYZE either just before the query, or just after a
major update to the table.  ANALYZE is very fast, almost always orders of magnitude faster than the time lost to a
poor query plan.
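For illustration, that pattern can be sketched in Python against a DB-API-style cursor (psycopg2 or similar); the table name, query, and stub cursor below are hypothetical stand-ins so the sketch runs without a live PostgreSQL server:

```python
# Sketch: re-ANALYZE a volatile table immediately before querying it,
# so the planner sees fresh statistics instead of a stale snapshot.
# Assumes a DB-API cursor (e.g. psycopg2); names are illustrative only.

def query_with_fresh_stats(cursor, table, query, params=()):
    """Run ANALYZE on `table`, then execute `query` with `params`."""
    cursor.execute("ANALYZE " + table)  # cheap: samples the table
    cursor.execute(query, params)
    return cursor.fetchall()

# Stub cursor standing in for a real connection, so this is runnable.
class StubCursor:
    def __init__(self):
        self.statements = []

    def execute(self, sql, params=()):
        self.statements.append(sql)

    def fetchall(self):
        return [("row",)]

cur = StubCursor()
rows = query_with_fresh_stats(
    cur, "volatile_table",
    "SELECT * FROM volatile_table WHERE id = %s", (1,))
print(cur.statements[0])  # ANALYZE volatile_table
```

With a real psycopg2 cursor the same call issues the ANALYZE and the query on the live table; the stub only records the statements.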

Craig