On Sun, 2004-11-28 at 22:35, Tom Lane wrote:
> Simon Riggs <simon@2ndquadrant.com> writes:
> > Given we expect an underestimate, can we put in a correction factor
> > should the estimate get really low? It sounds like we could end up
> > choosing nested-loop joins more often when we should have chosen
> > merge joins.
>
> One possibility: vacuum already knows how many tuples it removed. We
> could set reltuples equal to, say, the mean of the number-of-tuples-
> after-vacuuming and the number-of-tuples-before. In a steady state
> situation this would represent a fairly reasonable choice. In cases
> where the table size has actually decreased permanently, it'd take a few
> cycles of vacuuming before reltuples converges to the new value, but that
> doesn't seem too bad.
That sounds good to me. It covers all the cases I can see from here.
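To check that I've read the rule correctly, here is a quick standalone
sketch of the arithmetic (illustrative C only, compiled apart from the
backend; none of these names come from vacuum.c):

/*
 * Sketch of the proposed update rule, not actual vacuum.c code; all
 * names here are hypothetical.  VACUUM knows how many tuples it
 * removed, so the pre-vacuum count is (tuples remaining + tuples
 * removed), and reltuples is set to the mean of the pre- and
 * post-vacuum counts.
 */
#include <stdio.h>

static double
proposed_reltuples(double tuples_remaining, double tuples_removed)
{
    double tuples_before = tuples_remaining + tuples_removed;

    return (tuples_before + tuples_remaining) / 2.0;
}

int
main(void)
{
    /*
     * Steady state: 100000 live tuples, ~20000 dead accumulating
     * between vacuums.  reltuples lands at 110000, midway through the
     * cycle - a fair average of what the planner will actually see.
     */
    printf("steady state:      reltuples = %.0f\n",
           proposed_reltuples(100000.0, 20000.0));

    /*
     * Permanent shrink: 900000 of 1000000 tuples deleted.  The first
     * vacuum after the delete still reports 550000; a later vacuum
     * that finds nothing more to remove converges to the true 100000.
     * (Spread the deletes across several vacuum cycles and convergence
     * takes correspondingly longer, as you say.)
     */
    printf("after bulk delete: reltuples = %.0f\n",
           proposed_reltuples(100000.0, 900000.0));
    printf("next vacuum:       reltuples = %.0f\n",
           proposed_reltuples(100000.0, 0.0));

    return 0;
}

If that reading is right, the worst case is that straight after a big
delete the planner works from a row count several times too high for
one vacuum cycle, which seems an acceptable price.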
> A standalone ANALYZE should still do what it does now, though, I think;
> namely set reltuples to its best estimate of the current value.
A GUC-free solution, yet one that still allows manual control. Sounds
good to me - and to you, Andreas?
--
Best Regards, Simon Riggs