Jan de Visser <jdevisser@digitalfairway.com> writes:
> On Monday 08 May 2006 14:10, Andrus wrote:
>> I created an empty table konto and loaded more than 219 records into it
>> during database creation.
>> So it seems that even though the table grew from zero to more than 219
>> rows, it was still not processed.
> That's because you need at least 500 rows for an analyze and 1000 for a
> vacuum (autovacuum_vacuum_threshold = 1000, autovacuum_analyze_threshold = 500).
This crystallizes something that's been bothering me for a while,
actually: why do the "threshold" variables exist at all? If we took
them out, or at least made their default values zero, then the autovac
criteria would simply be "vacuum or analyze if at least X% of the table
has changed" (where X is set by the "scale_factor" variables). Which
seems intuitively reasonable. As it stands, the thresholds seem to bias
autovac against ever touching small tables at all ... but, as this
example demonstrates, a fairly small table can still kill your query
performance if the planner knows nothing about it.
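The trigger condition under discussion can be sketched as follows. This is a
hedged illustration, not PostgreSQL source: the formula (changed tuples >
threshold + scale_factor * reltuples) follows the documented autovacuum
behavior, and the default values assumed here are the 8.1-era ones quoted in
the thread (threshold 500, scale factor 0.2 for analyze).

```python
# Sketch of the autovacuum analyze trigger, assuming the 8.1-era
# defaults mentioned above in the thread.
ANALYZE_THRESHOLD = 500     # autovacuum_analyze_threshold
ANALYZE_SCALE = 0.2         # autovacuum_analyze_scale_factor

def needs_analyze(changed_tuples, reltuples,
                  threshold=ANALYZE_THRESHOLD, scale=ANALYZE_SCALE):
    """True if autovacuum would analyze the table."""
    return changed_tuples > threshold + scale * reltuples

# A 219-row table loaded from scratch (reltuples = 0 at last analyze)
# never crosses the 500-row threshold, so it is never analyzed:
print(needs_analyze(219, 0))               # False
# With the threshold zeroed, as proposed, the same change qualifies:
print(needs_analyze(219, 0, threshold=0))  # True
```

This shows the bias Tom describes: for small tables the additive threshold
dominates the scale-factor term, so they may never be touched at all.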
regards, tom lane