I know this topic has been rehashed a million times, but I just wanted to
add one data point. I have a database (150 tables, fewer than 20K tuples
in any one table) which I 'vacuum analyze' *HOURLY*, blocking all access,
and I still frequently see query times roughly quadruple (a ~300%
increase) in the interval between vacuums. All this
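For context, the hourly run described above can be driven by a crontab entry along these lines (the database name 'mydb' is a placeholder; adjust for your setup):

```shell
# Hypothetical crontab entry: run VACUUM ANALYZE at the top of every hour.
# Note this takes exclusive locks in current PostgreSQL, blocking access
# for the duration of the vacuum.
0 * * * *  psql -d mydb -c 'VACUUM ANALYZE;'
```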
is to say that I think a more strategic implementation of the
functionality of vacuum analyze (specifically, non-batched, automated,
on-the-fly vacuuming/analyzing) would be a major "value add". I haven't
educated myself as to the history of it, but I do wonder why the
performance focus is not on this. I'd imagine it would be a performance
hit (which argues for making it optional), but I'd gladly take a 10%
performance hit over the current highly undesirable degradation. You
could do a whole lotta optimization on the planner/parser/executor and
not get close to the end-user-perceptible gains from fixing this
problem...
Regards,
Ed Loehr