On Mon, 2005-04-25 at 11:23 -0400, Tom Lane wrote:
> Simon Riggs <simon@2ndquadrant.com> writes:
> > My suggested hack for PostgreSQL is to have an option to *not* sample,
> > just to scan the whole table and find n_distinct accurately.
> > ...
> > What price a single scan of a table, however large, when incorrect
> > statistics could force scans and sorts to occur when they aren't
> > actually needed ?
>
> It's not just the scan --- you also have to sort, or something like
> that, if you want to count distinct values. I doubt anyone is really
> going to consider this a feasible answer for large tables.
That assumes you don't use a HashAgg plan, which seems very appropriate
for the task? (...but I take your point otherwise).
If memory were the issue, then why not keep scanning until
maintenance_work_mem has been filled with hash buckets, then stop and
report the result?
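To illustrate the kind of plan I mean (table and column names are made
up, and running this as a plain query is of course not the same as
having ANALYZE do it):

  -- Exact distinct count for one column via a single full scan.
  -- The GROUP BY form can be satisfied by a HashAggregate, whereas
  -- count(DISTINCT col) typically forces a sort.
  SET work_mem = '512MB';   -- give the hash table room to avoid spilling

  SELECT count(*) AS n_distinct
    FROM (SELECT customer_id
            FROM orders
           GROUP BY customer_id) AS s;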
The problem is that if you don't do the sort once at statistics
collection time, you may accidentally choose plans that force repeated
sorts on that table later. I'd rather do it once...
The other alternative is to allow an ALTER TABLE command to set
statistics manually, but I think I can guess what you'll say to that!
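Just to make that idea concrete, I'm imagining something along these
lines (purely hypothetical syntax, names invented):

  -- Hypothetical syntax only, not implemented: manually override the
  -- planner's n_distinct estimate for a column.
  ALTER TABLE orders
    ALTER COLUMN customer_id SET STATISTICS DISTINCT 50000;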
Best Regards,
Simon Riggs