When to bump up statistics?

From Dawid Kuroczko
Subject When to bump up statistics?
Date
Msg-id 758d5e7f04111905594cf9c9f8@mail.gmail.com
List pgsql-performance
ALTER TABLE foo ALTER COLUMN bar SET STATISTICS n; .....

I wonder what the implications of using this statement are.
I know that with, say, n=100, ANALYZE will take more time,
pg_statistic will be bigger, and the planner will take longer,
but on the other hand it will make better decisions... Etc, etc.
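
(To make the above concrete -- a minimal sketch, with made-up
table/column names:)

    -- raise the per-column target and re-collect the sample
    ALTER TABLE foo ALTER COLUMN bar SET STATISTICS 100;
    ANALYZE foo;

    -- the sample lands in pg_statistic; the pg_stats view
    -- shows it in readable form
    SELECT n_distinct, most_common_vals, histogram_bounds
      FROM pg_stats
     WHERE tablename = 'foo' AND attname = 'bar';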

I wonder, however, when it is most useful to bump it up.
Please tell me what you think about it:

Is bumping up statistics only useful for indexed columns?
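
(My naive way of testing that would be to compare plans before
and after bumping the target on an indexed column -- again with
made-up names:)

    -- does a higher target change the planner's
    -- index-scan vs. seqscan choice?
    CREATE INDEX foo_bar_idx ON foo (bar);
    ALTER TABLE foo ALTER COLUMN bar SET STATISTICS 100;
    ANALYZE foo;
    EXPLAIN SELECT * FROM foo WHERE bar = 42;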

When is it most useful/beneficial to bump them up? (A way to
check which case a column falls into is sketched after the list.)

1) huge table with huge number of distinct values (_almost_
    unique ;))

2) huge table with relatively equally distributed values
    (like each value is in between, say, 30-50 rows).

3) huge table with unequally distributed values (some
    values are in 1-5 rows, some are in 1000-5000 rows).

4) huge table with a small number of values (around ~100
    distinct values, equally or unequally distributed).

5) boolean column.
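
(Which case a column falls into can be read off pg_stats after an
ANALYZE; n_distinct < 0 means a fraction of the row count, with -1
meaning all-distinct, and most_common_freqs shows the skew:)

    SELECT null_frac, n_distinct, most_common_vals, most_common_freqs
      FROM pg_stats
     WHERE tablename = 'foo' AND attname = 'bar';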

I think SET STATISTICS 100 is very useful for the case of
unequally distributed values, but I wonder about the
other cases.  And as a side note -- what are reasonable
bounds for the statistics target (between 10 and 100?)
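
(For reference, there is also a cluster-wide knob that the
per-column setting overrides:)

    -- the server-wide default target, overridden per column above
    SHOW default_statistics_target;

    -- setting a column's target to -1 reverts it to that default
    ALTER TABLE foo ALTER COLUMN bar SET STATISTICS -1;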

What are the runtime implications of setting the statistics
target too large -- how much can it affect queries?
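
(The planner-side cost could be measured roughly like this:
EXPLAIN plans the query without executing it, so with psql's
\timing its elapsed time approximates the planning overhead.
Table name is made up again:)

    \timing
    -- plans but does not execute, so elapsed time =~ plan time
    EXPLAIN SELECT * FROM foo WHERE bar = 42;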

And finally -- how do other RDBMSs and RDBMS-like systems
deal with this issue? :)

   Regards,
      Dawid
