I had thought that I had dropped and reloaded this
table, but apparently I hadn't: while experimenting I
had set the statistics target for one column to 500.
Resetting it to -1 and running with a
default_statistics_target of 300 brings the memory
footprint during the ANALYZE down to roughly 70 MB.
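That also matches the arithmetic Tom describes below:
with a per-column target of 500 the sample is 300 * 500
= 150000 rows, which is exactly what I saw, and after
the reset it drops to 300 * 300 = 90000 rows. For anyone
else who trips over this, the reset was just the
following (with placeholder names standing in for my
actual table and column):

  ALTER TABLE table_name_here
    ALTER COLUMN column_name_here SET STATISTICS -1;
  ANALYZE table_name_here;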
Thanks Tom for indulging my curiosity on the matter.
I've learned something that I didn't readily pick up
from reading the documentation.
Regards,
Shelby Cain
--- Tom Lane <tgl@sss.pgh.pa.us> wrote:
> Shelby Cain <alyandon@yahoo.com> writes:
> > It still decided to sample 150000 rows. Am I missing
> > something obvious here? Shouldn't fewer rows be
> > sampled when I set the collection target to 1?
>
> The sample size is 300 rows times the largest per-column
> analysis target, where default_statistics_target is used
> if the recorded per-column setting is -1. I would say
> that you have set a target of 500 for at least one of the
> columns of that table, using ALTER TABLE SET STATISTICS.
> Try this to see which:
>
> select attname, attstattarget from pg_attribute
> where attrelid = 'table_name_here'::regclass;
>
> regards, tom lane