Re: [PERFORM] Bad n_distinct estimation; hacks suggested? - Mailing list pgsql-hackers

From Josh Berkus
Subject Re: [PERFORM] Bad n_distinct estimation; hacks suggested?
Date
Msg-id 200504251213.18565.josh@agliodbs.com
Whole thread Raw
In response to Re: [PERFORM] Bad n_distinct estimation; hacks suggested?  (Simon Riggs <simon@2ndquadrant.com>)
Responses Re: [PERFORM] Bad n_distinct estimation; hacks suggested?
Re: [PERFORM] Bad n_distinct estimation; hacks suggested?
List pgsql-hackers
Simon, Tom:

While it's not possible to get accurate estimates from a fixed-size sample, I
think it would be possible from a small but scalable sample: say, 0.1% of all
data pages on large tables, up to the limit of maintenance_work_mem.

Setting up these samples as a % of data pages, rather than a pure random sort,
makes this more feasible; for example, a 70GB table would only need to sample
about 9000 data pages (or 70MB).  Of course, larger samples would lead to
better accuracy, and this could be set through a revised GUC (e.g.,
maximum_sample_size, minimum_sample_size).
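The sizing rule above could be sketched roughly like this (a minimal
illustration only; the function name, parameter names, and default bounds
are all hypothetical stand-ins for the proposed GUCs, not actual PostgreSQL
code):

```python
PAGE_SIZE = 8192  # default PostgreSQL block size, in bytes


def sample_pages(total_pages, sample_fraction=0.001,
                 min_sample_pages=100,
                 max_sample_bytes=1024 * 1024 * 1024):
    """Sample a fixed fraction of a table's data pages, bounded below by
    a minimum page count and above by a memory cap (standing in for
    maintenance_work_mem)."""
    target = int(total_pages * sample_fraction)
    target = max(target, min_sample_pages)
    cap = max_sample_bytes // PAGE_SIZE
    return min(target, cap, total_pages)


# A 70GB table holds about 70GB / 8KB ~= 9.2 million pages, so a 0.1%
# sample is roughly 9,200 pages (~72MB read).
pages_70gb = (70 * 1024 ** 3) // PAGE_SIZE
print(sample_pages(pages_70gb))
```

Small tables would hit the minimum-pages floor, and very large tables the
memory cap, so the sample grows with table size only in between.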

I just need a little help doing the math ... please?

--
--Josh

Josh Berkus
Aglio Database Solutions
San Francisco
