On 7/8/22 03:07, Tom Lane wrote:
> Andrey Lepikhov <a.lepikhov@postgrespro.ru> writes:
>> On 12/8/21 04:26, Tomas Vondra wrote:
>>> I wonder if we should teach clauselist_selectivity about UNIQUE indexes,
>>> and improve the cardinality estimates directly, not just costing for
>>> index scans.
>
>> I tried to implement this in different ways, but it adds overhead and
>> code complexity: in every selectivity estimation we would have to walk
>> the list of indexes and match each index's clauses against the input
>> clauses. I don't like that approach and propose a new patch (attached).
>
> I looked at this briefly. I do not think that messing with
> btcostestimate/genericcostestimate is the right response at all.
> The problem can be demonstrated with no index whatever, as in the
> attached shortened version of the original example. I get [...]
I partly agree with you. Yes, I see that problem too. But we also have
the problem I described above: the optimizer doesn't choose the path
with the minimal selectivity when several selectivities all yield
cardinality estimates below 1 (see badestimate2.sql).
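
To make that concrete, here is a rough sketch in the spirit of
badestimate2.sql (the table, column and index names are invented and
the exact numbers don't matter):

  -- Hypothetical reproduction, not the attached script.
  CREATE TABLE t (a int, b int);
  INSERT INTO t SELECT i, i FROM generate_series(1, 100000) AS i;
  CREATE INDEX t_a_idx ON t (a);
  CREATE INDEX t_a_b_idx ON t (a, b);
  VACUUM ANALYZE t;

  -- Without extended statistics the per-clause selectivities are just
  -- multiplied, so the two-clause estimate falls far below one row and
  -- gets clamped to 1.  The single-clause estimate is also 1 row, so a
  -- scan of t_a_idx and a scan of t_a_b_idx look equally selective to
  -- the planner; the information that the pre-clamp estimates differed
  -- is lost.
  EXPLAIN SELECT * FROM t WHERE a = 42 AND b = 42;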
The new patch (attached) fixes this problem.
--
Regards
Andrey Lepikhov
Postgres Professional