Re: [HACKERS] [PATCH] Overestimated filter cost and its mitigation - Mailing list pgsql-hackers

From Yuto Hayamizu
Subject Re: [HACKERS] [PATCH] Overestimated filter cost and its mitigation
Date
Msg-id CANE+7D9CJZc4RsJqq5oC7TptDyVPbrvex618deprGknkgsMqqA@mail.gmail.com
In response to Re: [HACKERS] [PATCH] Overestimated filter cost and its mitigation  (Yuto Hayamizu <y.hayamizu@gmail.com>)
Responses Re: [HACKERS] [PATCH] Overestimated filter cost and its mitigation
List pgsql-hackers
On Fri, Jan 19, 2018 at 5:07 PM, Yuto Hayamizu <y.hayamizu@gmail.com> wrote:
> My idea for improving this patch: given a threshold N_limit, for
> q_1 ... q_{N_limit}, do the same weighted cost estimation as in the
> current version of this patch.
> For q_{N_limit+1} ...., stop calling clauselist_selectivity for
> calculating the weight
> and reuse the result of clauselist_selectivity({q_1, q_2, ..., q_{N_limit}}).
> For example, if N_limit=100, the additional overhead is only
> sub-milliseconds per range table entry,
> and the cost estimation is surely better than the current postgres implementation.

The attached patch implements the improvement idea above.
With this patch applied, the performance degradation of a test query
with many quals was under 1%.
An example test query is also attached.

regards,

----
Yuto Hayamizu

Attachment
