On 1/25/17 12:43, Simon Riggs wrote:
> On 25 January 2017 at 17:34, Julian Markwort
> <julian.markwort@uni-muenster.de> wrote:
>
>> Analogous to this, a bad_plan is saved when the time has been exceeded by a
>> factor greater than 1.1.
> ...and the plan differs?
>
> Probably best to use some stat math to calculate deviation, rather than fixed %.
Yeah, it seems to me too that this needs somewhat deeper analysis. I
don't see offhand why a 10% deviation in execution time would be a
reasonable threshold for "good" or "bad". A deviation-based approach
like the one you allude to would be better.
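To illustrate the idea (this is only a sketch of one possible
deviation-based approach, not anything from the patch): track a running
mean and variance of execution times per query, e.g. with Welford's
algorithm, and flag an execution as an outlier only when it falls more
than k standard deviations above the mean. The k=2.0 cutoff and the
five-sample warm-up below are arbitrary illustrative choices.

```python
import math

class ExecStats:
    """Running mean/variance of execution times (Welford's algorithm)."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # sum of squared deviations from the running mean

    def add(self, t):
        # Incrementally update mean and m2 with a new execution time t.
        self.n += 1
        delta = t - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (t - self.mean)

    def stddev(self):
        # Sample standard deviation; zero until we have two samples.
        return math.sqrt(self.m2 / (self.n - 1)) if self.n > 1 else 0.0

    def is_outlier(self, t, k=2.0):
        # Require a few samples first so one slow run doesn't dominate.
        if self.n < 5:
            return False
        return t > self.mean + k * self.stddev()
```

With this, a run is only considered "bad" relative to the observed
spread of that query's own timings, rather than against a fixed 10%
margin that may be far too tight for noisy queries and far too loose
for stable ones.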
The other problem is that this measures execution time, which can vary
for reasons other than the plan. I would have expected that the cost
numbers would be tracked somehow.
There is also the issue of generic vs. custom plans, which this
approach might be papering over.
Needs more thought.
--
Peter Eisentraut http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services