Re: A costing analysis tool - Mailing list pgsql-hackers

From: Tom Lane
Subject: Re: A costing analysis tool
Date:
Msg-id: 2887.1129413961@sss.pgh.pa.us
In response to: Re: A costing analysis tool (Greg Stark <gsstark@mit.edu>)
Responses: Re: A costing analysis tool
List: pgsql-hackers
Greg Stark <gsstark@mit.edu> writes:
> If the optimizer didn't collapse the cost for each node into a single value
> and instead retained the individual parameters at each node it could bubble
> those values all the way up to the surface. Then use the configuration options
> like random_page_cost etc to calculate the resulting cost once.

Hardly --- how will you choose the best subplans if you don't calculate
their costs?
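
For concreteness, a rough sketch of the scheme under discussion, with made-up
struct, field, and parameter names rather than anything in the real planner
code; it shows why the collapse step cannot be postponed past any point where
two paths have to be compared:

    /* Hypothetical sketch only, not the actual planner data structures:
     * keep a path's cost as per-resource coefficients instead of one
     * scalar, and collapse as late as possible. */
    typedef struct CostVector
    {
        double  seq_pages;      /* sequential page fetches */
        double  random_pages;   /* random page fetches */
        double  cpu_tuples;     /* tuples processed */
    } CostVector;

    /* Collapsing with the configuration weights (random_page_cost and
     * friends; parameter names here are illustrative) is exactly the step
     * that cannot wait until the top of the plan: choosing the cheaper of
     * two subplans requires a single comparable number. */
    static double
    collapse_cost(const CostVector *c,
                  double seq_page_cost,
                  double random_page_cost,
                  double cpu_tuple_cost)
    {
        return c->seq_pages * seq_page_cost
             + c->random_pages * random_page_cost
             + c->cpu_tuples * cpu_tuple_cost;
    }
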

It might be possible to remember where the costs came from, but I'm
unconvinced that there's much gold to be mined that way.

I'm also a bit suspicious of the "it's all a linear equation" premise,
because the fact of the matter is that the cost estimates are already
nonlinear, and are likely to get more so rather than less so as we learn
more.  A case in point is that the reason nestloop costing sucks so
badly at the moment is that it fails to account for cache effects in
repeated scans ... which is definitely a nonlinear effect.
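
To illustrate the sort of nonlinearity meant here (an illustrative, made-up
formula, not the actual cost model): the physical fetches an inner scan pays
over many repetitions saturate once its working set starts staying in cache,
so they cannot be expressed as a fixed per-node coefficient on
random_page_cost.

    /* Illustrative only: a made-up estimate of total physical page fetches
     * for nloops repeated inner scans that each touch "pages" distinct
     * pages, given roughly "cache_pages" of cache.  The total grows
     * sublinearly in nloops once the working set fits, which no fixed
     * linear combination of per-scan parameters can capture. */
    static double
    repeated_scan_fetches(double pages, double nloops, double cache_pages)
    {
        if (pages <= cache_pages)
        {
            /* Working set fits in cache: pay for it roughly once; later
             * loops find the pages already resident. */
            return pages;
        }

        /* Working set exceeds cache: each later loop re-fetches only the
         * portion evicted since the previous pass. */
        return pages + (nloops - 1.0) * (pages - cache_pages);
    }
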
        regards, tom lane

