On Mon, 2005-10-10 at 15:14 -0500, Kevin Grittner wrote:
> We are looking at doing much more with PostgreSQL over the
> next two years, and it seems likely that this issue will come up
> again where it is more of a problem. It sounded like there was
> some agreement on HOW this was to be fixed, yet I don't see
> any mention of doing it in the TODO list.
> Is there any sort of
> estimate for how much programming work would be involved?
The main work here is actually performance testing, not programming. The
cost model is built around an understanding of the timings and costs
involved in query execution.
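(For readers unfamiliar with what a planner cost model looks like, here is a
minimal sketch using the sequential-scan formula described in the PostgreSQL
manual. The parameter defaults shown are the documented settings, not
measured values, and the real planner is considerably more detailed:)

```python
# Illustrative sketch of a planner cost estimate, roughly following
# PostgreSQL's documented seq-scan formula:
#   cost = pages_read * seq_page_cost + rows_processed * cpu_tuple_cost
# Default values (seq_page_cost=1.0, cpu_tuple_cost=0.01) are taken from
# the manual; real planner behaviour involves many more terms.

def seq_scan_cost(pages, tuples, seq_page_cost=1.0, cpu_tuple_cost=0.01):
    """Estimated cost of scanning `pages` disk pages holding `tuples` rows."""
    return seq_page_cost * pages + cpu_tuple_cost * tuples

# e.g. a 10,000-page table containing 1,000,000 rows:
print(seq_scan_cost(10_000, 1_000_000))  # -> 20000.0
```

Improving the model means checking estimates like this against measured
timings across many cases, which is exactly the testing work described below.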
Once we have timings covering a sufficiently large range of cases, we
can derive the cost model. Once derived, we can program it. Discussing
improvements to the cost model without test results is never likely to
convince people. Everybody knows the cost models can be improved; the
only questions are: in what cases, and in what ways?
So deriving the cost model needs lots of trustworthy test results that
can be assessed and discussed, so we know how to improve things. [...and
I don't mean 5 minutes with pg_bench...]
Detailed analysis such as that is time consuming and also needs to be
done in a sufficiently reproducible manner that we can rely on it.
Your help would be greatly appreciated in that area. You and your team
clearly have an eye for the fine detail of these issues.
...IIRC there is a TODO item relating to that.
Perhaps we should put a more general call on the TODO list for
detailed, complete, accurate and reproducible performance test results?
Best Regards, Simon Riggs