I've been discussing this with Josh, Heikki, and Peter E. over the past
few weeks.
As Peter observed years ago, prepared statements can perform badly
because their plans are overly generic. Also, statistics change and
sometimes plans should change with them. It would be nice if we could
avoid users having to worry about these things.
I have some ideas that I'm willing to work on, if people agree that
they're useful. These are simple changes; the goal is to avoid
pathological performance at minimal cost, not to make prepared
statements faster than unprepared, parameterized queries in every case.
The ideas interact in various ways.
= Projected-cost threshold =
If a prepared statement takes parameters, and the generic plan has a
high projected cost, re-plan each EXECUTE individually with all its
parameter values bound. It may or may not help, but unless the planner
is vastly over-pessimistic, re-planning isn't going to dominate
execution time for these cases anyway.
= Actual-cost threshold =
Also stop using the generic plan if the statement takes a long time to
run in practice. The statistics the plan was built on may have gone
stale. It could also be a one-off slowdown due to a load peak or
something, but that's handled by:
= Plan refresh =
Periodically re-plan prepared statements on EXECUTE. This is also a
chance for queries that were being re-planned every time to go back to a
generic plan.
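Here's a similarly hypothetical sketch of the bookkeeping that the
actual-cost threshold and plan refresh together imply; the threshold and
interval values are just placeholders, not proposed defaults.

/*
 * Sketch of the actual-cost threshold and periodic plan refresh.
 * Invented names throughout; just the bookkeeping the two ideas imply.
 */
#include <stdbool.h>

typedef struct ExecStats
{
    bool use_generic_plan;      /* current mode for this prepared statement */
    long executions;            /* EXECUTEs since the last refresh */
} ExecStats;

static const double slow_runtime_threshold_ms = 1000.0;  /* assumed knob */
static const long   refresh_interval = 100;              /* assumed knob */

/* Called after each EXECUTE with the measured wall-clock runtime. */
static void
update_plan_choice(ExecStats *stats, double runtime_ms)
{
    stats->executions++;

    /*
     * Actual-cost threshold: the generic plan turned out slow in practice
     * (perhaps because statistics went stale), so plan future EXECUTEs
     * individually, with their parameter values bound.
     */
    if (stats->use_generic_plan && runtime_ms > slow_runtime_threshold_ms)
        stats->use_generic_plan = false;

    /*
     * Plan refresh: every so often, forget the earlier decision so that a
     * statement demoted by a one-off slow run (a load peak, say) gets a
     * fresh generic plan and a chance to prove itself again.
     */
    if (stats->executions >= refresh_interval)
    {
        stats->executions = 0;
        stats->use_generic_plan = true;
    }
}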
Does that sound useful? Maybe it's too much engineering for little gain
compared to simply re-planning every EXECUTE. OTOH it doesn't look much
harder than going that way either. And some extra planning effort might
be worthwhile for a reusable plan even where it wouldn't pay off for an
individual query.
Jeroen