Douglas Thomson <dougt@mugc.cc.monash.edu.au> writes:
> My question for the list: Is there something in the query optimiser
> that will take a time that is something like exponential on the number
> of tables being joined?
Yes. See the manual's discussion of genetic query optimization. You
might try setting the GEQO threshold to something less than 10.
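(The exact knob name and syntax have varied across Postgres versions; in releases that expose it as a SET variable it looks roughly like this:)

```sql
-- Lower the table-count threshold above which the genetic query
-- optimizer is used instead of the exhaustive search; with this
-- setting, joins of 8 or more tables go through GEQO.
SET geqo_threshold = 8;
```

Check your release's documentation for the form your server accepts.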
FWIW, the time you are looking at here is strictly planning time and
will not increase when you actually put data in the tables. Unless
you foresee fairly small final tables (where execution is so fast
that planning would dominate), the planning time is not likely to
be a significant factor in this range of query sizes.
Still, if the purpose of the join is just to substitute descriptions
for IDs, you'd be well advised to consider doing it via table lookup
on the frontend side, assuming your frontend is coded in a language
that makes that reasonable to do. The Postgres optimizer cannot
help but waste many cycles per query reverse-engineering your intent,
even assuming that it ultimately comes up with the best possible
query plan for the join...
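As a minimal sketch of the frontend-lookup idea (table and column names here are hypothetical, standing in for whatever your schema uses): fetch each small lookup table once, cache it as an in-memory map, and substitute descriptions for IDs in the client rather than joining in the query.

```python
def load_lookup(rows):
    """Build an in-memory id -> description map from a lookup table's rows
    (rows as fetched once, e.g. via SELECT id, description FROM ...)."""
    return {row_id: desc for row_id, desc in rows}

def resolve(records, lookups):
    """Replace ID-valued fields with their descriptions via dict lookup.
    `lookups` maps a field name to its cached id -> description map."""
    out = []
    for rec in records:
        resolved = dict(rec)
        for field, table in lookups.items():
            # Fall back to the raw ID if it has no description.
            resolved[field] = table.get(rec[field], rec[field])
        out.append(resolved)
    return out

# Hypothetical data standing in for query results:
status_lookup = load_lookup([(1, "open"), (2, "closed")])
orders = [{"order_no": 100, "status": 1},
          {"order_no": 101, "status": 2}]

resolved = resolve(orders, {"status": status_lookup})
```

Each lookup table is read once instead of being re-joined on every query, so the planner only ever sees the one main table.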
regards, tom lane