Antoine Bajolet <antoine.bajolet@free.fr> writes:
> We are using postgresql in a search engine on an intranet handling
> thousands of documents.
> But we have a big problem when users use more than two search keys.
I think you need to increase the statistics targets for your keywords
table --- the estimated numbers of matching rows are much too small:
>   ->  Index Scan using keyword_pattern_key on keywords k2  (cost=0.00..3.51 rows=1 width=4) (actual time=0.078..1.887 rows=75 loops=1)
>         Index Cond: (((keyword)::text ~>=~ 'exploitation'::character varying) AND ((keyword)::text ~<~ 'exploitatioo'::character varying))
>         Filter: ((keyword)::text ~~ 'exploitation%'::text)
A factor-of-75 error is quite likely to mislead the planner into
choosing a bad join plan.
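
Something along these lines should help (a sketch only; 1000 is just an
example target, and you may want to raise it for other frequently searched
columns as well):

	-- raise the per-column statistics target so ANALYZE samples
	-- more keyword values, then re-gather statistics
	ALTER TABLE keywords ALTER COLUMN keyword SET STATISTICS 1000;
	ANALYZE keywords;

After that, re-run EXPLAIN ANALYZE and see whether the estimated rowcounts
come closer to the actual ones.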
BTW, have you looked into using a real full-text-search engine (eg,
tsearch2) instead of rolling your own like this?
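
If you do go that route, the basic shape is roughly like this (untested
sketch; assumes contrib/tsearch2 is installed and uses a hypothetical
documents(id, body) table in place of whatever your schema really is):

	-- add a tsvector column, populate it, and index it with GiST
	ALTER TABLE documents ADD COLUMN body_idx tsvector;
	UPDATE documents SET body_idx = to_tsvector(body);
	CREATE INDEX documents_body_gist ON documents USING gist (body_idx);

	-- keep the tsvector column current on insert/update
	CREATE TRIGGER documents_tsvectorupdate BEFORE INSERT OR UPDATE
		ON documents FOR EACH ROW
		EXECUTE PROCEDURE tsearch2(body_idx, body);

	-- find documents matching both words
	SELECT id FROM documents
	WHERE body_idx @@ to_tsquery('exploitation & maintenance');

That avoids the multi-way self-join on the keywords table entirely, which
is where your planner trouble is coming from.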
regards, tom lane