On Wed, 28 May 2003, Dave Tenny wrote:
> Having grepped the web, it's clear that this isn't the first or last
> time this issue will be raised.
>
> My application relies heavily on IN lists. The lists are primarily
> constant integers, so queries look like:
>
> SELECT val FROM table WHERE id IN (43, 49, 1001, 100002, ...)
>
> Performance is critical, and the size of these lists depends a lot on
> how the larger 3-tier application is used,
> but it wouldn't be out of the question to retrieve 3000-10000 items.
>
> PostgreSQL 7.3.2 seems to have a lot of trouble with large lists.
It gets converted into a sequence like col = list[0] OR col = list[1], and
the planner/optimizer seems to take a large amount of time on that: for me,
EXPLAIN alone takes just over 80 seconds for a 9900-item list on my machine
(I don't have a data-filled table to run the actual query against).
Right now, the best plan probably comes from creating a temporary table,
copying the values into it, and joining against it.
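A minimal sketch of that workaround in psql, reusing the table and column
names from the example query above (adjust types and names to match the
real schema):

```sql
-- Session-local table to hold the constant ids
CREATE TEMP TABLE ids (id integer);

-- Bulk-load the values; COPY is much faster than thousands of INSERTs
COPY ids FROM STDIN;
43
49
1001
100002
\.

-- Give the planner row estimates for the new table
ANALYZE ids;

-- Join instead of the large IN list; the planner can pick a hash or
-- merge join rather than expanding thousands of OR clauses
SELECT t.val
FROM "table" t
JOIN ids ON t.id = ids.id;
```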
> 2) What is the expected acceptable limit for the number of items in an
> IN list predicate such as
> those used here. (List of constants, not subselects).
As a note, 7.4 by default seems to limit the list to 10000 items unless you
raise max_expr_depth, as far as I can see.
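If that limit is hit, the parameter can be raised for the session (the
value here is just an example, not a recommendation):

```sql
SET max_expr_depth TO 30000;
```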