BUG #7591: Large query string has limit

From
pking@nrcan.gc.ca
Date:
The following bug has been logged on the website:

Bug reference:      7591
Logged by:          Pat King
Email address:      pking@nrcan.gc.ca
PostgreSQL version: 9.1.3
Operating system:   RedHat Linux 6.2
Description:


Hi :

I am dealing with the open source program called "Geonetwork". It is passing
the following query to postgres/postgis:

SELECT "fid", asText("the_geom") FROM "public"."spatialindex" WHERE ("fid" =
'10') OR ("fid" = '100') OR ("fid" = '1000') OR ("fid" = '10001') OR ("fid"
= '10005') ..........

Unfortunately, the query has > 50,000 OR clauses, and the query hangs.

I tried the query for < 50,000 OR clauses with success.

The spatialindex table is :

geonetwork=# \d spatialindex
          Table "public.spatialindex"
  Column  |          Type          | Modifiers
----------+------------------------+-----------
 fid      | integer                | not null
 id       | character varying(250) |
 the_geom | geometry               |
Indexes:
    "spatialindex_pkey" PRIMARY KEY, btree (fid)
    "spatialindexndx1" btree (id)
    "spatialindexndx2" gist (the_geom)
Check constraints:
    "enforce_dims_the_geom" CHECK (st_ndims(the_geom) = 2)
    "enforce_geotype_the_geom" CHECK (geometrytype(the_geom) =
'MULTIPOLYGON'::text OR the_geom IS NULL)
    "enforce_srid_the_geom" CHECK (st_srid(the_geom) = 4326)

Re: BUG #7591: Large query string has limit

From
Tom Lane
Date:
pking@nrcan.gc.ca writes:
> I am dealing with the open source program called "Geonetwork". It is passing
> the following  query to postgres/postgis :

> SELECT "fid", asText("the_geom") FROM "public"."spatialindex" WHERE ("fid" =
> '10') OR ("fid" = '100') OR ("fid" = '1000') OR ("fid" = '10001') OR ("fid"
> = '10005') ..........

> Unfortunately, the query has > 50,000 OR clauses, and the query hangs.

[ rolls eyes ... ]  You will not find a lot of DBMSes that don't have
indigestion with that.  It's possible that converting to an IN list
would be less inefficient, but personally I'd suggest putting all the
target IDs into a temp table and joining to that.

            regards, tom lane
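
[Editor's note: the two rewrites suggested above can be sketched as follows. Table and column names are taken from the report; the temp table name `target_ids` is hypothetical, and the ID values shown are just the first few from the original query.]

```sql
-- Option 1: IN-list form. Shorter than 50,000 ORs, but the parser and
-- planner still have to chew through one enormous expression.
SELECT "fid", asText("the_geom")
FROM "public"."spatialindex"
WHERE "fid" IN (10, 100, 1000, 10001, 10005 /* , ... */);

-- Option 2: temp-table form. Load the target IDs once (batched inserts
-- or COPY), then let the planner do an ordinary join.
CREATE TEMP TABLE target_ids (fid integer PRIMARY KEY);

INSERT INTO target_ids VALUES (10), (100), (1000), (10001), (10005);
-- ... remaining IDs in further batched INSERTs, or via COPY ...

ANALYZE target_ids;  -- give the planner row estimates for the new table

SELECT s."fid", asText(s."the_geom")
FROM "public"."spatialindex" s
JOIN target_ids t USING (fid);
```

With the temp table in place, the join can use the existing `spatialindex_pkey` btree index on `fid` instead of evaluating tens of thousands of OR branches per row.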