I had already tried setting the statistics to 1000 for all five of these
double precision fields, with effectively no improvement. I should have
mentioned that.
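
For the record, what I ran was roughly the following (paraphrasing from
memory; the table and column names are the ones that show up in the plan
below):

    ALTER TABLE nrgfeature ALTER COLUMN upperrightx  SET STATISTICS 1000;
    ALTER TABLE nrgfeature ALTER COLUMN lowerleftx   SET STATISTICS 1000;
    ALTER TABLE nrgfeature ALTER COLUMN upperrighty  SET STATISTICS 1000;
    ALTER TABLE nrgfeature ALTER COLUMN lowerlefty   SET STATISTICS 1000;
    ALTER TABLE nrgfeature ALTER COLUMN diagonalsize SET STATISTICS 1000;
    -- re-collect statistics so the new targets take effect
    ANALYZE nrgfeature;
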
Also, the BETWEEN doesn't help for any value of diagonalSize, since it is
effectively doing diagonalSize > X AND diagonalSize < Y. If I run a query
with the same values for the four x,y conditions and only diagonalSize < Y,
then for Y = 49.999 the query is fast, but for anything 50.000 and greater
it is slow. Not surprisingly, that is the exact opposite of the
greater-than queries.
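
To be concrete, the queries I am comparing look roughly like this (the x,y
values are the ones from the plan below; only the diagonalSize cutoff
changes):

    -- fast for a cutoff of 49.999, slow for 50.000 and greater
    SELECT *
      FROM nrgfeature
     WHERE upperrightx > 321264.236977215
       AND lowerleftx  < 324046.799812083
       AND upperrighty > 123286.261898636
       AND lowerlefty  < 124985.927450476
       AND diagonalsize < 49.999;
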
I also think I originally reported that the two queries gave the same
number of rows. That is not true. It was true when I had other joins in,
but when I stripped the query down to this core problem I failed to notice
that the number of results now differs between the two.
If I take away the diagonalSize condition in my query, I find that there
are 225 rows that satisfy the other conditions. 155 of these have a
diagonalSize value of exactly 50.000, while the remaining 70 rows all have
values larger than 50. Thus there is a big discrete jump in the number of
rows at a diagonalSize of 50. However, the statistics are off by two
orders of magnitude in guessing how many rows the query will return in
this case, so the planner is not using my index. How can I fix that?
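
That breakdown comes from a count along these lines (again paraphrasing,
with the same x,y values as above):

    -- 225 rows match the four x,y conditions; 155 of them have
    -- diagonalsize = 50.000 exactly and the other 70 are larger
    SELECT diagonalsize, count(*)
      FROM nrgfeature
     WHERE upperrightx > 321264.236977215
       AND lowerleftx  < 324046.799812083
       AND upperrighty > 123286.261898636
       AND lowerlefty  < 124985.927450476
     GROUP BY diagonalsize
     ORDER BY diagonalsize;
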
Ken
On Wed, 2004-06-09 at 14:29, Rod Taylor wrote:
> > ... and here is the plan with statistics set to 1000 ...
> >
> > Seq Scan on nrgfeature f (cost=0.00..31675.57 rows=18608 width=218)
> > (actual time=63.544..1002.701 rows=225 loops=1)
> > Filter: ((upperrightx > 321264.236977215::double precision) AND
> > (lowerleftx < 324046.799812083::double precision) AND (upperrighty >
> > 123286.261898636::double precision) AND (lowerlefty <
> > 124985.927450476::double precision) AND (diagonalsize > 49.999::double
> > precision))
>
> It's better like this, but still way off the mark. Even your good query
> which uses the index was out by more than an order of magnitude.
>
> Try raising the statistics levels for upperrightx, lowerleftx,
> upperrighty and lowerlefty.
>
> Failing that, you might be able to push it back down again by giving
> diagonalsize an upper limit. Perhaps 500 is a value that would never
> occur.
>
> AND (diagonalsize BETWEEN 49.999::double precision AND 500)
>