Hi,
Shouldn't the optimizer use indices if the LIKE condition does not have any
wildcards?
An example:
girgen=# explain analyze select * from person where userid = 'girgen';
                                                      QUERY PLAN
----------------------------------------------------------------------------------------------------------------------
 Index Scan using person_pkey on person  (cost=0.00..5.98 rows=1 width=84) (actual time=0.034..0.039 rows=1 loops=1)
   Index Cond: (userid = 'girgen'::text)
 Total runtime: 0.091 ms
(3 rows)
girgen=# explain analyze select * from person where userid like 'girgen';
                                            QUERY PLAN
---------------------------------------------------------------------------------------------------
 Seq Scan on person  (cost=0.00..77.08 rows=1 width=84) (actual time=1.137..1.143 rows=1 loops=1)
   Filter: (userid ~~ 'girgen'::text)
 Total runtime: 1.193 ms
(3 rows)
The result cannot differ between the two cases. The second query does not
use the index since the database is initialized with a locale,
sv_SE.ISO8859-1, which I need for correct sorting. (Still dreaming about
indices with LIKE and locale...) But since there is no wildcard in the
string 'girgen', the planner should easily be able to use the index, if it
only bothered to check that the pattern contains no wildcards, right?
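
For what it's worth, one workaround that I believe helps on a non-C locale
(assuming a PostgreSQL version that has the text_pattern_ops operator class,
7.4 or later; the index name below is made up) is to add a second index
declared for pattern matching:

-- btree index using the pattern-matching operator class; the planner can
-- use it for left-anchored LIKE patterns even under sv_SE, while the
-- existing person_pkey still handles plain = and locale-correct ORDER BY
create index person_userid_like_idx on person (userid text_pattern_ops);

With that in place, "explain analyze select * from person where userid like
'girgen';" should show an index scan instead of the seq scan above.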
Another thing on the same subject:
I use an app that builds searches using some standard method, and it always
wants to search case-insensitively. Hence, it uses ILIKE instead of `=',
even for joins, and even for integers. This is a bit lazy, and no doubt
wrong, but the odd thing I realized was that the optimizer didn't make use
of the indices. Same thing here: the optimizer should ideally know that it
is dealing with integers, where ILIKE and LIKE have no meaning, and use `='
implicitly instead, hence using the indices. This one might be kind of low
priority, but the one above really isn't, IMO.
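
Just to illustrate that second point with a made-up example (the table and
column are hypothetical, and this assumes a version where the integer is
implicitly cast to text for ILIKE rather than raising an error):

-- hypothetical schema: orders(order_id integer primary key)
-- the ILIKE form is rewritten to something like (order_id::text ~~* '42'),
-- so the btree index on the integer column cannot be used:
select * from orders where order_id ilike '42';
-- the plain comparison can use the primary key index:
select * from orders where order_id = 42;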
/Palle