I have a table in Postgres that's around 50 MB. The data is normalized
so there's not much I can do about the size. The tuples are about 512
bytes, so there's a pile of 'em. I need to search on several fields; a
couple in particular are text fields that need 'LIKE'. The problem is,
the thing is way too slow. So, before I go hunting for some other
solution, could anyone here point me to some ways to (hand) optimize
the searching in Postgres? Different indexes, hashing, LIKE? I'm not
sure where to go with this.
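For concreteness, the hot queries are roughly this shape (the table and
column names below are made up, not my real schema). The trigram index
is just a guess at the kind of thing I might need; it assumes the
pg_trgm contrib module is available and a server new enough to use it
for LIKE:

  -- typical query: leading-wildcard LIKE on a text column
  SELECT part_no, description
    FROM parts
   WHERE description LIKE '%some text%';

  -- is a trigram index the right sort of tool for this?
  CREATE EXTENSION pg_trgm;
  CREATE INDEX parts_descr_trgm
      ON parts USING gin (description gin_trgm_ops);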
The basic criteria are:
- the size of indexes, etc. is not an issue; there's lots of room on
the box.
- the data is basically static, so a read-only setup (if such a thing
exists) is fine (rough idea sketched after the list).
- it needs to be FAST
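Here's the sort of thing I was considering for the static-data angle.
Again, the names are made up, and as far as I can tell the pattern-ops
index only helps when the wildcard is at the end of the pattern, so
corrections welcome:

  -- btree with text_pattern_ops: as I understand it, this only helps
  -- prefix patterns like 'foo%', not '%foo%'
  CREATE INDEX parts_descr_prefix
      ON parts (description text_pattern_ops);

  -- since the data never changes, reorder the table around the index
  -- once and refresh the planner stats
  CLUSTER parts USING parts_descr_prefix;
  VACUUM ANALYZE parts;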
cheers
jb