[I hope job postings are kosher...]
I need help optimizing a PostgreSQL application:
- Full-text search
- ~17,000 records
- Articles (text) average about 10K in length, ranging from 0 to 278K.
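For context, here is a sketch of the kind of query that is typically behind this sort of slowdown (the table and column names are my assumptions, not from the original post):

```sql
-- Hypothetical schema: "articles" with a text column "body".
-- A leading-wildcard LIKE pattern like this cannot use a plain
-- btree index, so PostgreSQL must sequentially scan all ~17,000
-- rows and search every article body:
SELECT id, title
FROM articles
WHERE body LIKE '%searchterm%';
```

If the queries look like this, no amount of RAM alone will turn a full scan of ~170 MB of text into a 5-second operation; some form of precomputed word index (a concordance, as you say) is usually needed.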
I don't know whether we need more RAM, more disk, more sort/shared memory configured in postgresql.conf, or a concordance built over the text, or whether this just isn't something that can be done within our budget.
I can't even seem to get PostgreSQL's profiling output (starting postmaster and the client with "-s") to determine what the database engine is doing.
I also don't understand why PostgreSQL sometimes chooses a sequential scan instead of using an existing INDEX. Does it really estimate that the sequential scan will be faster, or does it rule out the index scan because there wouldn't be enough disk or swap space to do it?
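One way to see which choice the planner is making, without the "-s" profiling output, is EXPLAIN. A minimal sketch (again, "articles"/"body" are hypothetical names):

```sql
-- Show the plan the optimizer chose for the slow query:
EXPLAIN
SELECT id FROM articles WHERE body LIKE '%searchterm%';

-- To test whether the planner's choice is actually right,
-- temporarily discourage sequential scans for this session
-- and compare the resulting plan and runtime:
SET enable_seqscan TO off;
EXPLAIN
SELECT id FROM articles WHERE body LIKE '%searchterm%';
```

Note that the planner chooses a sequential scan on cost-estimate grounds (estimated rows and page fetches), not because of disk or swap limits, so stale statistics are a common culprit; running VACUUM ANALYZE first keeps those estimates current.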
Currently, full-text search queries take on the order of 2 minutes to execute. We need them to complete in about 5 seconds, if at all possible.
Unfortunately, this needs to happen EARLY THIS WEEK.
Contact me off-list with some idea of price/availability/references if you
are interested in taking on this task.
THANKS!