I have a query on a large table. The table contains approximately 100,000
entries, and the WHERE clause matches a 4-character string against a
100-character text column. This is hard work, and the query takes about
4 seconds on
my system. That is acceptable, but the problem is that I want to prepare
the result for human readers and therefore split it into several pages. So I
first run the query once to get the number of results; based on that
number I build a navigation bar and construct a LIMIT clause.
With that LIMIT clause I run the query a second time to fetch and display
the entries.
In effect I ask Postgres the same query twice (the only difference is
the LIMIT clause), and this takes 8 seconds instead of 4.
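To make the pattern concrete, here is a minimal sketch of the count-then-LIMIT approach described above, using Python with sqlite3 in place of Postgres; the table name, column name, and search string are made up for illustration:

```python
import sqlite3

# Hypothetical schema standing in for the real 100,000-row table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (body TEXT)")
conn.executemany("INSERT INTO entries VALUES (?)",
                 [("row %03d with needle" % i,) for i in range(25)])

PER_PAGE = 10
needle = "need"  # the 4-char search string

# Query 1: scan the table just to count matches for the navigation bar.
total = conn.execute(
    "SELECT count(*) FROM entries WHERE body LIKE ?",
    ("%" + needle + "%",)).fetchone()[0]
pages = (total + PER_PAGE - 1) // PER_PAGE  # pages needed for the nav bar

# Query 2: the same scan again, this time with LIMIT/OFFSET for one page.
page = 2
rows = conn.execute(
    "SELECT body FROM entries WHERE body LIKE ? LIMIT ? OFFSET ?",
    ("%" + needle + "%", PER_PAGE, (page - 1) * PER_PAGE)).fetchall()

print(total, pages, len(rows))  # prints "25 3 10"
```

Both statements repeat the same expensive text scan; only the LIMIT/OFFSET differs, which is exactly why the total time doubles.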
Is there any way to do this better?
Thanks,
peter
--
mag. peter pilsl
phone: +43 676 3574035
fax : +43 676 3546512
email: pilsl@goldfisch.at
sms : pilsl@max.mail.at
pgp-key available