We are building a PostgreSQL-based backend database for our 'hosting
provisioning' system. In a vain attempt at some (what I thought were)
simple performance tweaks, I tried putting some of the larger and more
straightforward queries into functions. With everything else the same,
the functions are on the whole slower. Should they be?
The whole thing is being driven through Perl DBI. This may be a
contributing factor.
Anyhow, the original query:
SELECT COUNT(mb.instance)
FROM domain dm, mail ms, mailbox mb
WHERE dm.enabled = true AND dm.existent = true AND dm.available = true
  AND ms.enabled = true AND ms.existent = true AND ms.available = true
  AND mb.enabled = true AND mb.existent = true AND mb.available = true
  AND dm.parent = ms.parent AND mb.parent = ms.serviceid
  AND dm.instance = $q_domain AND mb.instance = $q_local_part;
where the $q_XXX variables are quoted Perl scalars. The function is then:
CREATE FUNCTION mail_is_mailbox(text, text) RETURNS int4 AS '
    SELECT COUNT(mb.instance)
    FROM domain dm, mail ms, mailbox mb
    WHERE dm.enabled = true AND dm.existent = true AND dm.available = true
      AND ms.enabled = true AND ms.existent = true AND ms.available = true
      AND mb.enabled = true AND mb.existent = true AND mb.available = true
      AND dm.parent = ms.parent AND mb.parent = ms.serviceid
      AND dm.instance = $2 AND mb.instance = $1;
' LANGUAGE 'sql';
SELECT mail_is_mailbox($q_local_part, $q_domain);
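For reference, the client side looks roughly like this (a simplified
sketch only: the DSN, credentials and test values are placeholders rather
than our real code, but it shows how the $q_XXX scalars are built with
DBI's quote() and interpolated into the SQL sent on each iteration):

use strict;
use DBI;

# Placeholder DSN and credentials; real connection details elided.
my $dbh = DBI->connect('dbi:Pg:dbname=hosting', 'user', 'pass',
                       { RaiseError => 1 });

my ($local_part, $domain) = ('someuser', 'example.org');

# The $q_XXX values, quoted for direct interpolation into the SQL string.
my $q_local_part = $dbh->quote($local_part);
my $q_domain     = $dbh->quote($domain);

# Function version of the benchmark loop; the bare-query version is the
# same loop with the full SELECT ... WHERE ... string interpolated instead.
for (1 .. 1000) {
    my ($count) = $dbh->selectrow_array(
        "SELECT mail_is_mailbox($q_local_part, $q_domain)");
}

$dbh->disconnect;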
Running each of these 1000 times from a remote client (same subnet,
100BaseTX), with identical query results, the function version typically
takes 20 - 25% longer than the bare query: 22 vs 16 seconds, for example.
I would have thought that not sending the long SQL across the wire 1000
times would save some time, even without any potential gains from
pre-parsing the SQL. Am I wrong?
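One thing I have not tried yet, which might be the fairer comparison, is
preparing the statement once with placeholders and only executing it inside
the loop, something like the sketch below. I am not sure how much re-parsing
or re-sending DBD::Pg actually avoids in this case, which is partly why I am
asking.

my $sth = $dbh->prepare('SELECT mail_is_mailbox(?, ?)');
for (1 .. 1000) {
    $sth->execute($local_part, $domain);
    my ($count) = $sth->fetchrow_array;
}

# The same could be done with the bare query, using ? in place of the
# interpolated $q_domain and $q_local_part values.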
rgds,
--
Peter Galbavy
Knowledge Matters Ltd.
http://www.knowledge.com/