Hamid Khoshnevis wrote:
> In a web environment, a user can repeatedly start and abort large queries,
> which creates a large number of zombie postmasters. Is there a way to
> control the postmaster zombies so that they don't clog up the system
> resources?
I am assuming that by "web environment" you mean accessing a PostgreSQL
database via CGI, a Servlet, etc. If that is the case, it is usually easiest
to make sure your program gracefully handles a client disconnect by cleanly
disconnecting from the database. This usually involves a little more code and
stricter error checking on the part of your web application, but it will
reduce headaches in the long run.
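
For example, in a Servlet or other Java program using JDBC you can make sure
the backend is always released, even when the request dies in the middle of a
query, by closing things in finally blocks. This is just a minimal sketch; the
connection URL, credentials, and table name are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class ReportQuery {
        // Run a potentially long query, always releasing the backend when done.
        public static void runReport() throws SQLException {
            // Placeholder URL and credentials -- substitute your own.
            Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/mydb", "user", "password");
            try {
                Statement stmt = conn.createStatement();
                try {
                    ResultSet rs = stmt.executeQuery("SELECT * FROM big_table");
                    while (rs.next()) {
                        // write rows to the client; an error here still falls
                        // through to the finally blocks below
                    }
                } finally {
                    stmt.close();
                }
            } finally {
                // Closing the connection lets the postmaster child exit
                // cleanly instead of lingering.
                conn.close();
            }
        }
    }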
Depending on the programming environment you are using, you may be able to
employ a technique such as connection pooling to separate the actual database
access from the code handling client I/O. That way, even if the client
disconnects in the middle of a large query, the connection manager can still
cleanly disconnect from the database or clean up after the offending
statement.
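
A very small pool along these lines might look like the following. This is a
rough sketch only (a real pool would validate connections, handle timeouts,
roll back abandoned transactions, and so on), and the class and method names
are made up for illustration:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.LinkedList;

    // Hypothetical minimal connection pool: connections are opened once and
    // handed out to request threads, so a client abort never orphans a backend.
    public class SimplePool {
        private final LinkedList<Connection> idle = new LinkedList<Connection>();
        private final String url, user, password;

        public SimplePool(String url, String user, String password) {
            this.url = url;
            this.user = user;
            this.password = password;
        }

        // Hand out an idle connection, or open a new one if none are free.
        public synchronized Connection borrow() throws SQLException {
            if (idle.isEmpty()) {
                return DriverManager.getConnection(url, user, password);
            }
            return idle.removeFirst();
        }

        // Return a connection to the pool instead of closing it, so the
        // same Postgres backend can be reused by the next request.
        public synchronized void giveBack(Connection conn) {
            idle.addLast(conn);
        }
    }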
Utilizing a connection handler offers other benefits as well; namely, it can
eliminate the need to open a new database connection for every CGI process or
Servlet thread. That reduces overhead, since Postgres no longer has to start a
new backend process and authenticate the user for every request.
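
In a Servlet that might look roughly like this, reusing the hypothetical
SimplePool sketched above (again, the URL and credentials are placeholders and
would come from your own configuration):

    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.SQLException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ReportServlet extends HttpServlet {
        // One pool shared by all request threads.
        private static final SimplePool pool = new SimplePool(
                "jdbc:postgresql://localhost/mydb", "user", "password");

        public void doGet(HttpServletRequest req, HttpServletResponse res)
                throws ServletException, IOException {
            Connection conn = null;
            try {
                conn = pool.borrow();
                // ... run the query and write the response here ...
            } catch (SQLException e) {
                throw new ServletException(e);
            } finally {
                // Always hand the connection back, even if the client aborted.
                if (conn != null) {
                    pool.giveBack(conn);
                }
            }
        }
    }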
~Stephen
==
Stephen J Lombardo
Web/Database Application Developer
Montclair State University
==