Overload - Mailing list pgsql-general

From: Stefan Krompass
Subject: Overload
Date:
Msg-id: 425FDEF8.6060503@yahoo.com.hk
Responses: Re: Overload (peter pilsl <pilsl@goldfisch.at>)
List: pgsql-general
Hi,

  I'd like to implement a system that prevents a PostgreSQL database from
being overloaded, by delaying queries when the database is already highly
loaded, i.e. when the sum of the execution costs of the queries currently
running in the database is near a certain threshold and executing the
"next" query would push the total past that threshold.
  Limiting the number of queries concurrently running in the database to a
fixed number n is out of the question, since - in my opinion - n simple
queries like
   SELECT c FROM t WHERE c = '...';
would generally produce a much lower workload than n complex queries.
So the goal is a more dynamic approach, roughly along the lines of the
sketch below.
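  Roughly, what I have in mind is something like the following sketch
(Python-ish pseudocode made runnable; the threshold value and the
estimate_cost() helper are placeholders - the helper is exactly the part
I am missing):

    import threading

    COST_THRESHOLD = 100000.0   # placeholder: maximum summed cost to allow
    current_cost = 0.0          # summed cost of the queries currently running
    cond = threading.Condition()

    def estimate_cost(query):
        """Placeholder: return an estimate of the query's execution cost."""
        raise NotImplementedError

    def run_with_admission_control(conn, query):
        global current_cost
        cost = estimate_cost(query)
        with cond:
            # delay the query while admitting it would pass the threshold
            while current_cost > 0 and current_cost + cost > COST_THRESHOLD:
                cond.wait()
            current_cost += cost
        try:
            cur = conn.cursor()
            cur.execute(query)
            return cur.fetchall()
        finally:
            with cond:
                current_cost -= cost
                cond.notify_all()
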
  But my problem is how to measure the execution costs of a query. My
first thought was to use the optimizer's estimates, but these estimates
only give me the (estimated) time needed to execute the query.
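  (For what it's worth, the only way I have found so far to get at these
estimates from a client is to run EXPLAIN and parse the cost figures out
of the plan text. A rough sketch, here in Python with psycopg2; the
regular expression is just my own guess at a workable way to parse it:)

    import re
    import psycopg2

    def planner_cost(conn, query):
        """Return the planner's total cost estimate for a query
        without executing it."""
        cur = conn.cursor()
        cur.execute("EXPLAIN " + query)
        first_line = cur.fetchone()[0]
        # e.g. "Seq Scan on t  (cost=0.00..35.50 rows=10 width=4)"
        m = re.search(r"cost=(\d+\.\d+)\.\.(\d+\.\d+)", first_line)
        return float(m.group(2))   # upper bound of the estimated cost range
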
  I know that the term "execution costs" is somewhat imprecise. Ideally,
the execution cost would be a single value that "merges" the I/O and the
CPU usage of the query (to be more precise: estimates of the I/O and CPU
usage, since the value is needed before the query runs). I've read the
developer manuals but didn't find any information on this. Does
PostgreSQL offer information on the additional workload (execution costs)
caused by a query? In case it does not: does anybody have an idea how to
get an estimate of the execution costs before executing a query?

Thanks in advance

Stefan
