Thread: Is there a way to limit CPU usage per user

Is there a way to limit CPU usage per user

From
Luki Rustianto
Date:
Hi All,

Is there a way to limit a user's CPU resources, especially on "SELECT" queries?

I have a table with a lot of rows. If one sloppy DB user does a
"SELECT * FROM bigtable"
then CPU usage will go to nearly 99%, and this will surely affect
the performance of everything else on the database...

Thanks.

Re: Is there a way to limit CPU usage per user

From
Richard Huxton
Date:
Luki Rustianto wrote:
> Hi All,
>
> Is there a way to limit a user's CPU resources, especially on "SELECT" queries?
>
> I have a table with a lot of rows. If one sloppy DB user does a
> "SELECT * FROM bigtable"
> then CPU usage will go to nearly 99%, and this will surely affect
> the performance of everything else on the database...

Not really - this is something for the operating system, but most
don't provide sophisticated ways of managing processes running under
one user (postgres in this case).

If there are 5 queries going on, each should end up getting about 20% of
the resources anyway, if your OS is balancing things properly.

If a query needs a lot of CPU and you restrict it, then that query will
presumably run for longer and so affect more queries from other users.

--
   Richard Huxton
   Archonet Ltd

Re: Is there a way to limit CPU usage per user

From
Merlin Moncure
Date:
> Is there a way to limit a user's CPU resources, especially on "SELECT" queries?
>
> I have a table with a lot of rows. If one sloppy DB user does a
> "SELECT * FROM bigtable"
> then CPU usage will go to nearly 99%, and this will surely affect
> the performance of everything else on the database...
>
Yes, the answer is not to write the query in the first place :-). You
can use cursors, do client-side browsing, or other techniques
to handle this problem more elegantly.

Try to follow the rule of thumb of returning the minimal amount of data
necessary to the client.
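
For example, a cursor lets a client browse bigtable in small batches
instead of pulling every row at once (a rough sketch; the batch size
is arbitrary):

BEGIN;
-- declare a cursor over the expensive query instead of running it outright
DECLARE big_cur CURSOR FOR SELECT * FROM bigtable;
-- fetch and process a manageable batch at a time
FETCH 1000 FROM big_cur;
FETCH 1000 FROM big_cur;
CLOSE big_cur;
COMMIT;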

Merlin

Re: Is there a way to limit CPU usage per user

From
Scott Marlowe
Date:
On Wed, 2006-02-08 at 15:42, Merlin Moncure wrote:
> > Is there a way to limit a user's CPU resources, especially on "SELECT" queries?
> >
> > I have a table with a lot of rows. If one sloppy DB user does a
> > "SELECT * FROM bigtable"
> > then CPU usage will go to nearly 99%, and this will surely affect
> > the performance of everything else on the database...
> >
> Yes, the answer is not to write the query in the first place :-). You
> can use cursors, do client-side browsing, or other techniques
> to handle this problem more elegantly.
>
> Try to follow the rule of thumb of returning the minimal amount of data
> necessary to the client.

Note that another useful tip here is to use Slony to produce as many
replicas as needed to handle that kind of thing.

We have our production pgsql machines in a Slony setup, with pg01 being
accessible by the application that inserts and updates the data, and all
reporting apps hitting pg02 and up to do selects and such.

Re: Is there a way to limit CPU usage per user

From
Luki Rustianto
Date:
So how can we terminate such a long-running query?

The idea is to set up a crontab job that periodically searches for a
typical "SELECT * FROM bigtable" query that has been running for some
hours and then terminates it...



On 2/9/06, Merlin Moncure <mmoncure@gmail.com> wrote:
> > Is there a way to limit a user's CPU resources, especially on "SELECT" queries?
> >
> > I have a table with a lot of rows. If one sloppy DB user does a
> > "SELECT * FROM bigtable"
> > then CPU usage will go to nearly 99%, and this will surely affect
> > the performance of everything else on the database...
> >
> Yes, the answer is not to write the query in the first place :-). You
> can use cursors, do client-side browsing, or other techniques
> to handle this problem more elegantly.
>
> Try to follow the rule of thumb of returning the minimal amount of data
> necessary to the client.
>
> Merlin
>

Re: Is there a way to limit CPU usage per user

From
Michael Fuhr
Date:
On Fri, Feb 10, 2006 at 11:30:04AM +0700, Luki Rustianto wrote:
> So how can we terminate such a long-running query?
>
> The idea is to set up a crontab job that periodically searches for a
> typical "SELECT * FROM bigtable" query that has been running for some
> hours and then terminates it...

Are you familiar with the statement_timeout setting?

test=> SET statement_timeout TO 1000; -- milliseconds
SET
test=> SELECT <some long-running query>;
ERROR:  canceling statement due to statement timeout
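
You can also make a timeout the default for a particular user, so only
the accounts that run ad-hoc queries get capped (the role name here is
made up):

test=> ALTER USER sloppy_user SET statement_timeout = 60000; -- one minute
ALTER ROLE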

If that won't work then please explain in general terms what problem
you're trying to solve, not how you're trying to solve it.

--
Michael Fuhr

Re: Is there a way to limit CPU usage per user

From
Tony Wasson
Date:
On 2/9/06, Michael Fuhr <mike@fuhr.org> wrote:
> On Fri, Feb 10, 2006 at 11:30:04AM +0700, Luki Rustianto wrote:
> > So how can we terminate such a long-running query?
> >
> > The idea is to set up a crontab job that periodically searches for a
> > typical "SELECT * FROM bigtable" query that has been running for some
> > hours and then terminates it...
>
> Are you familiar with the statement_timeout setting?
>
> test=> SET statement_timeout TO 1000; -- milliseconds
> SET
> test=> SELECT <some long-running query>;
> ERROR:  canceling statement due to statement timeout
>
> If that won't work then please explain in general terms what problem
> you're trying to solve, not how you're trying to solve it.

I am also interested in being able to terminate *certain* long-running
queries. I didn't want to use statement_timeout because there are some
queries that must run for a long time - in our case, some queries that
create summary tables. Other long-running queries should be killed. I
was able to get more granular by creating a "kill_pid" function in an
untrusted language and selectively killing ad-hoc queries. I'd suggest
having your non-killable queries run as one user.

That way you can do something like

SELECT * FROM pg_stat_activity
WHERE usename != 'some_special_user'
AND query_start < NOW() - INTERVAL '30 minutes';

And then if you were very brave - you could kill those queries off.

I may get flamed for this, but this is how I have killed errant
processes. I suspect you should pause for a long time before trying to
install plperlu.

CREATE FUNCTION kill_pid(INTEGER) RETURNS TEXT AS
$BODY$
  my ($pid) = @_;
  # system() returns the command's exit status ('0' means the signal was sent)
  my $out = system("kill -TERM $pid");
  return $out;
$BODY$ LANGUAGE plperlu;

REVOKE ALL ON FUNCTION kill_pid(INTEGER) FROM public;
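
To sweep them all in one go you could then run something like this from
cron (untested sketch; in 8.1 the backend pid column in pg_stat_activity
is procpid):

SELECT kill_pid(procpid)
FROM pg_stat_activity
WHERE usename != 'some_special_user'
AND query_start < NOW() - INTERVAL '30 minutes';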

Re: Is there a way to limit CPU usage per user

From
"Dave Page"
Date:

> -----Original Message-----
> From: pgsql-general-owner@postgresql.org
> [mailto:pgsql-general-owner@postgresql.org] On Behalf Of Tony Wasson
> Sent: 10 February 2006 15:31
> To: Michael Fuhr
> Cc: Luki Rustianto; Merlin Moncure; pgsql-general@postgresql.org
> Subject: Re: [GENERAL] Is there a way to limit CPU usage per user
>
> I may get flamed for this, but this is how I have killed errant
> processes. I suspect you should pause for a long time before trying to
> install plperlu.
>
> CREATE FUNCTION kill_pid(INTEGER) RETURNS TEXT AS
> $BODY$
>   my ($pid) = @_;
>   my $out=system("kill -TERM $pid");
>   return $out;
> $BODY$ language plperlu;
>
> REVOKE ALL ON FUNCTION kill_pid(INTEGER) FROM public;

Why not use pg_cancel_backend(pid) - (aside from it sending SIGINT
rather than SIGTERM)?
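
E.g., given a pid pulled from pg_stat_activity (12345 is made up):

SELECT pg_cancel_backend(12345); -- cancels that backend's current query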

Regards, Dave.

Re: Is there a way to limit CPU usage per user

From
Scott Marlowe
Date:
On Fri, 2006-02-10 at 09:31, Tony Wasson wrote:
> On 2/9/06, Michael Fuhr <mike@fuhr.org> wrote:
> > On Fri, Feb 10, 2006 at 11:30:04AM +0700, Luki Rustianto wrote:
> > > So how can we terminate such a long-running query?
> > >
> > > The idea is to set up a crontab job that periodically searches for a
> > > typical "SELECT * FROM bigtable" query that has been running for some
> > > hours and then terminates it...
> >
> > Are you familiar with the statement_timeout setting?
> >
> > test=> SET statement_timeout TO 1000; -- milliseconds
> > SET
> > test=> SELECT <some long-running query>;
> > ERROR:  canceling statement due to statement timeout
> >
> > If that won't work then please explain in general terms what problem
> > you're trying to solve, not how you're trying to solve it.
>
> I am also interested in being able to terminate *certain* long-running
> queries. I didn't want to use statement_timeout because there are some
> queries that must run for a long time - in our case, some queries that
> create summary tables. Other long-running queries should be killed. I
> was able to get more granular by creating a "kill_pid" function in an
> untrusted language and selectively killing ad-hoc queries. I'd suggest
> having your non-killable queries run as one user.

Could you set a global statement_timeout of, say, 30000 milliseconds,
and then have the connections that issue known long-running queries set
a higher statement_timeout of their own before running them?

Re: Is there a way to limit CPU usage per user

From
Tom Lane
Date:
Tony Wasson <ajwasson@gmail.com> writes:
> I am also interested in being able to terminate *certain* long-running
> queries. I didn't want to use statement_timeout because there are some
> queries that must run for a long time - in our case, some queries that
> create summary tables. Other long-running queries should be killed.

You do know that statement_timeout can be changed freely via SET,
right?  One way to attack this would be for the clients that are
issuing known long-running queries to do "SET statement_timeout"
to boost up their allowed runtime.
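
For instance:

SET statement_timeout = 0;  -- no limit for this session
SELECT <the known long-running query>;
RESET statement_timeout;    -- back to the configured default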

            regards, tom lane

Re: Is there a way to limit CPU usage per user

From
Csaba Nagy
Date:
> You do know that statement_timeout can be changed freely via SET,
> right?  One way to attack this would be for the clients that are
> issuing known long-running queries to do "SET statement_timeout"
> to boost up their allowed runtime.

How does this apply to autovacuum's long-running vacuum commands?
Because I have one table where the last vacuum took 15 hours :-)

It's true that after migrating to 8.1 I now do the vacuuming manually
for that table and have disabled it in autovacuum, but I could still
have other tables whose vacuums take more than 5 minutes (which would
be my statement_timeout for ordinary processes).

Cheers,
Csaba.

Re: Is there a way to limit CPU usage per user

From
Tom Lane
Date:
Csaba Nagy <nagy@ecircle-ag.com> writes:
> How does this apply to autovacuum's long-running vacuum commands?
> Because I have one table where the last vacuum took 15 hours :-)

[ Checks code... ]  No, statement_timeout only applies to commands
arriving from an interactive client.  This *would* be a hazard for
the contrib version of autovacuum, but not for the 8.1 integrated
version.

(If you're using contrib autovacuum, it'd be a good idea to do
"ALTER USER SET statement_timeout = 0" for the user it runs as.)

            regards, tom lane