Thread: Terminate query on page exit
Greetings,

I use PostgreSQL on a Linux server running in a virtual machine (despite my protest to IT personnel).
The client is typically a web server running PHP sites.

Customers often close a page when a heavy report runs for too long.
Using the command "top", I got the impression that the query keeps running even though the page was closed.

I want to make sure the query is dropped when the customer closes the page.
Is there any method to kill such a query, or will it still hang around out there?

Thanks,
David Harel
QIS LTD
david harel wrote:
> I use PostgreSQL on a Linux server running in a virtual machine (despite my protest to IT personnel).
> The client is typically a web server running PHP sites.
>
> Customers often close a page when a heavy report runs for too long.
> Using the command "top", I got the impression that the query keeps running even though the page was
> closed.

That is true.

> I want to make sure the query is dropped when the customer closes the page.
> Is there any method to kill such a query, or will it still hang around out there?

One crude method would be to set statement_timeout to a nonzero
value - then queries that run longer than that (the value is in
milliseconds by default) will be canceled.

Another option would be to use asynchronous query processing.
The challenge is of course to regularly poll the query status,
but if you can detect that the page is closed you could send a
cancel request (pg_cancel_query in PHP).

Yours,
Laurenz Albe
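A minimal PHP sketch of that asynchronous approach, assuming the pgsql extension: send the report with pg_send_query(), poll whether the browser is still there, and cancel with pg_cancel_query() if it is not. The connection parameters, the report query, and the polling interval are placeholders.

    <?php
    // Keep the script alive after the browser disconnects so we can clean up,
    // and lift PHP's own execution time limit for the long report.
    ignore_user_abort(true);
    set_time_limit(0);

    // Placeholder connection parameters and report query.
    $db = pg_connect('host=localhost dbname=reports user=webapp');
    pg_send_query($db, 'SELECT * FROM heavy_report');   // returns immediately

    while (pg_connection_busy($db)) {
        // connection_aborted() only notices a closed page after PHP has
        // actually tried to write to the client, hence the echo + flush.
        echo ' ';
        flush();

        if (connection_aborted()) {
            pg_cancel_query($db);   // ask the backend to cancel the running query
            exit;
        }
        usleep(250000);             // poll roughly four times a second
    }

    $result = pg_get_result($db);
    // ... render the report from $result ...

The echo/flush step is only there because PHP cannot see a dropped connection until it writes to it; if output buffering is enabled you would also need ob_flush(), and the stray whitespace has to be tolerable in the page being produced.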
On Tuesday, February 19, 2013, Albe Laurenz wrote:
> david harel wrote:
> > I use PostgreSQL on a Linux server running in a virtual machine (despite my
> > protest to IT personnel).
> > The client is typically a web server running PHP sites.
> >
> > Customers often close a page when a heavy report runs for too long.
> > Using the command "top", I got the impression that the query keeps running
> > even though the page was closed.
>
> That is true.
>
> > I want to make sure the query is dropped when the customer closes the page.
> > Is there any method to kill such a query, or will it still hang around out there?
>
> One crude method would be to set statement_timeout to a nonzero
> value - then queries that run longer than that (the value is in
> milliseconds by default) will be canceled.

you don't truly mean to advise that, do you? :)

--patrick
2013/2/19 patrick keshishian <pkeshish@gmail.com>:
> On Tuesday, February 19, 2013, Albe Laurenz wrote:
>> david harel wrote:
>> > Customers often close a page when a heavy report runs for too long.
>> > Using the command "top", I got the impression that the query keeps running
>> > even though the page was closed.
>>
>> That is true.
>>
>> > I want to make sure the query is dropped when the customer closes the page.
>> > Is there any method to kill such a query, or will it still hang around out there?
>>
>> One crude method would be to set statement_timeout to a nonzero
>> value - then queries that run longer than that (the value is in
>> milliseconds by default) will be canceled.
>
> you don't truly mean to advise that, do you? :)

It is not bad advice - usually all long-running queries should be cancelled
by a timeout, and a timeout is the simplest and sometimes good-enough
solution. You can set the timeout just for the account the web application
uses to log in.

But there is no simple complete solution - a complete one needs AJAX,
callbacks, ...

Regards

Pavel Stehule

> --patrick
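A sketch of scoping the timeout to just the web application's account, in PHP. The role name "webapp", the connection details, and the ten-minute value are placeholders; statement_timeout is in milliseconds unless an explicit unit is given.

    <?php
    // Option 1: set the timeout once, server-side, for the web application's role.
    // Must be run as a role with sufficient privileges to alter "webapp".
    $admin = pg_connect('host=localhost dbname=reports user=postgres');
    pg_query($admin, "ALTER ROLE webapp SET statement_timeout = '10min'");

    // Option 2: have the application request it per session at connection time,
    // via the libpq "options" connection parameter.
    $db = pg_connect("host=localhost dbname=reports user=webapp options='-c statement_timeout=10min'");

    // Option 3: a plain one-off after connecting.
    pg_query($db, "SET statement_timeout = '10min'");

The role-level setting (option 1) is the least intrusive: other database users keep the default, and the application code does not have to change at all.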
patrick keshishian wrote:
>>> Is there any method to kill such a query, or will it still hang around out there?
>>
>> One crude method would be to set statement_timeout to a nonzero
>> value - then queries that run longer than that (the value is in
>> milliseconds by default) will be canceled.
>
> you don't truly mean to advise that, do you? :)

Well, it might be useful as a kind of safeguard against runaway
queries that hog the server for hours.

If you expect that no reasonable query will take more than a couple
of minutes, you could set statement_timeout = '10min' to kill
anything that's still alive after 10 minutes.

Yours,
Laurenz Albe
On Tue, Feb 19, 2013 at 4:38 AM, Pavel Stehule <pavel.stehule@gmail.com> wrote:
> 2013/2/19 patrick keshishian <pkeshish@gmail.com>:
>> On Tuesday, February 19, 2013, Albe Laurenz wrote:
>>>
>>> One crude method would be to set statement_timeout to a nonzero
>>> value - then queries that run longer than that (the value is in
>>> milliseconds by default) will be canceled.
>>
>> you don't truly mean to advise that, do you? :)
>
> It is not bad advice - usually all long-running queries should be cancelled
> by a timeout, and a timeout is the simplest and sometimes good-enough
> solution. You can set the timeout just for the account the web application
> uses to log in.

It would be nice if a long-running query could occasionally check
whether it still has somewhere to send the results it is computing,
rather than running for hours only to report "could not send data to
client: Broken pipe" as soon as the first row becomes available.
client_alive_timeout?

Cheers,

Jeff
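There is no server parameter of that name; the closest existing knobs are the TCP keepalive settings, which at least let the operating system declare a vanished client connection dead, though the backend still only notices when it next reads from or writes to the socket. A sketch of setting them per session from PHP, with placeholder values; they only apply to TCP connections, not Unix-domain sockets.

    <?php
    // Placeholder connection; the keepalive values below are examples only.
    $db = pg_connect('host=localhost dbname=reports user=webapp');
    pg_query($db, "SET tcp_keepalives_idle = 60");      // start probing after 60 s of idle traffic
    pg_query($db, "SET tcp_keepalives_interval = 10");  // probe every 10 s
    pg_query($db, "SET tcp_keepalives_count = 3");      // give up after 3 lost probes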
2013/2/19 Jeff Janes <jeff.janes@gmail.com>:
> On Tue, Feb 19, 2013 at 4:38 AM, Pavel Stehule <pavel.stehule@gmail.com> wrote:
>> 2013/2/19 patrick keshishian <pkeshish@gmail.com>:
>>> On Tuesday, February 19, 2013, Albe Laurenz wrote:
>>>>
>>>> One crude method would be to set statement_timeout to a nonzero
>>>> value - then queries that run longer than that (the value is in
>>>> milliseconds by default) will be canceled.
>>>
>>> you don't truly mean to advise that, do you? :)
>>
>> It is not bad advice - usually all long-running queries should be cancelled
>> by a timeout, and a timeout is the simplest and sometimes good-enough
>> solution. You can set the timeout just for the account the web application
>> uses to log in.
>
> It would be nice if a long-running query could occasionally check
> whether it still has somewhere to send the results it is computing,
> rather than running for hours only to report "could not send data to
> client: Broken pipe" as soon as the first row becomes available.
> client_alive_timeout?

It is not a bad idea.

Regards

Pavel

>
> Cheers,
>
> Jeff