Thread: SQL Load Balancer for PostgreSQL
Here is a new project that some people might be interested in. This project now supports Oracle8i and PostgreSQL. There is a PHP module as well.

SQL Load Balancer (SQLB)
http://sqlb.sourceforge.net/

The SQLB project is used to improve SQL requests to a database. It provides programs that have permanent connections to a DB, a program that checks the integrity of transactions, and a library to link with programs that need to make SQL queries.

Brent
> Here is a new project that some people might be interested
> in. This project now supports Oracle8i and PostgreSQL.
> There is a PHP module as well.
>
> SQL Load Balancer (SQLB)
> http://sqlb.sourceforge.net/
>
> The SQLB project is used to improve SQL requests to a
> database. It provides programs that have permanent
> connections to a DB, a program that checks the integrity of
> transactions, and a library to link with programs that need
> to make SQL queries.

FYI, 7.2 will have a RESET ALL; command that is useful for passing persistent connections, and we use "BEGIN;COMMIT;" to cancel any open transaction.

--
  Bruce Momjian                        |  http://candle.pha.pa.us
  pgman@candle.pha.pa.us               |  (610) 853-3000
  +  If your life is a hard drive,     |  830 Blythe Avenue
  +  Christ can be your backup.        |  Drexel Hill, Pennsylvania 19026
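For anyone wiring that reset into their own pooling code: a minimal sketch of the sequence issued from PHP, assuming PostgreSQL 7.2+ for RESET ALL (the connection string is invented, and this is not SQLB's own API):

<?php
// Minimal sketch: clear leftover state on a reused persistent
// connection. Assumes PostgreSQL 7.2+ for RESET ALL.
$conn = pg_pconnect("host=localhost dbname=mydb user=myuser");

// BEGIN;COMMIT; closes any transaction a previous script left open
// (a COMMIT outside a transaction only raises a notice).
pg_exec($conn, "BEGIN; COMMIT;");

// Put every SET parameter back to its default for the next user of
// this connection.
pg_exec($conn, "RESET ALL;");
?>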
New to the PHP/PostgreSQL combo. I have a couple of queries which take a few minutes to run and we are getting timeouts. Didn't find anything useful in the archives.

How are you handling long running queries in your PHP?
set_time_limit( <secs> );

At 11:48 AM 1/19/02 -0500, Francisco Reyes wrote:
> New to the PHP/PostgreSQL combo.
> I have a couple of queries which take a few minutes to run and we are
> getting timeouts. Didn't find anything useful in the archives.
>
> How are you handling long running queries in your PHP?
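For the archives, a minimal sketch of that call in context; the 300-second limit, connection string, and query are arbitrary placeholders:

<?php
// Give this one script up to 300 seconds instead of the php.ini
// default (usually 30). Note: set_time_limit() has no effect when
// safe mode is on.
set_time_limit(300);

$conn   = pg_connect("dbname=mydb user=myuser");            // invented DSN
$result = pg_exec($conn, "SELECT count(*) FROM big_table"); // the slow query
$row    = pg_fetch_row($result, 0);
echo $row[0];
?>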
On Sat, 19 Jan 2002, Francisco Reyes wrote:
> New to the PHP/PostgreSQL combo.
> I have a couple of queries which take a few minutes to run and we are
> getting timeouts. Didn't find anything useful in the archives.
>
> How are you handling long running queries in your PHP?

You can always use persistent connections. It's not a good idea to use too many, or you will be denied access to Postgres any other way, but it's a quick fix for the short run.

Long run and responsibility includes using an array, or an object to retrieve data. You can always split queries up into many small queries. We'll need a little more information to help you with that, though. Like what the code is... what your php.ini file and httpd.conf file say about connection timeouts... maybe more, like what you've read in php.net's docs...

When new programmers here where I work started using php/postgresql 6 months ago, they used pconnect in EVERY one of their scripts, filled up the server, and were unable to connect. No one was able to. They didn't know what they had done, and it took a little reading to figure out that Apache needed to be restarted, NOT postgres. Don't kill -9 the postmaster.

Regards,
Chadwick Rolfs - cmr@gis.net
Cleveland State University - Student Music Major
The Holden Arboretum Volunteer Computer Programmer - Student Employee
--*I finally found powdered water; I just can't figure out what to add to it*--
http://www.php.net/manual/en/function.set-time-limit.php will allow you to set the timeout, but you might post the queries and EXPLAIN output to pgsql-general to see if there is something that can be done to speed them up... Some well placed indexes always help!

> You can always use persistent connections. It's not a good idea to use
> too many, or you will be denied access to Postgres any other way, but it's
> a quick fix for the short run.

The maximum number of connections is a configuration parameter in postgresql.conf -- persistent connections are pretty cool but were broken until recent versions of PHP, so I'm still hesitant to use them..

> Long run and responsibility includes using an array, or an object to
> retrieve data.

Huh?

> You can always split queries up into many small queries.

What makes you say that?

-Mitch
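To capture that EXPLAIN output from PHP rather than psql, something like the sketch below should do. It assumes PostgreSQL 7.2+, where EXPLAIN returns its plan as ordinary result rows (older versions deliver it as a NOTICE), and the table, column, and index names are invented:

<?php
// Fetch the planner's output for a slow query so it can be posted to
// the list. Assumes PostgreSQL 7.2+; all names here are invented.
$conn = pg_connect("dbname=mydb user=myuser");
$res  = pg_exec($conn, "EXPLAIN SELECT * FROM orders WHERE customer_id = 42;");
for ($i = 0; $i < pg_numrows($res); $i++) {
    $row = pg_fetch_row($res, $i);
    echo $row[0] . "\n";          // one line of the plan per row
}

// If the plan shows a sequential scan on a big table, a well placed
// index on the filtered column often helps:
pg_exec($conn, "CREATE INDEX orders_customer_id_idx ON orders (customer_id);");
?>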
Hi all

> http://www.php.net/manual/en/function.set-time-limit.php will allow you to
> set the timeout

This only works if safe mode is off.

If you have a couple of queries in one script you can divide them up onto different pages. Put a redirect at the end of each query; on the new page the execution time starts fresh. Normally it is set to 30 seconds in the php.ini file.

Or have a look at the function ini_set():

string ini_set (string varname, string newvalue)

(I have never tested this function ;-)

Greetings
Conni
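Untested here as well, but a sketch of what that would look like for the execution limit; the value is arbitrary, and like set_time_limit() it is ignored when safe mode is on:

<?php
// Raise this script's execution limit to 300 seconds at runtime;
// equivalent to calling set_time_limit(300).
ini_set('max_execution_time', '300');
?>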
> If you have a couple of queries in one script you can divide them
> up onto different pages.

I have a page that uses many queries to gather a bunch of stats that are saved for later display. It is not too important that they all be sampled at exactly the same time, so every Nth request (or after a timestamp expires) one of them is updated, and the next case to update is saved in the row. It adds a bit of overhead, but I don't have to worry as the db grows, and I can easily add more cases.

Jeff
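A rough sketch of that round-robin idea, with every table and column name invented, might look like:

<?php
// Rough sketch: a stats row remembers which expensive aggregate to
// recompute next, so each page view refreshes at most one of them.
$conn = pg_connect("dbname=mydb user=myuser");

$res  = pg_exec($conn, "SELECT next_case FROM site_stats WHERE id = 1;");
$row  = pg_fetch_row($res, 0);
$case = (int) $row[0];

$cases = array(
    "UPDATE site_stats SET user_count  = (SELECT count(*) FROM users)  WHERE id = 1",
    "UPDATE site_stats SET order_count = (SELECT count(*) FROM orders) WHERE id = 1",
);
pg_exec($conn, $cases[$case]);

// Advance the pointer so the next request refreshes the next stat.
$next = ($case + 1) % count($cases);
pg_exec($conn, "UPDATE site_stats SET next_case = $next WHERE id = 1;");
?>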
On Sat, 19 Jan 2002, Frank Bax wrote:
> set_time_limit( <secs> );
>
> At 11:48 AM 1/19/02 -0500, Francisco Reyes wrote:
> > How are you handling long running queries in your PHP?

I just read the explanation at the php.net site for that function and it doesn't seem to do what I would need:

"Any time spent on activity that happens outside the execution of the script such as ... database queries, etc. is not included when determining the maximum time that the script has been running."

I am going to try it anyway to see if it helps.
On Sat, 19 Jan 2002, Chadwick Rolfs wrote:
> On Sat, 19 Jan 2002, Francisco Reyes wrote:
> > New to the PHP/PostgreSQL combo.
> > I have a couple of queries which take a few minutes to run and we are
> > getting timeouts. Didn't find anything useful in the archives.
> >
> > How are you handling long running queries in your PHP?
>
> You can always use persistent connections. It's not a good idea to use
> too many, or you will be denied access to Postgres any other way, but it's
> a quick fix for the short run.
>
> Long run and responsibility includes using an array, or an object to
> retrieve data. You can always split queries up into many small queries.
> We'll need a little more information to help you with that, though. Like
> what the code is... what your php.ini file and httpd.conf file say about
> connection timeouts... maybe more, like what you've read in php.net's
> docs...

Thanks for the info/pointers.

The issue is basically a query that takes several minutes to run. I read about persistent connections and I don't think they have anything to do with the problem. I believe the timeout is probably at the web server.

After someone suggested I look at "set_time_limit", I found that that function doesn't account for DB connection time. After reading that, and your suggestion to read httpd.conf, I am starting to think that the likely place of the timeout is the web server. Will play with the settings in httpd.conf.

I wonder how sites that display an update while a long query is running do it. I have seen sites that display something like "retrieving info.. please wait" and then, right before the info comes up, the waiting text disappears. Is that Javascript? Flash? I can't imagine how that could be done with plain HTML.
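For what it's worth, that effect needs no Flash: send a placeholder, flush it to the browser, run the slow query, then hide the placeholder with one line of JavaScript. A rough sketch, with connection details and query invented (depending on output buffering you may also need ob_flush()):

<?php
echo "<html><body>";
echo "<div id='wait'>retrieving info.. please wait</div>";
flush();   // push the partial page to the client before the slow part

$conn = pg_connect("dbname=mydb user=myuser");
$res  = pg_exec($conn, "SELECT count(*) FROM big_table");  // slow query
$row  = pg_fetch_row($res, 0);

// Hide the waiting text now that the results have arrived.
echo "<script>document.getElementById('wait').style.display='none';</script>";
echo "<p>Rows: " . $row[0] . "</p>";
echo "</body></html>";
?>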
On Sunday 20 January 2002 16:56, Francisco Reyes wrote:
> On Sat, 19 Jan 2002, Chadwick Rolfs wrote:
> > On Sat, 19 Jan 2002, Francisco Reyes wrote:
> > > New to the PHP/PostgreSQL combo.
> > > I have a couple of queries which take a few minutes to run and we are
> > > getting timeouts. Didn't find anything useful in the archives.
> > >
> > > How are you handling long running queries in your PHP?
> [...]
> After someone suggested I look at "set_time_limit", I found that that
> function doesn't account for DB connection time. After reading that, and
> your suggestion to read httpd.conf, I am starting to think that the likely
> place of the timeout is the web server.
>
> Will play with the settings in httpd.conf.
>
> I wonder how sites that display an update while a long query is running
> do it. I have seen sites that display something like "retrieving info..
> please wait" and then, right before the info comes up, the waiting text
> disappears. Is that Javascript? Flash? I can't imagine how that could be
> done with plain HTML.

What timeout message do you get, exactly? It could be just the *client*, or an HTTP proxy, that times out if your script doesn't produce any output, and *not* the httpd process (thus, no need to look at httpd.conf).

.TM.
On Sat, 19 Jan 2002, Mitch Vincent wrote:
> http://www.php.net/manual/en/function.set-time-limit.php will allow you to
> set the timeout, but you might post the queries and EXPLAIN output to
> pgsql-general to see if there is something that can be done to speed them
> up... Some well placed indexes always help!
>
> > You can always use persistent connections. It's not a good idea to use
> > too many, or you will be denied access to Postgres any other way, but it's
> > a quick fix for the short run.
>
> The maximum number of connections is a configuration parameter in
> postgresql.conf -- persistent connections are pretty cool but were broken
> until recent versions of PHP, so I'm still hesitant to use them..

It's good to see someone else sees problems with persistent connections. You could set your maximum connections to infinite, but would that really do any good? It seems the server might easily run out of memory if there isn't a cap on connection numbers, as well as connection time.

> > Long run and responsibility includes using an array, or an object to
> > retrieve data.
>
> Huh?

I can see why this doesn't make much sense. Seeing as we have nothing to work with from the original post regarding just what the query is, I'll try to give an example to explain.

Querying all the names in a phone book DB:

select first_name, last_name from white_pages;

This will be, let's say, 150,000 entries (a small city phone book). You could instead select by starting letter and put each result into an array, using pg_fetch_array, or an object, using pg_fetch_object. Make multiple queries in the same script and use that array (or object) to output to the browser:

select first_name, last_name from white_pages where last_name like 'A%';
select first_name, last_name from white_pages where last_name like 'B%';
select first_name, last_name from white_pages where last_name like 'C%';
select first_name, last_name from white_pages where last_name like 'D%';
...etc

If it's still too long, try 'Aa%', 'Ab%', and so on. Tedious, but efficient. I know this is only the SQL; there is PHP involved. We can save that for the PHP list, or for any further questions. Hope this clarifies at least a little bit.

> > You can always split queries up into many small queries.
>
> What makes you say that?

If you have your RDBMS set up right, you have a unique number (primary key) relating to each row. So using the SQL syntax above, you could select one row at a time, if you needed to, using a for loop and sending single queries to the backend. You shouldn't be flooding a web browser with over 30 seconds of data anyway, as you run the risk of a crash.

Chadwick Rolfs - cmr@gis.net
Cleveland State University - Student Music Major
The Holden Arboretum Volunteer Computer Programmer - Student Employee
--*I finally found powdered water; I just can't figure out what to add to it*--
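The PHP side of that chunking could look roughly like this: one query per starting letter, each chunk read with pg_fetch_array and flushed to the browser before the next one runs (connection details are invented; white_pages is the example table above):

<?php
// One query per starting letter; push each chunk to the browser
// before fetching the next.
$conn = pg_connect("dbname=phonebook user=myuser");

foreach (range('A', 'Z') as $letter) {
    $res = pg_exec($conn,
        "SELECT first_name, last_name FROM white_pages " .
        "WHERE last_name LIKE '$letter%' ORDER BY last_name;");

    for ($i = 0; $i < pg_numrows($res); $i++) {
        $row = pg_fetch_array($res, $i);
        echo $row['last_name'] . ", " . $row['first_name'] . "<br>\n";
    }
    flush();   // send this chunk to the browser now
}
?>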
> I can see why this doesn't make much sense. Seeing as we have nothing to
> work with from the original post regarding just what the query is, I'll
> try to give an example to explain.
>
> Querying all the names in a phone book DB:
>
> select first_name, last_name from white_pages;
>
> [...]
>
> If it's still too long, try 'Aa%', 'Ab%', and so on. Tedious, but
> efficient.

You could always just use CURSORs. Look up the 'DECLARE', 'MOVE', and 'FETCH' SQL commands in Postgres.

Chris
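A rough sketch of the cursor approach from PHP, reusing the white_pages example from earlier in the thread; the batch size of 1000 and the connection details are arbitrary, and cursors only exist inside a transaction:

<?php
// Stream a big result set through a cursor instead of reading it all
// at once. DECLARE/FETCH/CLOSE must run inside one transaction.
$conn = pg_connect("dbname=phonebook user=myuser");

pg_exec($conn, "BEGIN;");
pg_exec($conn, "DECLARE names_cur CURSOR FOR
                    SELECT first_name, last_name FROM white_pages
                    ORDER BY last_name;");

while (true) {
    $res = pg_exec($conn, "FETCH 1000 FROM names_cur;");
    if (pg_numrows($res) == 0) {
        break;        // FETCH returned no rows: cursor is exhausted
    }
    for ($i = 0; $i < pg_numrows($res); $i++) {
        $row = pg_fetch_array($res, $i);
        echo $row['last_name'] . ", " . $row['first_name'] . "<br>\n";
    }
    flush();
}

pg_exec($conn, "CLOSE names_cur;");
pg_exec($conn, "COMMIT;");
?>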