Thread: Ran out of connections

Ran out of connections

From
Mike Diehl
Date:
Hi all.

Twice this week, I've come to work to find my Postgres server out of 
connections... effectively freezing my web server.

Today, before I rebooted the entire system, I did a ps -auxw and kept the 
file to study.  I didn't find too many clients running.  But I did find a 
whole LOT of postgres processes running, idle.  BTW, one of the postgres 
processes was doing a vacuum analyze.  I'm running 7.2.

Can anyone tell me how to fix this?  The output of the ps command can be 
seen at http://dominion.dyndns.org/~mdiehl/ps.txt

Thanx in advance,
-- 
Mike Diehl
Network Tools Devl.
SAIC at Sandia Labs
(505) 284-3137


Re: Ran out of connections

From
Roberto Mello
Date:
On Wed, Dec 04, 2002 at 03:08:35PM -0700, Mike Diehl wrote:
> Hi all.
> 
> Twice this week, I've come to work to find my Postgres server out of 
> connections... effectively freezing my web server.
> 
> Today, before I rebooted the entire system, I did a ps -auxw and kept the 
> file to study.  I didn't find too many clients running.  But I did find a 
> whole LOT of postgres processes running, idle.  BTW, one of the postgres 
> processes was doing a vacuum analyze.  I'm running 7.2.
> 
> Can anyone tell me how to fix this?  The output of the ps command can be 
> seen at http://dominion.dyndns.org/~mdiehl/ps.txt

Are you using PHP by chance? I've seen this behavior under Apache+PHP
before. My "fix" (workaround rather) was to disable persistent
connections.
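
For reference, persistent connections can be switched off globally in php.ini (this assumes the stock pgsql extension; with the setting off, `pg_pconnect()` calls simply behave like plain `pg_connect()`):

```ini
; php.ini -- disable persistent PostgreSQL connections
pgsql.allow_persistent = Off
; or, instead of disabling outright, cap how many each process may hold
pgsql.max_persistent = 20
```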

-Roberto

-- 
+----|        Roberto Mello   -    http://www.brasileiro.net/  |------+
+       Computer Science Graduate Student, Utah State University      +
+       USU Free Software & GNU/Linux Club - http://fslc.usu.edu/     +
:) :D :O :( :[ ;) 8) B) :> |I :P =) :S :B :] :\


Re: Ran out of connections

From
Steve Crawford
Date:
You probably didn't need to reboot - restarting PostgreSQL and Apache (the 
quick version) or simply killing the extra postgres processes would likely 
have been enough.
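
As a rough sketch of the kill-instead-of-reboot route: assuming the idle backends advertise themselves with "idle" in their ps titles (as postgres backends normally do on Linux), something like this lists the candidate PIDs:

```shell
# Print the PID of each idle PostgreSQL backend so it can be inspected
# (or sent SIGTERM) rather than rebooting the whole machine.
ps auxww | awk '/postgres:.*idle/ && !/awk/ { print $2 }'
```

The output can be fed to kill by hand once you've confirmed the processes really are stranded.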

I suspect you need to look carefully at your code and method of connecting 
(i.e. are you using mod_perl, plain old CGI Perl, PHP, or what?). A problem 
with Apache 1.x is connection pooling/persistence. If you don't use 
persistent connections you pay the overhead of opening and closing the 
database connection on every request. You can use mod_perl and keep your 
connections open, but if the pool of web processes holding open connections 
exceeds the number of connections PostgreSQL allows, you will have problems 
(pooling database connections between processes is problematic, so each 
Apache process holds its own connection).

Be sure there isn't a bug causing a CGI to abort and leave a stranded 
connection. I don't have experience with local Unix-socket connections where 
the client has died, but when windoze users reboot while holding an open 
connection, the backend waits until the TCP/IP connection times out 
(frequently an hour) before the PostgreSQL connection is closed. If the 
timeout is similar for local Unix sockets, then a failure in the CGI could 
leave open connections and you will run out quickly.

If you are doing lots of database backed work you may want to check out 
AOLserver (http://www.aolserver.com/). It has a multi-threaded architecture 
featuring connection pooling and persistence "out of the box." Oh, it's 
free/open-source as well.

Of course you can also get pooling/persistence with enterprise Java solutions 
such as JBoss (www.jboss.org).

Cheers,
Steve


On Wednesday 04 December 2002 2:08 pm, Mike Diehl wrote:
> Hi all.
>
> Twice this week, I've come to work to find my Postgres server out of
> connections... effectively freezing my web server.
>
> Today, before I rebooted the entire system, I did a ps -auxw and kept the
> file to study.  I didn't find too many clients running.  But I did find a
> whole LOT of postgres processes running, idle.  BTW, one of the postgres
> processes was doing a vacuum analyze.  I'm running 7.2.
>
> Can anyone tell me how to fix this?  The output of the ps command can be
> seen at http://dominion.dyndns.org/~mdiehl/ps.txt
>
> Thanx in advance,


Re: Ran out of connections

From
Robert Treat
Date:
Once you're done scoping other things out, you might also want to look at
increasing the number of allowed connections (in postgresql.conf). The
defaults can be low for high-traffic systems.
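
For 7.2 that's the max_connections setting; a minimal sketch (the value here is only an example, and since every backend consumes shared memory, the kernel's SHMMAX limit may need raising along with it):

```ini
# postgresql.conf -- allow more concurrent backends
max_connections = 128
```

A restart of the postmaster is needed for the change to take effect.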

Robert Treat

On Wed, 2002-12-04 at 17:29, Steve Crawford wrote:
> You probably didn't need to reboot - restarting PostgreSQL and Apache (the 
> quick version) or simply killing the extra postgres processes would likely 
> have been enough.
> 
<snip lots of good suggestions>
> 
> Cheers,
> Steve
> 
> 
> On Wednesday 04 December 2002 2:08 pm, Mike Diehl wrote:
> > Hi all.
> >
> > Twice this week, I've come to work to find my Postgres server out of
> > connections... effectively freezing my web server.
> >
<snip>
> >
> > Can anyone tell me how to fix this?  The output of the ps command can be
> > seen at http://dominion.dyndns.org/~mdiehl/ps.txt
> >
> > Thanx in advance,
> 


Re: Ran out of connections

From
Mike Diehl
Date:
On Wednesday 04 December 2002 03:25 pm, Roberto Mello wrote:
> On Wed, Dec 04, 2002 at 03:08:35PM -0700, Mike Diehl wrote:
> > Can anyone tell me how to fix this?  The output of the ps command
> > can be seen at http://dominion.dyndns.org/~mdiehl/ps.txt
>
> Are you using PHP by chance? I've seen this behavior under Apache+PHP
> before. My "fix" (workaround rather) was to disable persistent
> connections.

Nope.  I'm using Perl and CGI.  I've got some Perl that runs via cron, and 
some more that runs via Apache.  I'm not even using ModPerl.

It did occur to me that since some of my scripts communicate with other 
devices, that I may have some IO blocking, or zombies, but the ps output 
didn't indicate that.  I can't see that many scripts running.  Usually, I see 
one postgres process for each script/cgi that is running.  Not in this case.
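
For what it's worth, defunct children show a Z in the STAT column, so a quick check along these lines would confirm or rule out zombies (field positions assume Linux-style "ps auxww" output, where STAT is the 8th column):

```shell
# Print the PID and command of any zombie (defunct) processes; a CGI
# that forks helpers and never reaps them would show up here.
ps auxww | awk '$8 ~ /Z/ { print $2, $11 }'
```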

-- 
Mike Diehl
Network Tools Devl.
SAIC at Sandia Labs
(505) 284-3137


Re: Ran out of connections

From
Roberto Mello
Date:
On Wed, Dec 04, 2002 at 02:29:25PM -0800, Steve Crawford wrote:
> 
> If you are doing lots of database backed work you may want to check out 
> AOLserver (http://www.aolserver.com/). It has a multi-threaded architecture 
> featuring connection pooling and persistence "out of the box." Oh, it's 
> free/open-source as well.

I second that suggestion. Having used AOLserver for the past few years,
it's a very nice application/web server with superb database support.

-Roberto

-- 
+----|        Roberto Mello   -    http://www.brasileiro.net/  |------+
+       Computer Science Graduate Student, Utah State University      +
+       USU Free Software & GNU/Linux Club - http://fslc.usu.edu/     +
Itsdifficulttobeverycreativewithonlyfiftysevencharacters!


Re: Ran out of connections

From
Steve Crawford
Date:
Doing anything unusual? Forking processes, opening multiple connections 
within a single CGI?

Have you seen any evidence that a process that opens a connection is failing 
to complete normally?

-Steve


On Wednesday 04 December 2002 3:52 pm, Mike Diehl wrote:
> On Wednesday 04 December 2002 03:25 pm, Roberto Mello wrote:
>      > On Wed, Dec 04, 2002 at 03:08:35PM -0700, Mike Diehl wrote:
>      > > Can anyone tell me how to fix this?  The output of the ps command
>      > > can be seen at http://dominion.dyndns.org/~mdiehl/ps.txt
>      >
>      > Are you using PHP by chance? I've seen this behavior under
>      > Apache+PHP before. My "fix" (workaround rather) was to disable
>      > persistent connections.
>
> Nope.  I'm using Perl and CGI.  I've got some Perl that runs via cron, and
> some more that runs via Apache.  I'm not even using ModPerl.
>
> It did occur to me that since some of my scripts communicate with other
> devices, that I may have some IO blocking, or zombies, but the ps output
> didn't indicate that.  I can't see that many scripts running.  Usually, I
> see one postgres process for each script/cgi that is running.  Not in this
> case.