Re: 100 simultaneous connections, critical limit? - Mailing list pgsql-performance

From Nick Barr
Subject Re: 100 simultaneous connections, critical limit?
Date
Msg-id 8F4A22E017460A458DB7BBAB65CA6AE502AA45@openmanage
In response to 100 simultaneous connections, critical limit?  (Jón Ragnarsson <jonr@physicallink.com>)
List pgsql-performance
> -----Original Message-----
> From: pgsql-performance-owner@postgresql.org
> [mailto:pgsql-performance-owner@postgresql.org] On Behalf Of Jón Ragnarsson
> Sent: 14 January 2004 13:44
> Cc: pgsql-performance@postgresql.org
> Subject: Re: [PERFORM] 100 simultaneous connections, critical limit?
>
> Ok, connection pooling was the thing that I thought of first, but I
> haven't found any docs regarding pooling with PHP+Postgres.
> OTOH, I designed the application to be as independent from the DB as
> possible. (No stored procedures or other Postgres specific stuff)
> Thanks,
> J.

As far as I know, PHP supports persistent connections to a PG database:
see pg_pconnect instead of pg_connect. Each persistent connection is
tied to a particular Apache process and stays open for the life of that
process. So make sure your Apache config file (httpd.conf) and your PG
config file (postgresql.conf) agree on the maximum number of
connections, otherwise some pages will not be able to connect to your
database.
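
For example, a minimal sketch (the connection parameters and the config
values mentioned in the comments are placeholders, not from this thread):

    <?php
    // pg_pconnect() reuses a connection this Apache process opened
    // earlier with the same connection string, or opens a new one if
    // none exists. pg_close() on a persistent connection does not
    // really close it; it stays open until the Apache process exits.
    $conn = pg_pconnect("host=localhost dbname=mydb user=web password=secret");
    if (!$conn) {
        die("Could not connect to database");
    }

    // Keep Apache's MaxClients (httpd.conf) at or below PostgreSQL's
    // max_connections (postgresql.conf), since each Apache process may
    // end up holding a persistent connection of its own.
    $result = pg_query($conn, "SELECT 1");
    ?>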

This may not be a problem for small sites, but it is on large sites with
heavy load and a large number of concurrent users. For example, consider
a site that must support 500 concurrent users: with persistent
connections, at least 500 concurrent connections to PG would be
required, which I guess is probably not recommended.

The way I would like Apache/PHP to work is to have a global pool of
connections to a postgres server, shared across all Apache processes.
This pool could be limited to say 50 or 100 connections. Under peak
load, when all of the pool's connections are in use, the only penalty
should be a bit of a delay while requests wait for a free connection.
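
PHP itself cannot share a pool across Apache processes, but an external
pooler such as pgpool can sit between Apache and PG and do roughly this.
A sketch under assumed settings (the port 9999 and the pool size of 50
are placeholders for a local pooler setup, not something anyone in this
thread has actually configured):

    <?php
    // Hypothetical setup: an external pooler (e.g. pgpool) listens on
    // localhost:9999 and multiplexes a fixed pool of ~50 real backend
    // connections to PostgreSQL on port 5432. The PHP code only changes
    // the port it connects to; when all pooled connections are busy,
    // the pooler queues the request, so peak load means a short wait
    // rather than a refused connection.
    $conn = pg_connect("host=localhost port=9999 dbname=mydb user=web");
    if (!$conn) {
        die("Could not connect via the pooler");
    }
    ?>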

Hope that (almost) makes sense,


Kind Regards,

Nick Barr
WebBased Ltd.


> Christopher Browne wrote:
>
> > Clinging to sanity, jonr@physicallink.com (Jón Ragnarsson) mumbled
> > into her beard:
> >
> >>I am writing a website that will probably have some traffic.
> >>Right now I wrap every .php page in pg_connect() and pg_close().
> >>Then I read somewhere that Postgres only supports 100 simultaneous
> >>connections (default). Is that a limitation? Should I use some other
> >>method when writing code for high-traffic website?
> >
> >
> > I thought the out-of-the-box default was 32.
> >
> > If you honestly need a LOT of connections, you can configure the
> > database to support more.  I "upped the limit" on one system to have
> > 512 the other week; certainly supportable, if you have the RAM for it.
> >
> > It is, however, quite likely that the connect()/close() cuts down on
> > the efficiency of your application.  If PHP supports some form of
> > "connection pooling," you should consider using that, as it will cut
> > down _dramatically_ on the amount of work done establishing/closing
> > connections, and should let your apps use somewhat fewer connections
> > more effectively.


