Hi guys,
Peter is definitely not a newbie on this list, so I'm sure he has already
thought about some kind of pooling where applicable... but then I'm
dead curious: what kind of application could possibly rule out connection
pooling, even if it means so many open connections? Please shed some
light on this, Peter...
Cheers,
Csaba.
On Mon, 2004-08-16 at 15:53, Michal Taborsky wrote:
> Peter Eisentraut wrote:
> > Is there any practical limit on the number of parallel connections that a
> > PostgreSQL server can service? We're in the process of setting up a system
> > that will require up to 10000 connections open in parallel. The query load
> > is not the problem, but we're wondering about the number of connections.
> > Does anyone have experience with these kinds of numbers?
>
> No experience, but a little thinking and some elementary-school math tells
> me that you'd need a huge amount of RAM to support 10000 connections,
> since postgres is multi-process. Our typical postgres process eats 5-40
> megs of memory, depending on activity. So even if it were just 5 megs,
> with 10k connections we are talking about 50 GB of RAM. If these
> connections are mostly idle, that is a plain waste of resources.
>
> I suggest you look into some sort of connection pooling.
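Michal's back-of-the-envelope estimate checks out; as a quick sanity check (the 5 megs/process figure is just the low end he quotes, and real per-backend memory varies with activity):

```python
# Rough memory estimate for many idle postgres backends,
# using the low-end 5 MB/process figure from the thread.
per_backend_mb = 5           # assumed low-end resident size per backend
connections = 10_000
total_gb = per_backend_mb * connections / 1024
print(f"~{total_gb:.1f} GB of RAM")   # ~48.8 GB, i.e. roughly 50 GB
```

At the 40 megs upper end the same arithmetic lands near 400 GB, which is why the idle-connection waste matters so much.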
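For anyone following along, the pooling idea is just this: keep a small fixed set of real backend connections and let the many clients borrow them in turn. A minimal sketch (the `connect` factory is a stand-in, since the thread doesn't name a driver; any real DB connect function would slot in):

```python
import queue

class ConnectionPool:
    """Minimal fixed-size pool: `size` real connections shared by many clients."""

    def __init__(self, connect, size):
        # Open all backend connections up front via the supplied factory.
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(connect())

    def acquire(self, timeout=None):
        # Blocks until one of the pooled connections is free
        # (raises queue.Empty if timeout expires first).
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        # Hand the connection back for the next waiting client.
        self._pool.put(conn)

# Usage with a dummy factory: 10k clients could share, say, 50 backends.
pool = ConnectionPool(connect=lambda: object(), size=50)
conn = pool.acquire()
# ... run queries on conn ...
pool.release(conn)
```

With a pool like this the server only ever sees `size` backends, so the RAM estimate above shrinks by a factor of connections/size.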