Thread: Server load planning

Server load planning

From: Dan Sugalski
Date: Mar 27, 2005, 11:38 AM
I'm trying to get a handle on how an app I'm looking to roll out is
going to impact the server I'm connecting to, and what sort of
capacity planning's going to be needed to make it all work relatively
well.

I'm looking at around 250-300 simultaneous users, nearly all of them
doing interactive work. (Curses-based screen stuff for the most part)
I'm not too worried about the server we've got for them being able to
handle that except... for reasons that are fairly annoying, I'm
looking at somewhere in excess of 9K simultaneous connections to the
database server, and I'm not in a position to cut that down any. (The
app suite's written in an old 4GL that assumes an ISAM database.
We're porting to a modern database and runtime, but we have to
preserve the DB semantics of the original database. Nasty, but there
you go)

I know each of the back-end processes is going to suck down some
resources on the server, but am I going to hit coordination or
inter-process sync delays with that many different back ends going at
once? (And is there a good way, short of just running some load
tests, to estimate the costs involved?)
--
                Dan

--------------------------------------it's like this-------------------
Dan Sugalski                          even samurai
dan@sidhe.org                         have teddy bears and even
                                       teddy bears get drunk

Re: Server load planning

From: Thomas F. O'Connell
Dan,

You can get a sense of how much memory you will need from the shorthand
presented in Table 16-2 for calculating the value of SHMMAX:

http://www.postgresql.org/docs/8.0/static/kernel-resources.html#SYSVIPC-PARAMETERS
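
For a ballpark, you can turn that table's shorthand into a quick
back-of-the-envelope calculation. The constants below are my reading of
the approximate 8.0 figures (a fixed overhead plus a per-buffer and
per-connection cost), so treat them as assumptions and plug in whatever
the table actually says for your version:

# Rough SHMMAX estimate per the Table 16-2 shorthand (approximate 8.0
# figures: ~250 kB fixed, ~8.2 kB per shared_buffers, ~14.2 kB per
# connection slot).
def shmmax_estimate_kb(shared_buffers, max_connections):
    return 250 + 8.2 * shared_buffers + 14.2 * max_connections

# e.g. 10000 shared_buffers plus headroom for 9500 connections
kb = shmmax_estimate_kb(10000, 9500)
print("approx %.0f kB (%.0f MB) of shared memory" % (kb, kb / 1024))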

Otherwise, you'll need to include some estimate of work_mem and
maintenance_work_mem based on your knowledge of your queries:

http://www.postgresql.org/docs/8.0/static/runtime-config.html#RUNTIME-CONFIG-RESOURCE
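
At that connection count the per-backend numbers dominate, so even a
crude model helps. Here's a sketch; the per-backend overhead figure is
an assumed ballpark rather than something from the docs, and work_mem
can be allocated more than once per query (one chunk per sort or hash
step):

# Crude per-backend memory model: assumed fixed overhead plus work_mem
# allocations. All figures here are assumptions to be replaced with
# your own measurements.
def backend_mem_mb(work_mem_mb=1.0, sorts_per_query=1, overhead_mb=1.0):
    return overhead_mb + work_mem_mb * sorts_per_query

connections = 9000
total_mb = backend_mem_mb() * connections
print("approx %.0f MB across %d backends" % (total_mb, connections))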

As far as disk I/O and contention at that level, I'm not sure how that
will be affected by the sheer number of connections. There's a simple
utility in contrib called pgbench that you could use to do some
testing.
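
For instance, something along these lines (flag names are from the
8.0-era contrib version, so double-check them against your copy):

  createdb pgtest
  pgbench -i -s 10 pgtest
  pgbench -c 100 -t 100 pgtest

and then watch how throughput and system load change as you raise -c
toward the connection counts you're expecting.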

-tfo

--
Thomas F. O'Connell
Co-Founder, Information Architect
Sitening, LLC

Strategic Open Source — Open Your i™

http://www.sitening.com/
110 30th Avenue North, Suite 6
Nashville, TN 37203-6320
615-260-0005

On Mar 27, 2005, at 11:38 AM, Dan Sugalski wrote:

> I'm trying to get a handle on how an app I'm looking to roll out is
> going to impact the server I'm connecting to, and what sort of
> capacity planning's going to be needed to make it all work relatively
> well.
>
> I'm looking at around 250-300 simultaneous users, nearly all of them
> doing interactive work. (Curses-based screen stuff for the most part)
> I'm not too worried about the server we've got for them being able to
> handle that except... for reasons that are fairly annoying, I'm
> looking at somewhere in excess of 9K simultaneous connections to the
> database server, and I'm not in a position to cut that down any. (The
> app suite's written in an old 4GL that assumes an ISAM database. We're
> porting to a modern database and runtime, but we have to preserve the
> DB semantics of the original database. Nasty, but there you go)
>
> I know each of the back-end processes is going to suck down some
> resources on the server, but am I going to hit coordination or
> inter-process sync delays with that many different back ends going at
> once? (And is there a good way, short of just running some load tests,
> to estimate the costs involved?)