If there are potentially hundreds of clients at a time, then you may be
running into the maximum connection limit.
In postgresql.conf there is a max_connections setting, which IIRC
defaults to 100. If you try to open more concurrent connections to the
backend than that, the new connection attempts will be refused.
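As a quick sanity check you can ask the server directly. Here's a
minimal sketch in Python with psycopg2; the connection parameters are
made up, so substitute your own:

    import psycopg2

    # Placeholder connection parameters; substitute your own.
    conn = psycopg2.connect(host="dbserver", dbname="experiments",
                            user="agent", password="secret")
    cur = conn.cursor()

    # The configured limit on concurrent backends.
    cur.execute("SHOW max_connections;")
    print("max_connections:", cur.fetchone()[0])

    # Connections currently open, per the standard system view.
    cur.execute("SELECT count(*) FROM pg_stat_activity;")
    print("currently open:", cur.fetchone()[0])

    cur.close()
    conn.close()

If "currently open" is bumping against the limit when agents start
failing, that's your answer.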
If your DB server is fairly gnarly (plenty of memory and CPU to spare)
and your performance needs are minimal, it should be safe to just
increase max_connections. An alternative approach would be to add some
kind of database broker program: instead of having each agent connect
directly to the database, the agents would pass their data to the
broker, which could then implement connection pooling, as in the
sketch below.
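psycopg2 ships a simple thread-safe pool that such a broker could be
built around. A rough sketch (untested; the table name and connection
parameters are invented for illustration):

    import psycopg2.pool

    # One shared pool inside the broker: agents may number in the
    # hundreds, but the database only ever sees up to maxconn backends.
    # Connection parameters are placeholders.
    pool = psycopg2.pool.ThreadedConnectionPool(
        minconn=1, maxconn=10,
        host="dbserver", dbname="experiments", user="broker")

    def store_result(sql, params):
        # Borrow a connection, run one short transaction, return it.
        conn = pool.getconn()
        try:
            cur = conn.cursor()
            cur.execute(sql, params)
            conn.commit()
            cur.close()
        finally:
            pool.putconn(conn)

    # Hypothetical table, just to show the call shape.
    store_result("INSERT INTO results (agent_id, value) VALUES (%s, %s)",
                 (42, "3.14159"))

The point is that the agents keep their transactions short while the
database never sees more than maxconn backends, no matter how many
agents are running.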
-- Mark Lewis
On Tue, 2005-04-12 at 22:09, Slavisa Garic wrote:
> This is a serious problem for me as there are multiple users using our
> software on our server and I would want to avoid having connections
> open for a long time. In the scenario mentioned below I haven't
> explained the magnitude of the communications happening between Agents
> and DBServer. There could possibly be 100 or more Agents per
> experiment, per user running on remote machines at the same time,
> hence we need short transactions/pgsql connections. Agents need a
> reliable connection because failure to connect could mean a loss of
> computation results that were gathered over long periods of time.