Re: Persistent Connections - Mailing list pgsql-general

From Tim Cross
Subject Re: Persistent Connections
Msg-id 87366lpbno.fsf@gmail.com
In response to Persistent Connections  ("Bee.Lists" <bee.lists@gmail.com>)
Responses Re: Persistent Connections  ("Bee.Lists" <bee.lists@gmail.com>)
List pgsql-general
Bee.Lists <bee.lists@gmail.com> writes:

> I have an issue with a server (v10) that’s seeing increasing connections until it’s maxed out.
>
> max_connections for my 4-core server is set to 12.
>
> I’ve installed pg_stat_activity and pg_stat_statements.
>
> I access this server through a web app, and another client on two machines.
> I also send queries through SSH on the LAN.  psql queries indicate there are
> too many clients already.  I’ve been getting errors from my workstation
> through a Rubygem that says dropped connections happen.  Essentially, the
> server isn’t giving up connections and clients from all sides are showing
> some issues.
>
> pg_stat_activity has shown a handful of idle queries that are quite simple.
> I’ve tested those queries and they seem fine on a workstation client.  I’m
> assuming these queries somehow don’t finish and leave the connection open,
> but I could be wrong.  All of this is new to me.
>
> It was suggested on Slack that it sounds like my workstation had some TCP
> issues with these connections, and that it was a firewall.  I disabled the
> firewall and the same happened.  The firewall has been restarted.
>
> I am running no connection pool tool of any sort, and I’ve never used one.
>
> At this point I don’t know what to even be watching for.  Connections
> increase and connections “drop”, according to the gem I’m using.  I have
> simple queries that are idle and won’t disappear for some reason.
>
> How can I figure this out so the connections remain within the max_connections limit, and connections are not
dropped?  
>
> Any insight appreciated.
>

Sounds like your web app may not be closing connections once it has
finished with them. The fact you're seeing idle connections would seem to
support this. I would verify there isn't a code path in your client
application which fails to close a connection correctly. Seeing
connections go up and down in the gem may not be telling you the full
story - it could be that your client connection objects are being
destroyed in your app but are not closing the connection correctly, so PG
is unaware the client has finished.

Typically, due to the overhead of making a connection, you don't want
your client app/web app to create a new connection for every query.
Instead, you would use some type of connection pool. Many development
languages support some form of pooling as part of their db connection
library (don't know about Ruby, but JS, Python, PHP, Java, Perl all do)
and there are external 3rd party solutions like pgbouncer which sit
between your client and the database and work a bit like a connection
broker which will manage a pool of connections.
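To make the broker idea concrete, here is a minimal sketch of what a pool does, in Ruby since a Rubygem is in play. The `FakeConnection` class is a stand-in I've invented so the sketch runs without a database; in real code you would hand the pool something like `PG.connect` from the pg gem, or just use an existing gem such as connection_pool.

```ruby
# Stand-in for a real database connection (e.g. one from PG.connect);
# purely illustrative so this sketch runs without a database server.
class FakeConnection
  def exec(sql); "result of: #{sql}"; end
  def close; end
end

# A minimal connection pool: at most `size` connections ever exist, so the
# database never sees more than `size` sessions from this client, and
# callers block until a connection is free instead of opening new ones.
class TinyPool
  def initialize(size)
    @queue = Queue.new               # thread-safe checkout/checkin
    size.times { @queue << FakeConnection.new }
  end

  # Check a connection out, run the block, and always check it back in.
  def with
    conn = @queue.pop                # blocks if all connections are in use
    begin
      yield conn
    ensure
      @queue << conn                 # returned even if the block raises
    end
  end
end

pool = TinyPool.new(4)               # like a max pool size of 4
puts pool.with { |conn| conn.exec("SELECT 1") }
```

With a cap like this, a leak shows up as callers blocking on checkout rather than as the server hitting max_connections, which is much easier to spot and debug.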

From the description of what you're doing, I would first look to see what
level of connection pooling your development language supports. This
will likely be easier to configure and use than setting up a whole
additional piece of infrastructure which is probably far more powerful
than you need.

I would also go through your code and make sure that every time you
create a database connection, there is some code which closes that
connection once you're finished with it. This is a frequent source of
problems, especially during development, when your code might throw an
error and the code you have to disconnect does not get executed, or you
simply forget to issue a disconnect when you're finished.
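In Ruby the usual way to guarantee the disconnect happens even when a query raises is a begin/ensure block. A sketch of the pattern, using an invented `StubConnection` in place of a real `PG.connect` so it is self-contained:

```ruby
# Stub standing in for a real connection object (e.g. from PG.connect);
# illustrative only, so this runs without a database server.
class StubConnection
  attr_reader :closed
  def initialize; @closed = false; end
  def exec(sql)
    raise ArgumentError, "empty query" if sql.empty?
    "ok"
  end
  def close; @closed = true; end
end

def run_query(sql)
  conn = StubConnection.new    # in real code: PG.connect(dbname: "mydb")
  begin
    conn.exec(sql)
  ensure
    conn.close                 # runs even if exec raised, so no idle
  end                          # session is left behind on the server
end

puts run_query("SELECT 1")
```

Without the ensure, an exception between connect and close is exactly the kind of code path that leaves the server-side session sitting idle until it hits max_connections.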

A connection pool can help in tracking down such issues as well. Most
pooling solutions will allow you to set a max pool size. In addition to
enabling you to 'reserve' a set number of connections for a client, you
will know which client seems to be running out of connections, helping
to identify the culprit.

--
Tim Cross


