Re: Persistent Connections - Mailing list pgsql-general

From Tim Cross
Subject Re: Persistent Connections
Msg-id 87pn9ooxhm.fsf@gmail.com
In response to Re: Persistent Connections  ("Bee.Lists" <bee.lists@gmail.com>)
List pgsql-general
Bee.Lists <bee.lists@gmail.com> writes:

>> On Jun 23, 2020, at 8:09 PM, Tim Cross <theophilusx@gmail.com> wrote:
>>
>> Sounds like your web app may not be closing connections once it has
>> finished with them? The fact you're seeing idle connections would seem
>> to support this. I would be verifying there isn't a code path in your
>> client application which is failing to close a connection correctly.
>> Seeing connections go up and down in the gem may not be telling you the
>> full story - it could be that your client connection objects are being
>> destroyed in your app, but are not closing the connection correctly, so
>> PG is unaware the client has finished.
>
> Hi Tim.  I can’t speak for the gem.  I’m assuming its garbage collection is working.  But yes, it does look that way.
> I found someone else who was having issues similar to mine:
>
> https://stackoverflow.com/questions/60843123/djangopostgres-fatal-sorry-too-many-clients-already
>
> I’m also seeing the connection count rise overnight from crontabs.
>
> For some clarity, the gem is Sequel, which is on top of the PG gem (Ruby).  I’ve spoken to the Sequel author and he
> says everything is fine.  I have some evidence it’s a connection issue and the gem is doing its job, as I’m seeing
> it’s happening elsewhere with crontabs and other clients.
>
Rather than a problem with the libraries, I would be looking for a
problem with the code which uses those libraries. If it was a problem
with either the Sequel or PG gems (or with Postgres, for that matter),
it would be far more widespread and you would be seeing a lot more
reports.

I'm not familiar with Sequel and haven't used Ruby for nearly 20 years,
but I have used plenty of other PG libraries. You mention garbage
collection, and I'm sure that is working fine in Ruby. However, you
cannot rely on it to correctly clean up your PG connections. Somewhere
in your code there has to be an explicit call which tells the library
you are finished with the connection. For example, the JS pg package
has the 'end' method. Essentially, your code needs to tell the remote
PG server you have finished with the connection so that it knows it can
clean things up on its end.

If your code is not issuing explicit disconnect calls, what is
happening is that the connections on your client side are being cleaned
up when the connection object goes out of scope and the garbage
collector kicks in. However, on the PG side, the connections hang
around until PG times them out, which takes much longer and could
easily cause you to hit the limit, especially as you have such a small
limit. I think you need to verify that somewhere in all your client
code there is explicit code being called which tells PG you are
disconnecting the connection. Don't assume this is automagically
happening as part of GC.
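
In Sequel terms (a sketch only - the connection URL is a placeholder,
not something from your setup), the safest pattern for a short-lived
script such as a crontab job is the block form of Sequel.connect, which
disconnects for you when the block exits:

  require 'sequel'

  # Block form: Sequel disconnects the database (closing its pooled
  # connections) as soon as the block exits, even if an error is raised.
  Sequel.connect(ENV.fetch('DATABASE_URL', 'postgres://localhost/mydb')) do |db|
    puts db.fetch('SELECT count(*) AS n FROM pg_stat_activity').first[:n]
  end

If you hold the handle in a constant instead, you need an explicit
DB.disconnect before the script exits.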
>
>> Typically, due to the overhead of making a connection, you don't want
>> your client app/web app to create a new connection for every query.
>> Instead, you would use some type of connection pool. Many development
>> languages support some form of pooling as part of their db connection
>> library (don't know about Ruby, but JS, Python, PHP, Java, Perl all do)
>> and there are external 3rd party solutions like pgbouncer which sit
>> between your client and the database and work a bit like a connection
>> broker which will manage a pool of connections.
>
> That’s why I’m thinking installing a connection pooler would solve all of this.  pgbouncer is what I’m looking at
> now.
>

I doubt this will solve your problem. It might hide the problem, or it
might lengthen the time between failures, but it is very unlikely to
fix the underlying issue. It may, however, help identify the source of
the problem.

Have you verified the PG gem doesn't support pooling? If it does, that
would be far easier to configure and use than installing
pgbouncer. Based on your description of the app and the small number of
connections you have PG configured for, adding pgbouncer is like putting
a fighter jet engine in a family SUV.

>> From the description of what you're doing, I would first look to see
>> what level of connection pooling your development language supports.
>> This will likely be easier to configure and use than setting up a whole
>> additional bit of infrastructure which is probably far more powerful
>> than you need.
>
> I will do that.  This is all new.
>
>> I would also go through your code and make sure that every time you
>> create a database connection, there is some code which closes that
>> connection once you're finished with it. This is a frequent source of
>> problems, especially during development, when your code might throw an
>> error and the code you have to disconnect does not get executed, or you
>> simply forget to issue a disconnect when you're finished.
>
> The issue here is that it’s all in the gem.  The gem is actually an ORM, built on top of the PG gem.  So all the
> database connection stuff is inside there.  It’s so abstracted that I don’t know when the actual calls are made.
> It’s a DSL, so the workings aren’t evident when making queries.  It was suggested I install a logger to see what
> and when queries are actually made.
>

I have used ORMs in the past. Personally, I hate them as they always
introduce limitations. However, I would strongly suggest checking the
Sequel API and examples.

I just had a very quick look at the Sequel API. Things I immediately
noticed are:

1. It does support connection pooling.
2. There is a Database#disconnect method. You need to verify your
   client code is calling that method before the objects are GC'd.
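
For instance (a sketch only - the URL is a placeholder and the pool
settings are illustrative), the pool is configured directly on connect:

  require 'sequel'

  DB = Sequel.connect(
    ENV.fetch('DATABASE_URL', 'postgres://localhost/mydb'),
    max_connections: 10,  # keep the pool well under PG's max_connections
    pool_timeout: 5       # seconds to wait for a free connection before raising
  )

  # Threads share the pool; a connection is checked out per operation
  # and returned to the pool automatically.
  DB[:pg_stat_activity].count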

>> A connection pool can help in tracking down such issues as well. Most
>> pooling solutions will allow you to set a max pool size. In addition to
>> enabling you to 'reserve' a set number of connections for a client, you
>> will know which client seems to be running out of connections, helping
>> to identify the culprit.
>
> That SO link above suggests I’m not alone.  So installing a connection pooler seems the right thing to do.  Good to
knowabout the reservations.  That is better feedback. 
>
>

Use the connection pool provided by the Sequel gem first and see how
that goes. I'm pretty confident the issue will turn out to be in your
code. I would also strongly consider increasing the PG max
connections. What you have configured is extremely low; the default is
100, and 50 or 25 would possibly be more reasonable in your case. It is
quite possible your Ruby scripts are keeping the connections open
in between HTTP requests if you're using something which keeps the
scripts loaded in memory within the web server (a common setup to
reduce spin-up time on requests).
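
To see who is actually holding connections open, something like this
(a sketch, assuming a Sequel DB handle as above) groups the server's
view of its backends by state:

  # Shows how many backends are in each state (active, idle, etc.)
  # and how long the oldest has been sitting in that state.
  DB.fetch(<<~SQL).each { |row| p row }
    SELECT state, count(*) AS n, max(now() - state_change) AS oldest
    FROM pg_stat_activity
    GROUP BY state
  SQL

A pile of long-lived 'idle' entries overnight would point straight at
the crontab scripts not disconnecting.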

--
Tim Cross


