Re: PostgreSQL 9.2 - pg_dump out of memory when backing up a database with 300000000 large objects - Mailing list pgsql-admin

From bricklen
Subject Re: PostgreSQL 9.2 - pg_dump out of memory when backing up a database with 300000000 large objects
Date
Msg-id CAGrpgQ_safsytHcJyBwo2fT6Eu01=hJwjiZ2juac1vJQRqCjfg@mail.gmail.com
In response to Re: PostgreSQL 9.2 - pg_dump out of memory when backing up a database with 300000000 large objects  (Giuseppe Broccolo <giuseppe.broccolo@2ndquadrant.it>)
List pgsql-admin

On Tue, Oct 1, 2013 at 4:01 AM, Giuseppe Broccolo <giuseppe.broccolo@2ndquadrant.it> wrote:
Maybe you can improve your database's performance by changing some parameters appropriately:

max_connections = 500                   # (change requires restart)
Set it to 100, the highest value supported by PostgreSQL

Surely you mean that max_connections = 100 is the *default*?
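
For anyone following along: 100 is indeed PostgreSQL's default for max_connections, not an upper limit; considerably larger values are accepted, each connection slot simply costs some shared memory. A minimal sketch of checking and raising it on a 9.2-era server (ALTER SYSTEM only arrived in 9.4, so on 9.2 the change goes into postgresql.conf and takes effect after a restart):

    -- in psql: confirm the value the server is currently running with
    SHOW max_connections;    -- prints 100 on a stock install

    # in postgresql.conf (this parameter requires a server restart)
    max_connections = 500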
