Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects - Mailing list pgsql-admin

From Giuseppe Broccolo
Subject Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects
Date
Msg-id 524AAB93.7070308@2ndquadrant.it
In response to PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects  (Sergey Klochkov <klochkov@iqbuzz.ru>)
Responses Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects
Re: PostgreSQL 9.2 - pg_dump out of memory when backuping a database with 300000000 large objects
List pgsql-admin
You might be able to improve things by tuning some of these parameters (a consolidated sketch follows the quoted configuration):
>
> PostgreSQL configuration:
>
> listen_addresses = '*'          # what IP address(es) to listen on;
> port = 5432                             # (change requires restart)
> max_connections = 500                   # (change requires restart)
Consider lowering this to 100, the PostgreSQL default; 500 connections reserve more memory than you are likely to need.
> shared_buffers = 16GB                  # min 128kB
This value should not be higher than 8GB; larger settings rarely help and can hurt performance.
> temp_buffers = 64MB                     # min 800kB
> work_mem = 512MB                        # min 64kB
> maintenance_work_mem = 30000MB          # min 1MB
Given 96GB of RAM, a value around 4800MB (roughly 5% of RAM) would be more appropriate than 30000MB.
> checkpoint_segments = 70              # in logfile segments, min 1, 16MB each
> effective_cache_size = 50000MB
Given 96GB of RAM, you could set this up to 80GB (roughly 80% of RAM).
>
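Putting it together, here is a sketch of how postgresql.conf could look with the values suggested above, assuming the 96GB of RAM mentioned in this thread; treat these as starting points rather than hard rules:

    max_connections = 100             # back to the default; 500 reserves memory you likely don't need
    shared_buffers = 8GB              # values above ~8GB rarely help on 9.2
    temp_buffers = 64MB               # unchanged
    work_mem = 512MB                  # unchanged
    maintenance_work_mem = 4800MB     # roughly 5% of 96GB
    checkpoint_segments = 70          # unchanged
    effective_cache_size = 80GB       # roughly 80% of 96GB

Note that changing max_connections or shared_buffers requires a server restart; the other settings take effect on a reload (pg_ctl reload).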

Hope this helps.

Giuseppe.

--
Giuseppe Broccolo - 2ndQuadrant Italy
PostgreSQL Training, Services and Support
giuseppe.broccolo@2ndQuadrant.it | www.2ndQuadrant.it


