Re: pg_dump out of memory - Mailing list pgsql-general

From George Neuner
Subject Re: pg_dump out of memory
Date
Msg-id 5ajojdh1qdj8ieftktcp95fo8hcq96m9pj@4ax.com
In response to pg_dump out of memory  (Andy Colson <andy@squeakycode.net>)
List pgsql-general
On Tue, 3 Jul 2018 21:43:38 -0500, Andy Colson <andy@squeakycode.net>
wrote:

>Hi All,
>
>I moved a physical box to a VM, and set its memory to 1Gig.  Everything
>runs fine except one backup:
>
>
>/pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
>
>pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
>pg_dump: Error message from server: ERROR:  out of memory
>DETAIL:  Failed on request of size 1073741823.
                                    ^^^^^^^^^^

The server is trying to allocate 1GB (1073741823 = 2^30 - 1 bytes) to
feed that row to COPY.  Obviously it can't if 1GB is all the VM has.
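
FWIW, you can take pg_dump out of the picture by running the reported
COPY by hand in psql; since the allocation happens server-side, it
should fail the same way:

  wildfire=# COPY public.ofrrds (id, updateddate, bytes) TO stdout;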


>pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO
>stdout;
>
>wildfire=# \dt+ ofrrds
>                   List of relations
> Schema |  Name  | Type  | Owner | Size  | Description
>--------+--------+-------+-------+-------+-------------
> public | ofrrds | table | andy  | 15 MB |
>
>
>wildfire=# \d ofrrds
>              Table "public.ofrrds"
>   Column    |          Type          | Modifiers
>-------------+------------------------+-----------
> id          | character varying(100) | not null
> updateddate | bigint                 | not null
> bytes       | bytea                  |
>Indexes:
>    "ofrrds_pk" PRIMARY KEY, btree (id)
>

There must be a heck of a lot of data in that bytea column.  Note that
the 15 MB from \dt+ is the on-disk size, with the bytea values
TOAST-compressed; on output they get decompressed, and then roughly
doubled again by the hex encoding COPY uses, so one fat row can need
far more memory than the table size suggests.
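
If you want to see how big the values really are, a query along these
lines should tell you (octet_length() reports the uncompressed length
of a bytea, pg_column_size() its compressed on-disk size):

  SELECT id,
         octet_length(bytes)   AS uncompressed_bytes,
         pg_column_size(bytes) AS on_disk_bytes
  FROM public.ofrrds
  ORDER BY octet_length(bytes) DESC
  LIMIT 5;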


>I'm not sure how to get this backup to run.  Any hints would be appreciated.

As Adrian mentioned already, you're going to have to give it more
memory somehow.  Either more RAM or a big swap file.
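
If you go the swap route on Linux, it's quick to set up; a minimal
sketch, assuming root access and ~2GB free on a filesystem that
supports fallocate (use dd otherwise):

  fallocate -l 2G /swapfile
  chmod 600 /swapfile
  mkswap /swapfile
  swapon /swapfile

Add an entry to /etc/fstab if you want it to survive a reboot.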

George


