Thread: out of memory in backup and restore

out of memory in backup and restore

From
Thomas Markus
Date:
Hi,

I'm running PostgreSQL 8.1.0 on a Debian Linux (64-bit) box (dual Xeon, 8 GB RAM).
pg_dump fails with an error when exporting a large table with blobs
(the largest blob is 180 MB).

error is:
pg_dump: ERROR:  out of memory
DETAIL:  Failed on request of size 1073741823.
pg_dump: SQL command to dump the contents of table "downloads" failed:
PQendcopy() failed.
pg_dump: Error message from server: ERROR:  out of memory
DETAIL:  Failed on request of size 1073741823.
pg_dump: The command was: COPY public.downloads ...  TO stdout;
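That DETAIL size is not random: 1073741823 bytes is 2^30 - 1, which matches PostgreSQL's per-allocation ceiling (MaxAllocSize); the COPY output buffer grows by doubling, so once a row's text form passes ~512 MB the next doubling asks for exactly this much. A quick check of the arithmetic:

```shell
# 1073741823 = 2^30 - 1, just under 1 GiB. PostgreSQL caps single
# allocations at this value (MaxAllocSize), so a COPY line buffer that
# doubles past ~512 MB tries to grab exactly this much at once.
echo $(( (1 << 30) - 1 ))                      # prints 1073741823
echo "$(( 1073741823 / (1024 * 1024) )) MiB"   # 1023 MiB
```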

If I run pg_dump with -d, the dump completes in all formats (c, t, p), but I can't
restore it (out of memory error, or "corrupt tar header at ...").

How can I back up (and restore) such a database?

kr
Thomas

Re: out of memory in backup and restore

From
"Shoaib Mir"
Date:
Can you please show the database server logs and the syslog from the time it goes out of memory?

Also, how much RAM do you have available, and what is SHMMAX set to?

------------
Shoaib Mir
EnterpriseDB ( www.enterprisedb.com)


---------------------------(end of broadcast)---------------------------
TIP 4: Have you searched our list archives?

               http://archives.postgresql.org

Re: out of memory in backup and restore

From
Thomas Markus
Date:
Hi,

For the log file contents, see http://www.rafb.net/paste/results/cvD7uk33.html
- cat /proc/sys/kernel/shmmax reports 2013265920
- ulimit is unlimited
The kernel is 2.6.15-1-em64t-p4-smp; the PostgreSQL build is 8.1.0, 32-bit.
The postmaster process is currently using 1.8 GB of RAM.

thx
Thomas



--
Thomas Markus

Tel:    +49 30 29 36 399 - 22
Fax:    +49 30 29 36 399 - 50
Mail:   t.markus@proventis.net
Web:    www.proventis.net
Web:    www.blue-ant.de

proventis GmbH
Zimmerstraße 79-80
10117 Berlin

"proventis: Wir bewegen Organisationen."



Re: out of memory in backup and restore

From
"Marcelo Costa"
Date:
Check the /tmp directory on your server; your system disk may have run out of free space there.

Regards,

Marcelo.


--
Marcelo Costa

Re: out of memory in backup and restore

From
"Shoaib Mir"
Date:
It looks like, with 1.8 GB already in use, there is not much left for the dump to get the chunk of memory it needs. I'm not sure it will help, but try increasing the swap space...
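If you do try growing swap, a minimal sketch (run as root; the /swapfile path and the 4 GB size are only examples, and this needs that much free disk):

```shell
# Create and enable a 4 GB swap file. Path and size are examples.
dd if=/dev/zero of=/swapfile bs=1M count=4096
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
swapon -s    # confirm the new swap area is listed
```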

-------------
Shoaib Mir
EnterpriseDB ( www.enterprisedb.com)


Re: out of memory in backup and restore

From
Thomas Markus
Date:
Hi,

Free disk space is 34 GB (on XFS); the complete database dump is 9 GB.
free -tm reports 6 GB of free RAM and 6 GB of unused swap.
Can I decrease shared_buffers without restarting PostgreSQL?

thx
Thomas


Re: out of memory in backup and restore

From
"Marcelo Costa"
Date:
To decrease shared_buffers you need to restart PostgreSQL.

Also, please run df -h and send the result.
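A sketch of that change-and-restart path, assuming a Debian-style layout (the config path, the 50000 value, and the init script name are all assumptions; note that in 8.1 shared_buffers is counted in 8 kB buffers):

```shell
# All paths and values here are illustrative; adjust for your install.
CONF=/etc/postgresql/8.1/main/postgresql.conf

# Lower shared_buffers (50000 buffers * 8 kB is roughly 390 MB):
sed -i 's/^#\{0,1\}shared_buffers[[:space:]]*=.*/shared_buffers = 50000/' "$CONF"

# shared_buffers is only read at server start, so a reload is not enough:
/etc/init.d/postgresql-8.1 restart
```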


--
Marcelo Costa

Re: out of memory in backup and restore

From
Thomas Markus
Date:
df -h

Filesystem            Size  Used Avail Use% Mounted on
/dev/sda5             132G   99G   34G  75% /
tmpfs                 4.0G     0  4.0G   0% /dev/shm
/dev/sda1              74M   16M   54M  23% /boot


Is there another dump tool that dumps blobs (or everything) as binary content
(not as INSERT statements; maybe even raw database blocks)?





Re: out of memory in backup and restore

From
Tom Lane
Date:
Thomas Markus <t.markus@proventis.net> writes:
> logfile content see http://www.rafb.net/paste/results/cvD7uk33.html

It looks to me like you must have individual rows whose COPY
representation requires more than half a gigabyte (maybe much more,
but at least that) and the system cannot allocate enough buffer space.

It could be that this is a symptom of corrupted data, if you're certain
that there shouldn't be any such rows in the table.

> kernel is 2.6.15-1-em64t-p4-smp, pg version is 8.1.0 32bit

You really need a 64-bit PG build if you want to push
multi-hundred-megabyte field values around --- otherwise there's just
not enough headroom in the process address space.  (Something newer than
8.1.0 would be a good idea too.)
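Rough back-of-the-envelope numbers for that headroom argument (a sketch; the 3 GiB figure is a typical 32-bit Linux limit, not an exact one):

```shell
ADDR_SPACE_MB=3072   # approx. usable address space of a 32-bit Linux process
IN_USE_MB=1800       # what the postmaster was already reported as using
REQUEST_MB=1024      # the ~1 GB allocation that failed during COPY

HEADROOM_MB=$(( ADDR_SPACE_MB - IN_USE_MB ))
echo "headroom: ${HEADROOM_MB} MiB"    # 1272 MiB on paper

# Even though 1272 > 1024, the request must be one contiguous block, and
# shared libraries, stack, and heap fragmentation eat into what is left,
# so the ~1 GB request fails long before physical RAM is exhausted.
```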

            regards, tom lane

Re: out of memory in backup and restore

From
Thomas Markus
Date:
Hi,

I tried various ways to back up this DB.
If I use a separate COPY table TO 'file' WITH BINARY, I can export the
problematic table and restore it without problems. The resulting output file
is much smaller than the default output, and the runtime is much shorter.
Is there any way to tell pg_dump to use a COPY command with the BINARY
option? It seems like it should be possible with the custom or tar format,
but I searched the docs and the man page and can't find anything.
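As far as I can tell pg_dump has no such switch, but the manual route above can be scripted; a sketch (the database name and file path are placeholders):

```shell
DB=mydb   # placeholder database name

# Export the big table with binary COPY. The server process writes the
# file, so the path must be writable by the postgres server user:
psql -d "$DB" -c "COPY public.downloads TO '/backup/downloads.bin' WITH BINARY"

# On restore, load it back the same way (after the table exists again):
psql -d "$DB" -c "COPY public.downloads FROM '/backup/downloads.bin' WITH BINARY"
```

The rest of the schema and data can still go through pg_dump/pg_restore as usual; only this one table's contents are handled out of band.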

thx
Thomas

Tom Lane schrieb:
> Thomas Markus <t.markus@proventis.net> writes:
>
>> logfile content see http://www.rafb.net/paste/results/cvD7uk33.html
>>
>
> It looks to me like you must have individual rows whose COPY
> representation requires more than half a gigabyte (maybe much more,
> but at least that) and the system cannot allocate enough buffer space.
>
Yes: the message is "DETAIL:  Failed on request of size 546321213." (that's 521 MB).
> It could be that this is a symptom of corrupted data, if you're certain
> that there shouldn't be any such rows in the table.
>
No, I'm not certain of that.
>
>> kernel is 2.6.15-1-em64t-p4-smp, pg version is 8.1.0 32bit
>>
>
> You really need a 64-bit PG build if you want to push
> multi-hundred-megabyte field values around --- otherwise there's just
> not enough headroom in the process address space.  (Something newer than
> 8.1.0 would be a good idea too.)
>
I can't change the DB installation, but that's another problem.
>             regards, tom lane
>