Thread: pg_dump problem

pg_dump problem

From: Mathieu Arnold
Hi

I have a table with more than 4M entries and pg_dump refuses to dump it;
it just sits there waiting. Since I'm backing up my db with pg_dump, this
is a real problem for me.

$ postmaster -V
postmaster (PostgreSQL) 7.1
$ pg_dump -V
pg_dump (PostgreSQL) 7.1

--
Mathieu Arnold

Re: pg_dump problem

From: Andrew Gould
Please show us how you executed pg_dump.  That is,
what syntax did you use?

Andrew Gould

Re: pg_dump problem

From: Mathieu Arnold

Andrew Gould wrote:
>
> Please show us how you executed pg_dump.  That is,
> what syntax did you use?

pg_dumpall -c -N -D

--
Mathieu Arnold

Re: pg_dump problem

From: Andrew Gould
Give the command a place to put the data dump.  For
example:

pg_dumpall -c -N -D > backup

or, if you need it compressed (and assuming you're
running Unix or Linux), pipe it through gzip:

pg_dumpall -c -N -D | gzip -c > backup.gz
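
To restore such a dump later, you can feed it back through psql.  The
pg_dumpall output carries its own \connect commands, so any starting
database will do, e.g.:

gunzip -c backup.gz | psql template1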


Best of luck,

Andrew Gould


Re: pg_dump problem

From: Mathieu Arnold

Andrew Gould wrote:
>
> pg_dumpall -c -N -D | gzip -c > backup.gz

Well, it's already doing that:
/opt/pg/bin/pg_dumpall -c -N -D | gzip -c9 > $SAVE$DATE-blemish-pgsql.tgz
but when it gets to the big table, it just sits there doing nothing (and
I mean nothing). It was still working fine at about 3M rows.

--
Mathieu Arnold

Re: pg_dump problem

From: Andrew Gould
What's the status of the server?  Is the file size
still increasing?

Did you vacuum the databases prior to dumping?
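
If not, it may be worth doing before the dump; the vacuumdb script that
ships with PostgreSQL can hit every database in one go:

vacuumdb --all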

Andrew


Re: pg_dump problem

From: Mathieu Arnold

Andrew Gould wrote:
>
> What's the status of the server?  Is the file size
> still increasing?
>
> Did you vacuum the databases prior to dumping?

I'm feeling stupid: the machine was running out of RAM.

pg        3314  5.3 76.7 702304 197816 ttyp1 S    16:00   0:43
|           \_ /opt/pg/bin/pg_dump -c -N -D -Fp backup

pg_dump is using 700MB of RAM...
I won't say that's a good thing, but I found out where the problem was.
I don't know how I'm going to resolve it, though.

--
Mathieu Arnold

Re: pg_dump problem

From: Tom Lane
Mathieu Arnold <arn_mat@club-internet.fr> writes:
> pg_dumpall -c -N -D

Try it without the -D ... that option requires pg_dump to SELECT the
whole table contents into its memory.
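
For illustration (table and column names made up), the data section of
the dump differs like this:

-- default: rows are streamed out with COPY, so memory use stays flat
COPY foo FROM stdin;
1	alpha
2	beta
\.

-- with -D: one INSERT per row, with column names, built from that
-- in-memory SELECT result
INSERT INTO foo (id, name) VALUES (1, 'alpha');
INSERT INTO foo (id, name) VALUES (2, 'beta');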

            regards, tom lane

Re: pg_dump problem

From: Mathieu Arnold
Tom Lane wrote:
>
> Mathieu Arnold <arn_mat@club-internet.fr> writes:
> > pg_dumpall -c -N -D
>
> Try it without the -D ... that option requires pg_dump to SELECT the
> whole table contents into its memory.

Well, without -D I sometimes have restore problems :p
I believe it could be done some other way, but I can't figure out how :)

--
Mathieu Arnold

Re: pg_dump problem

From: Andrew Gould
You can dump individual databases, even individual
tables, separately.  Would this help?
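
For example (substitute your own database and table names):

pg_dump mydb > mydb.sql
pg_dump -t bigtable mydb > bigtable.sql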

Andrew
