Re: Backup Large Tables - Mailing list pgsql-general

From Casey Duncan
Subject Re: Backup Large Tables
Date
Msg-id DFBEF250-DD70-4987-9443-660ABAC141A7@pandora.com
In response to Re: Backup Large Tables  ("Charles Ambrose" <jamjam360@gmail.com>)
List pgsql-general
Are you dumping the whole database or just a single table? If it's
the former, try the latter and see if you still get errors.
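
For example, a single-table dump would look something like this
(mydb and mytable here are just placeholders for your database and
table names):

pg_dump -t mytable mydb > mytable.sql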

If pg_dump is not working, maybe some system table is hosed. What
errors are you getting?

If you can get in via psql, log in as a superuser and execute:

COPY mytable TO '/tmp/mytable.txt';

(COPY to a file needs an absolute path, and the file is written by
the server process, so pick a location it can write to.) That will
dump the table data to a text file which can be re-imported into a
new database using the COPY FROM command. Basically you're just doing
part of what pg_dump does for you by hand.
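
If the new database already has the table definition in place, the
re-import is the mirror image (a minimal sketch, assuming the same
table name and the path used above):

COPY mytable FROM '/tmp/mytable.txt';

If you can't run COPY as a superuser, psql's \copy meta-command does
the same thing from the client side and accepts a relative path.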

-Casey

On Sep 21, 2006, at 9:19 PM, Charles Ambrose wrote:

> Hi!
>
> I encounter errors when dumping the database using pg_dump. The
> database, I think, is corrupt. It was looking for triggers and
> stored procedures that are no longer in the database. This is also
> the reason why I opted to write a program to dump the database.
>
> On 9/22/06, Michael Nolan <htfoot@gmail.com> wrote: I have a table
> with over 6 million rows in it that I do a dump on every night.  It
> takes less than 2 minutes to create a file that is around 650 MB.
>
> Are you maybe dumping this file in 'insert' mode?
> --
> Mike Nolan
>
>
> On 9/21/06, Charles Ambrose <jamjam360@gmail.com> wrote: Hi!
>
> I have fairly large database tables (say an average of 3 million to
> 4 million records each). Using the pg_dump utility takes forever to
> dump the database tables. As an alternative, I have created a
> program that gets all the data from the table and puts it into a
> text file. This alternative was also unsuccessful in dumping the
> database.
>

