Thread: Backup Large Tables

Backup Large Tables

From
"Charles Ambrose"
Date:
Hi!

I have some fairly large database tables (an average of 3 to 4 million records each). Using the pg_dump utility takes forever to dump them. As an alternative, I wrote a program that reads all the data from a table and writes it to a text file, but that attempt at dumping the database was also unsuccessful.

As I see it, I can dump the tables gradually: select a batch of records from the table, write it to a text file, and repeat until all records have been fetched. For this approach I need a primary key that uniquely identifies each record, so that each pass skips data that has already been processed.
The problem with this approach, though, is that my dumping utility will not be generic.
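
Concretely, I have something like this shell sketch in mind (the database, the table, the "id" key column, the output file, and the batch size are all placeholders; it assumes the primary key is an integer and is the first column of the table):

    #!/bin/bash
    # Rough sketch of the batched dump described above (not a generic tool).
    DB=mydb
    BATCH=100000
    LAST=0
    : > mytable.txt                      # start with an empty output file
    while true; do
        # Fetch the next batch, keyed on the primary key.
        CHUNK=$(psql -q -A -t -d "$DB" -c \
            "SELECT * FROM mytable WHERE id > $LAST ORDER BY id LIMIT $BATCH")
        [ -z "$CHUNK" ] && break         # no rows left
        echo "$CHUNK" >> mytable.txt
        # Remember the highest key seen so the next pass skips processed rows.
        LAST=$(echo "$CHUNK" | tail -n 1 | cut -d '|' -f 1)
    done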

Are there any alternatives?

Thanks in advance for any help!








Re: Backup Large Tables

From
"Michael Nolan"
Date:
I have a table with over 6 million rows that I dump every night. It takes less than 2 minutes to create a file of around 650 MB.

Are you maybe dumping this file in 'insert' mode?
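
The default pg_dump output loads the data with COPY statements; a dump made with --inserts writes one INSERT per row, which is slower to generate and far slower to restore. Roughly (database and file names are placeholders):

    pg_dump mydb > mydb-copy.sql            # default: COPY-based, fast
    pg_dump --inserts mydb > mydb-ins.sql   # one INSERT per row: much slower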
--
Mike Nolan




Re: Backup Large Tables

From
"Charles Ambrose"
Date:
Hi!

I encounter errors when dumping the database using pg_dump. I think the database is corrupt: pg_dump was looking for triggers and stored procedures that are no longer in the database. This is also why I opted to write my own program to dump the database.





Re: Backup Large Tables

From
Casey Duncan
Date:
Are you dumping the whole database or just a single table? If it's
the former, try the latter and see if you still get errors.
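
For example, something like this (the table and database names are placeholders):

    # Dump only the one large table; if this works, the problem is
    # probably elsewhere in the catalog.
    pg_dump -t mytable mydb > mytable.sql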

If pg_dump is not working, maybe some system table is hosed. What
errors are you getting?

If you can get in via psql, log in as a superuser and execute:

COPY mytable TO '/tmp/mytable.txt';

That will dump the table data to a text file which can be re-imported
into a new database using the COPY FROM command. Basically you're
just doing part of what pg_dump does for you by hand.
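
Roughly, the round trip looks like this (names and paths are placeholders; server-side COPY needs absolute paths the server process can read and write, and the table must already exist in the target database):

    psql -d mydb  -c "COPY mytable TO '/tmp/mytable.txt'"    # dump the data
    psql -d newdb -c "COPY mytable FROM '/tmp/mytable.txt'"  # re-import it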

-Casey



Re: Backup Large Tables

From
Vivek Khera
Date:


Sounds like you're either woefully misconfigured or woefully underpowered, or you have a short definition of "forever". :-)

Every night we take a dump of our several-hundred-million-row DB in about 49 minutes. We use the "pg_dump -Fc" format, and that comes to a bit over 5 GB of compressed data.
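
Roughly (database and file names are placeholders):

    pg_dump -Fc mydb > mydb.dump     # compressed custom-format dump
    pg_restore -d newdb mydb.dump    # restore into another database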

