Thread: Re: Backup
Joseph M. Day wrote:
>> I am trying to convert an MSSQL DB that's roughly 45GB to Postgres. I am
>> trying to find the equivalent of Full / Incremental / Differential
>> backups. It looks like pg_dump is the equivalent of a full backup, but
>> how do I keep the equivalent of an Incremental or Differential backup?
>>
>> To keep the same functionality, I will need a full backup once a week,
>> and a differential once a day.
>>
> At this time, there is nothing less than full backups.

Thanks Naomi. Is there anybody using this for large DBs? For anything
larger than a couple hundred GB, it would be impractical to make full
backups every day. If anyone out there is using this for large DBs, I'd
love to hear how they are dealing with backups.

Joe
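(For the weekly full piece, pg_dump can simply be scheduled from cron. A minimal sketch — the database name and backup path here are made-up examples, and note that cron requires % to be escaped:)

```conf
# Illustrative crontab entry: full dump every Sunday at 01:00.
# pg_dump gives a consistent snapshot without blocking other sessions;
# there is no built-in incremental or differential dump mode.
0 1 * * 0  pg_dump mydb | gzip > /mnt/backup/mydb-$(date +\%Y\%m\%d).sql.gz
```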
If you are running version 8, you may want to look into WAL archiving:

http://www.postgresql.org/docs/8.0/interactive/backup-online.html

That would form the basis of your incremental backup strategy. I haven't
had the joy of using it myself, however, and do not know of any tools to
make it more convenient; a quick breeze through pgFoundry didn't find
much either.

-----Original Message-----
From: pgsql-admin-owner@postgresql.org
[mailto:pgsql-admin-owner@postgresql.org] On Behalf Of Joseph M. Day
Sent: Wednesday, March 23, 2005 3:02 PM
To: 'Naomi Walker'
Cc: pgsql-admin@postgresql.org
Subject: Re: [ADMIN] Backup
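(For reference, the heart of that 8.0 approach is an archive_command in postgresql.conf plus a periodic base backup. A minimal sketch — the /mnt/backup paths are illustrative, not from the thread:)

```conf
# postgresql.conf -- enable WAL archiving (PostgreSQL 8.0).
# %p expands to the path of the completed WAL segment, %f to its file name.
archive_command = 'test ! -f /mnt/backup/wal/%f && cp %p /mnt/backup/wal/%f'

# Weekly base backup, bracketed by the backup-mode functions; the archived
# WAL segments then serve as the daily "incremental" part, replayed during
# point-in-time recovery:
#   psql -c "SELECT pg_start_backup('weekly');" template1
#   tar -czf /mnt/backup/base-$(date +%Y%m%d).tar.gz $PGDATA
#   psql -c "SELECT pg_stop_backup();" template1
```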
> Is there anybody using this for large DBs? For anything larger than a
> couple hundred GB, it would be impractical to make full backups every
> day. If anyone out there is using this for large DBs, I'd love to hear
> how they are dealing with backups.

You might want to have a look at this:

http://www.postgresql.org/docs/8.0/static/backup-online.html

  -j

--
Jay A. Kreibich              | CommTech, Emrg Net Tech Svcs
jak@uiuc.edu                 | Campus IT & Edu Svcs
<http://www.uiuc.edu/~jak>   | University of Illinois at U/C
Hi,

I have a 61GB base at the moment and do a full online backup each night.
It's not really that much of a strain, so I haven't bothered with cooking
up a scheme for differential backups. Using my simple scripts it takes one
hour, and in my case I end up with 2.5GB (compressed) worth of backup
files. The backup claims two CPUs for the hour that the job runs, but on a
multi-CPU box it's not that much trouble.

My scripts are:

#!/bin/sh
# dbbackup: dump the database, compress the stream, and split it into
# 600MB chunks named <filename>.aa, <filename>.ab, ...
if test $# -lt 2; then
    echo "Usage: dbbackup <basename> <filename>"
else
    /home/postgres/postgresql/bin/pg_dump -h $HOSTNAME $1 | gzip -f - | \
        split --bytes 600m - $2.
fi

and

#!/bin/sh
# dbrestore: reassemble the chunks in order, decompress, and feed the
# SQL to psql.
if test $# -lt 2; then
    echo "Usage: dbrestore <basename> <filename>"
else
    cat $2.* | gzip -d -f - | /home/postgres/postgresql/bin/psql -h $HOSTNAME -f - $1
fi

Cheers,
John

>>> "Joseph M. Day" <jday@gisolutions.us> 03/23/05 8:41 PM >>>
It looks like pg_dump is the equivalent of a full backup, but how do I
keep the equivalent of an Incremental or Differential backup.

To keep the same functionality, I will need a full backup once a week,
and a differential once a day.
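(The compress-and-split pipeline those scripts rely on round-trips like this — a self-contained demo using generated text in a temporary directory in place of a real pg_dump stream; the sizes and file names are illustrative:)

```shell
# Round-trip demo of the pipeline used by the dbbackup/dbrestore
# scripts above, with seq output standing in for pg_dump output.
tmp=$(mktemp -d)
seq 1 100000 > "$tmp/dump.sql"            # stand-in for the SQL dump

# "Backup": compress, then split into 64KB chunks dump.gz.aa, dump.gz.ab, ...
gzip -c "$tmp/dump.sql" | split --bytes 64k - "$tmp/dump.gz."

# "Restore": concatenate the chunks in name order and decompress.
cat "$tmp"/dump.gz.* | gzip -d > "$tmp/restored.sql"

cmp -s "$tmp/dump.sql" "$tmp/restored.sql" && echo "round trip OK"
```

The `cat $2.*` in dbrestore works because split names its chunks in lexicographic order, so the shell glob reassembles them in exactly the order they were written.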