Backup done by newbie

From:
Gunnar Lindholm
Date:
Hello.
I wonder if there is anyone who has administered a "real" database who could
tell me something about backing up the database: what tools to use, how to
actually do the backup, what backup hardware to use, and so on.

The reason for asking is that I might have to back up a database soon, and
the only idea I have right now is to dump the database and make a tar backup
of the dumped data.

The database is not that big; it's about 500k lines in the dump, which is
less than 20MB of data, I guess. (The real problem here is probably that it's
running on a totally undocumented Informix database from the 80s. I'll try to
switch to Postgres some day...)

So, how would you guys do it? I'm quite interested in how you do it on a big
database. I mean, just dumping this database took me more than 2 hours (the
hardware is an x86 running at 300MHz with 64MB RAM and IDE disks, so it would
probably go a lot faster with some "real" hardware), but what should I do when
I get 20MB/2h * 24 * 7 = ~1.7GB of data (not too unrealistic), i.e. when it
would take a week just to dump the database? Would it be possible to just make
a snapshot of the database on weekends and, during the week, just do an
"incremental backup", i.e. log the transactions?

I would be very happy if you would share your experience.
Gunnar.

Re: Backup done by newbie

From:
Ragnar Kjørstad
Date:
On Wed, Sep 06, 2000 at 08:41:44PM +0200, Gunnar Lindholm wrote:
> Hello.
> I wonder if there is anyone who has administered a "real" database who could
> tell me something about backing up the database: what tools to use, how to
> actually do the backup, what backup hardware to use, and so on.

What OS?

> The reason for asking is that I might have to back up a database soon, and
> the only idea I have right now is to dump the database and make a tar backup
> of the dumped data.

Can you afford to stop the database?
Long enough to make a snapshot?
Long enough to copy the data?

On Linux, you can use LVM to create snapshots of devices. This lets you shut
down the database for only a few seconds while you take the snapshot. Other
OSes have similar features.
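
Roughly like this, as a sketch (the volume, mount point, and init script
names here are made up; adjust for your setup):

    # stop the database briefly so the on-disk files are consistent
    /etc/init.d/postgresql stop

    # create a copy-on-write snapshot of the volume holding the data
    lvcreate --size 500M --snapshot --name dbsnap /dev/vg0/dbvol

    # the database can come back up right away
    /etc/init.d/postgresql start

    # back up the frozen snapshot at your leisure, then drop it
    mount /dev/vg0/dbsnap /mnt/dbsnap
    tar cvf /dev/st0 /mnt/dbsnap
    umount /mnt/dbsnap
    lvremove -f /dev/vg0/dbsnap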

You can also try just copying the data while the server is running, but the
copy may be corrupt if the files are written to while you take your backup.

> probably go a lot faster with some "real" hardware), but what should I do when
> I get 20MB/2h * 24 * 7 = ~1.7GB of data (not too unrealistic), i.e. when it
> would take a week just to dump the database? Would it be possible to just make
> a snapshot of the database on weekends and, during the week, just do an
> "incremental backup", i.e. log the transactions?

Snapshotting the device is probably the best way.

Another idea would be to duplicate all I/O (either at the SQL level or at the
system-call level) to another machine, and build some buffering into this
system so you can pause the writes to your secondary server while you are
running your backups.
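
A crude SQL-level sketch of the idea (entirely hypothetical; it assumes your
application can be made to append every modifying statement to a log file):

    # primary: the application appends each write to a replay log, e.g.
    #   echo "INSERT INTO orders VALUES (...);" >> /var/log/sql-replay.log

    # secondary: a cron job rotates the log and replays it
    mv /var/log/sql-replay.log /var/log/sql-replay.work
    psql -f /var/log/sql-replay.work mydb

    # during a backup of the secondary, just disable the replay job;
    # the primary keeps appending, so writes buffer up in the log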


--
Ragnar Kjørstad

Re: Backup done by newbie

From:
"Gavin M. Roy"
Date:
pg_dumpall >outfile and then back up the outfile. Dumping over 1 gig of data
on my P3-500 takes a minimal amount of time (5-10 minutes) as of my last
check. I back up the file to tape using tar cvfy /dev/st0
/usr/local/pgsql/outfile or the equivalent. I have had very good experiences
with this. Another way is to write a script that pulls the database names and
backs up each one individually, roughly as sketched below.
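
Something along these lines (an untested sketch; it assumes you can connect
as a user allowed to read every database, and that the backup directory
already exists):

    #!/bin/sh
    # dump each database into its own file, named by database and date
    BACKUPDIR=/usr/local/pgsql/backups
    DATE=`date +%Y%m%d`

    # ask the server for the database names, skipping the template
    for db in `psql -A -t -c "SELECT datname FROM pg_database WHERE datname != 'template1'" template1`
    do
        pg_dump "$db" > $BACKUPDIR/$db-$DATE.dump
    done

Each dump file can then go to tape with the same tar invocation as above.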

Gavin

Gunnar Lindholm wrote:

> Hello.
> I wonder if there is anyone who has administered a "real" database who could
> tell me something about backing up the database: what tools to use, how to
> actually do the backup, what backup hardware to use, and so on.
>
> The reason for asking is that I might have to back up a database soon, and
> the only idea I have right now is to dump the database and make a tar backup
> of the dumped data.
>
> The database is not that big; it's about 500k lines in the dump, which is
> less than 20MB of data, I guess. (The real problem here is probably that it's
> running on a totally undocumented Informix database from the 80s. I'll try to
> switch to Postgres some day...)
>
> So, how would you guys do it? I'm quite interested in how you do it on a big
> database. I mean, just dumping this database took me more than 2 hours (the
> hardware is an x86 running at 300MHz with 64MB RAM and IDE disks, so it would
> probably go a lot faster with some "real" hardware), but what should I do when
> I get 20MB/2h * 24 * 7 = ~1.7GB of data (not too unrealistic), i.e. when it
> would take a week just to dump the database? Would it be possible to just make
> a snapshot of the database on weekends and, during the week, just do an
> "incremental backup", i.e. log the transactions?
>
> I would be very happy if you would share your experience.
> Gunnar.