[Gllug] best way of making backups
pectw.accounts at googlemail.com
Sun Sep 14 12:32:35 UTC 2008
Richard Jones wrote:
> On Sun, Sep 14, 2008 at 10:59:06AM +0100, Nahuel Marisi wrote:
>> At the moment I make my backups (once every three months, when I panic about
>> my HD dying) simply by copying the folders I want to back up and compressing
>> them into a tar.bz2 file.
> Compressing it is a bad idea. The way that gzip & bzip2 work is that
> an error at any point in the file can corrupt the entire remainder of
> the file.
I do use bzip2. While compression errors could be a problem, I back up
the servers nightly into <server>-<date>.tb2 files, so if last night's
archive is corrupted I can try the night before, then the night before
that. The purpose of backups is to expand your options when things go
pear-shaped, so keeping a library of rarely-changing backups is an
effective way of pre-empting corruption issues. These backups remain on
the server, and are also FTP'd off to a machine a short distance away.
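The nightly job described above can be sketched roughly as follows. The post doesn't include the actual script, so all paths and names here are illustrative stand-ins; the script backs up a small demo directory so it is self-contained, and the verification and pruning steps are my additions in the same spirit.

```shell
#!/bin/sh
# Hedged sketch of a nightly backup job producing dated
# <server>-<date>.tb2 files so older snapshots remain available.
# SRC and DEST are demo stand-ins; point them at real directories.

SERVER=$(hostname -s)
DATE=$(date +%Y%m%d)
SRC=$(mktemp -d)                   # stand-in for the real data directory
DEST=$(mktemp -d)                  # stand-in for the backup library

echo "demo data" > "$SRC/data.txt"

# One dated archive per night: <server>-<date>.tb2
tar -C "$SRC" -cjf "$DEST/$SERVER-$DATE.tb2" .

# Verify the archive is at least readable before trusting it.
tar -tjf "$DEST/$SERVER-$DATE.tb2" >/dev/null && echo "archive OK"

# Keep roughly a month of nightly snapshots; prune anything older.
find "$DEST" -name "$SERVER-*.tb2" -mtime +31 -delete
```

Run from cron (e.g. `5 2 * * * /usr/local/bin/nightly-backup.sh`), each night adds one file to the library rather than overwriting the previous one.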
Also nightly, I make tb2s of the MySQL database files, which are well
under a meg, encrypt them, then upload them to a remote site. Both this
and the above are handled by a script started via cron.
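The dump-compress-encrypt-upload pipeline might look something like this. The post doesn't name the cipher or transfer tool, so the `openssl` symmetric encryption and the commented-out `mysqldump`/`scp` steps are assumptions; a demo file stands in for the real dump so the pipeline itself can be exercised end to end.

```shell
#!/bin/sh
# Sketch of the nightly database backup: dump, compress, encrypt, upload.
# All names are illustrative; the passphrase handling here is for
# demonstration only (a real job would read it from a protected file).

DATE=$(date +%Y%m%d)
WORK=$(mktemp -d)

# In the real job this would be something like:
#   mysqldump --all-databases > "$WORK/db-$DATE.sql"
echo "CREATE TABLE demo (id INT);" > "$WORK/db-$DATE.sql"

# Compress, then encrypt with a symmetric passphrase (assumption: the
# original post does not say which cipher or tool was used).
bzip2 "$WORK/db-$DATE.sql"
openssl enc -aes-256-cbc -pbkdf2 -pass pass:changeme \
    -in "$WORK/db-$DATE.sql.bz2" -out "$WORK/db-$DATE.sql.bz2.enc"

# Upload step omitted here; scp is one option:
#   scp "$WORK/db-$DATE.sql.bz2.enc" backup@remote:/srv/backups/
```

Encrypting before upload means the remote site never holds a readable copy of the database, which matters more for offsite copies than for the on-server library.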
Once a week I burn the first set of backups to CD (CD-R, never CD-RW),
then take them physically offsite.
The two best bits of advice I can give are:
1. Have a library of backups
2. Schedule time to wet-run backup restorations; you'll discover
shortcomings in your system before those shortcomings find you out.
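A wet-run restoration along the lines of point 2 can be as simple as extracting a backup into a scratch directory and diffing it against the live data. The post gives no script for this, so the following is a self-contained sketch with demo stand-in paths.

```shell
#!/bin/sh
# Sketch of a restore rehearsal: extract a backup into a scratch
# directory and compare it against the source. SRC and ARCHIVE are
# demo stand-ins; point them at real data and a real backup.

SRC=$(mktemp -d)
RESTORE=$(mktemp -d)
ARCHIVE=$(mktemp -u).tb2

echo "important" > "$SRC/file.txt"
tar -C "$SRC" -cjf "$ARCHIVE" .

# The rehearsal: restore, then verify the content matches.
tar -C "$RESTORE" -xjf "$ARCHIVE"
diff -r "$SRC" "$RESTORE" && echo "restore verified"
```

Running this against last night's real archive, rather than assuming it is good, is exactly how the shortcomings mentioned above get discovered on your schedule instead of during an outage.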