[Gllug] best way of making backups

Dennis Furey dennis at basis.uklinux.net
Sun Sep 14 12:59:13 UTC 2008


On Sun, Sep 14, 2008 at 12:18:49PM +0100, Phil Reynolds wrote:
> 
> 
> Quoting "Nahuel Marisi" <nahuelmarisi at gmail.com>: 
> 
> > Dear List :P
> > 
> > At the moment I make my backups (once every three months when I
> > panic about my HD dying) simply by copying the folders I want to
> > back up and compressing them as a tar.bz2 file. Obviously this is
> > not the most organised or efficient way of making backups. So I was
> > wondering what programs do you use or suggest in order to do this
> > tedious but necessary task.

I'm surprised no one has mentioned faubackup. I think I learned about
it from a previous thread on this list.

http://faubackup.sourceforge.net/

I use it to make a nightly backup of my whole filesystem hierarchy on an
external USB drive. Because it uses hard links automatically, files that
haven't changed are shared between consecutive backups, so it's possible
to store months of daily snapshots without filling up the disk, and
they're accessible just like normal files.
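
Running it is a one-liner, faubackup <source> <target>, so it's easy to
drop into cron. A rough example with made-up paths, written as an
/etc/crontab entry:

   0 3 * * *  root  faubackup /home /mnt/backup

Unchanged files in consecutive snapshots share an inode, which you can
confirm by running ls -i or stat on the same file in two snapshot
directories.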

The other thing I do is to make a smaller backup of my home and /etc
directories (about 4.7G, small enough to fit on a DVD) which is stored
remotely on Amazon S3 and sometimes locally on DVDs. It costs me about
a pound a month including storage and bandwidth charges.

There are some inconveniences with S3 such as limits on file sizes,
inability to mount it as an ordinary remote filesystem, its tendency
to drop the connection often, and the insecurity of storing personal
information on a system outside one's own administrative control. I
work around these by creating a LUKS encrypted ext2 filesystem image
and mounting it as a loopback device. Then I rsync my home directory
with it, and unmount it and split the raw encrypted image into chunks
of one MB each, which I sync to a bucket on S3 using the s3cmd
utility.

http://s3tools.logix.cz/s3cmd
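
For the first run, s3cmd also needs your AWS credentials and an existing
bucket; something along these lines should do it (your-bucket-name is
just a placeholder, the same one as in the script below):

   s3cmd --configure
   s3cmd mb s3://your-bucket-name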

The syncing is done by a script that retries repeatedly until it
finishes. Some nights it takes ten tries. The script I use is attached
below. Feel free to refactor it, since I'm just an amateur shell
scripter. The necessary files and directories have to be created
manually before the first run (a rough sketch of that setup follows
below). The script might not work unless it's run as root, and it has
to be run interactively because it prompts for the LUKS passphrase.
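
Something along these lines should produce the image and directories the
script expects (untested as written; the 4.7 GB size, the /mnt/x/img
path and the cs3 mapping name are simply the ones the script uses):

   mkdir -p /mnt/x/img /mnt/darl
   cd /mnt/x/img
   # 4700 chunks of 1 MB each, to match the size check in the script
   dd if=/dev/zero of=back.img bs=1MB count=4700
   losetup /dev/loop0 back.img
   cryptsetup luksFormat /dev/loop0
   cryptsetup luksOpen /dev/loop0 cs3
   mkfs.ext2 /dev/mapper/cs3
   cryptsetup luksClose cs3
   losetup -d /dev/loop0
   cp back.img back.old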

Dennis
-------------- next part --------------
#!/bin/bash
#
# nightly backup: rsync selected directories into a LUKS encrypted
# image, push the image to S3 in 1 MB chunks with s3cmd, then take a
# local faubackup snapshot on an external drive
#
# the loop device is named differently depending on whether devfs is in use
if [[ -e /dev/loop/0 ]]; then
   export LOOPDEVICE=/dev/loop/0
else
   export LOOPDEVICE=/dev/loop0
fi
# attach the encrypted image to a loop device and mount it
cd /mnt/x/img
losetup $LOOPDEVICE back.img \
|| {
   echo "losetup error"
   exit 1; }
cryptsetup luksOpen $LOOPDEVICE cs3 \
|| {
   echo "cryptsetup error"
   exit 1; }
mount -t ext2 /dev/mapper/cs3 /media \
|| {
   cryptsetup luksClose cs3
   losetup -d $LOOPDEVICE
   echo "mount error"
   exit 1; }
# record the installed package list so it ends up in the backup
dpkg --get-selections > /home/dennis/dpkgsel.txt
# mirror the selected directories into the encrypted image; if rsync
# fails, unmount and revert to the previous known-good image
rsync -avc --delete --delete-excluded \
   --exclude=/home/dennis/.thumbnails \
   --exclude=/home/dennis/.macromedia \
   --exclude=/home/dennis/.xsession-errors \
   --exclude=/home/dennis/.kde \
   --exclude=/home/dennis/.wine \
   --exclude=/home/dennis/.Trash \
   /boot /etc /home /var/mail /var/backups \
   /media \
|| {
   umount /media
   cryptsetup luksClose cs3
   losetup -d $LOOPDEVICE
   cp back.old back.img
   echo "rsync error"
   exit 1; }
# detach everything so the raw image is consistent before splitting
umount /media
cryptsetup luksClose cs3
losetup -d $LOOPDEVICE
# split the image into 1 MB chunks under a subdirectory, retrying until
# the chunks add up to the full 4,700,000,000 byte image
while
   split -a 4 -b 1MB -d back.img
   mkdir x
   mv x???? x
   [[ $( wc -c x/* | tail -n 1 ) != "4700000000 total" ]]
do
   rm -r x
   echo "retrying split"
done
# upload the chunks, retrying until s3cmd exits cleanly with no
# warnings in the log and nothing on stderr
while
   ( ! s3cmd --no-delete-removed sync x s3://your-bucket-name 2> err.txt > log.txt ) \
   || [[ -n $( grep WARNING log.txt ) ]] \
   || [[ -s err.txt ]]
do
   cat log.txt >> logs.txt
   cat err.txt >> errs.txt
   sleep 5
   echo "retrying s3cmd"
done
rm -r x
# keep a copy of the image just uploaded, so a failed rsync next time
# can be rolled back to it
cp back.img back.old
echo "finished remote backup"
date
# local snapshot of the whole filesystem hierarchy onto the external drive
faubackup --ignore dev --ignore mnt --ignore proc --ignore sys --ignore tmp --ignore media / /mnt/darl \
|| {
   echo "faubackup error"
   exit 1; }
#
# add this after about a month or when your backup medium starts to get full 
#
# remove the oldest backup
# rm -r /mnt/darl/`ls -r /mnt/darl | tail -n 1`
#
echo "finished local backup"
date
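#
# restoring from S3 would go roughly like this (untested sketch):
#
# s3cmd sync s3://your-bucket-name/x/ x/
# cat x/x???? > back.img
# losetup /dev/loop0 back.img        # or /dev/loop/0 under devfs
# cryptsetup luksOpen /dev/loop0 cs3
# mount -t ext2 /dev/mapper/cs3 /media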
-------------- next part --------------
-- 
Gllug mailing list  -  Gllug at gllug.org.uk
http://lists.gllug.org.uk/mailman/listinfo/gllug

