To complement my Time Machine backups I wanted to add an rsync backup. This is relatively easy from the command line but needs some care when scripting it in bash.
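A one-off run from the terminal shows the basic shape (the paths and NAS address here are the same ones used in the script below):

rsync -av --exclude ".Trash" --exclude "Library" /Users/stuart/ 192.168.178.41:/volume1/Rsync/MacbookAir

The trailing slash on the source matters: rsync then copies the contents of the directory rather than the directory itself. Once the exclusion list grows this gets unwieldy on a single command line, which is what the script handles.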
#!/bin/bash
# Directory to backup
LOCAL=/Users/stuart/
# Location to backup to
REMOTE=192.168.178.41:/volume1/Rsync/MacbookAir
# Array of directory names or patterns to exclude
# IMPORTANT: these paths are relative to the source directory
EXCLUDES=( ".gem" ".config" ".pcloud" "Applications" \
"Downloads" "IDrive-Sync" "*.lrdata" "*.lrlibrary" \
"IDrive Downloads" ".Trash" "Library" \
"pCloud Drive/" "Music" "*public" )
# Run time options
# -a (archive) implies recursion and preserves permissions, times etc.
# add -n (--dry-run) while testing, especially in combination with --delete
# --log-file is optional
OPTIONS=(-av --delete --log-file=rsync.log)
# Declare an empty array
exclude_opts=()
# Cycle through the array of exclusions and build a
# new array with "--exclude" before each entry
for item in "${EXCLUDES[@]}"
do
exclude_opts+=( --exclude "$item" )
done
# Call rsync. The quoting matters here: "${exclude_opts[@]}" expands to
# one word per array element, so patterns containing spaces survive intact
rsync "${OPTIONS[@]}" "${exclude_opts[@]}" "$LOCAL" "$REMOTE"
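Before trusting any command that includes --delete it pays to rehearse. Adding -n (--dry-run) to the options makes rsync report what it would transfer and delete without touching a single file:

OPTIONS=(-avn --delete --log-file=rsync.log)

Once the dry-run output looks sane, remove the -n and let it run for real.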
I added this to a crontab to run at 02:00 every day.
0 2 * * * /Users/stuart/bin/rsyncbackup
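The script needs to be executable before cron can run it, and the entry is installed with crontab -e:

chmod +x /Users/stuart/bin/rsyncbackup
crontab -e    # add the 0 2 * * * line above

Cron jobs run with a minimal environment, PATH in particular, which is worth keeping in mind if the script grows.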
Of course it doesn't take much to work out that this is not really a backup, it's file synchronisation. Removing the `--delete` option would come closer to a real backup, but any change to a file still overwrites the previous version, so there is no history. I have a couple of Raspberry Pi projects collecting daily data that I wanted to keep, so I extended the script above to back up to two different NASes and to timestamp all the data. These are not incremental backups, but the data volume is low enough that full daily backups are a non-issue, both in terms of bandwidth and storage.
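As an aside, rsync itself can preserve the versions that `--delete` or an overwrite would otherwise destroy, via its `--backup` and `--backup-dir` options. A sketch, with an illustrative directory name:

# replaced and deleted files are moved to a sibling directory on the receiver
# (a relative --backup-dir is resolved against the destination directory)
rsync -av --delete --backup --backup-dir=../MacbookAir-old "$LOCAL" "$REMOTE"

For the Raspberry Pi data, though, dated full archives were simpler, hence the script below.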
#!/bin/bash
BACKUP_LOCATION=/tmp
REMOTE=(192.168.178.41 192.168.178.32)
REMOTE_DIR=/volume1/Rsync/$HOSTNAME
FILENAME=$(date +"%Y%m%d-%H%M%S")-$HOSTNAME-backup.gz
DB=/home/stuart/DS18B20
TAR_OPTIONS="-czvf"
BACKUP_DIRS=("/home/stuart")
# exclude the live db files; the consistent copies made below are archived instead
EXCLUDES=( "Downloads" ".*" "*.db" )
# Make a backup of the databases (there is only one for now);
# sqlite3's .backup command copies a database safely even while it is in use
for data in "$DB"/*.db
do
sqlite3 "$data" ".backup '$data.backup'"
done
exclude_opts=()
for item in "${EXCLUDES[@]}"
do
exclude_opts+=(--exclude "$item" )
done
# create the compressed tar file; "${BACKUP_DIRS[@]}" expands to every
# directory in the array, not just the first element
tar "${exclude_opts[@]}" $TAR_OPTIONS "$BACKUP_LOCATION/$FILENAME" "${BACKUP_DIRS[@]}"
# Now copy these backups to each NAS
for DEST in "${REMOTE[@]}"
do
echo "Copying backup to $DEST"
scp -p "$BACKUP_LOCATION/$FILENAME" "$DEST:$REMOTE_DIR"
done
done
# remove local backups more than 8 days old
/usr/bin/find "$BACKUP_LOCATION" -type f -mtime +8 -name "*.gz" -delete
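A backup is only as good as its restore, so it's worth walking through the round trip once by hand. A sketch with illustrative archive and database names (GNU tar strips the leading / when creating the archive, so everything extracts under the scratch directory):

# list the contents without extracting anything
tar -tzvf /tmp/20250101-020000-pi-backup.gz
# extract into a scratch directory
mkdir -p /tmp/restore
tar -xzvf /tmp/20250101-020000-pi-backup.gz -C /tmp/restore
# the .backup file written by sqlite3 is itself a complete database,
# so restoring it is just a copy back into place
cp /tmp/restore/home/stuart/DS18B20/temperatures.db.backup /home/stuart/DS18B20/temperatures.db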