Bash backup script to Dropbox

To back up some data from a small VPS, I created a Bash script that compresses the files in /var/www and the MySQL databases and uploads the archives to Dropbox. It runs daily via a cron job and adds the date to each filename so it won't overwrite the previous day's backup.

It's quite cool, actually, and it works like a charm for small websites. To upload the files to Dropbox I'm using the Dropbox Uploader script. It's easy to set up and does the job well.
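
For reference, the setup looks roughly like this (a sketch; the clone location simply matches where I keep the script, so adjust the paths to taste):

git clone https://github.com/andreafabrizi/Dropbox-Uploader.git ~/bin/Dropbox-Uploader
chmod +x ~/bin/Dropbox-Uploader/dropbox_uploader.sh

# The first run asks for a Dropbox access token and saves it to ~/.dropbox_uploader
~/bin/Dropbox-Uploader/dropbox_uploader.sh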

Here's the Bash script that the cron job runs:

#!/bin/bash

function backupFiles()
{
  echo "Start backup files"

  if [ -d "/var/www/" ]; then
    # -C / keeps the paths inside the archive relative (var/www/...)
    tar -zcf "www-$(date '+%F').tar.gz" -C / var/www
  fi
}


function backupDb()
{
  echo "Start backup databases"

  mysqldump -u root -p****** --all-databases | gzip > "db-$(date '+%F').sql.gz"
}

function uploadBackups()
{
  echo "Start uploading"

  # The archives were written to the cron job's working directory, so the glob picks them up
  ~/bin/Dropbox-Uploader/dropbox_uploader.sh upload *.gz /
}

function deleteBackups()
{
  echo "Delete temporary files"

  rm *.gz
}

echo "$(date)"

backupFiles
backupDb
uploadBackups
deleteBackups

echo "$(tput setaf 2) Backup task has finished"

I don't write Bash scripts on a daily basis, so the script can probably be improved in many ways. For me it was essential to use gzip and to store the database backups separately from the file backups.
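
If I were to improve it, a first step might look like the sketch below: fail fast with set -euo pipefail, build the archives in a temporary directory so the *.gz glob only catches this run's files, and keep the MySQL credentials in ~/.my.cnf instead of in the script. This is only a sketch, not what runs on the VPS, and the ~/.my.cnf file with its [client] section is an assumption on my part:

#!/bin/bash
set -euo pipefail

# Build the archives in a throwaway directory and clean it up on exit
workdir="$(mktemp -d)"
trap 'rm -rf "$workdir"' EXIT
cd "$workdir"

suffix="$(date '+%F')"

tar -zcf "www-$suffix.tar.gz" -C / var/www

# Assumes ~/.my.cnf contains a [client] section with user and password,
# so no credentials have to live in the script itself
mysqldump --all-databases | gzip > "db-$suffix.sql.gz"

~/bin/Dropbox-Uploader/dropbox_uploader.sh upload ./*.gz /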

This is the cron job, which runs every day at 2 AM:

0 2 * * * /home/user/bin/backup/backup.sh > /home/user/bin/backup/cron-output.log 2>&1
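
To install it without opening an editor, something like this should work (just a sketch; crontab -e does the same thing interactively):

(crontab -l 2>/dev/null; echo "0 2 * * * /home/user/bin/backup/backup.sh > /home/user/bin/backup/cron-output.log 2>&1") | crontab -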

And the final result: [image]
