If you are looking for an easy, reliable, and free server backup solution for a LAMP server, a database server, or any other Linux server, you have come to the right place.
The server backup solution I have been using for a few years now is based on the duplicity project. The diagram below shows my current backup scheme, which ensures that I have a copy of the backup at both an onsite and an offsite location.
A few points to note:
- The Linux workstation is the computer that will be backed up.
- A local NFS server stores the onsite copy of the server backups.
- A copy of the backup is also stored on Amazon S3 for disaster recovery. Duplicity can back up directly to Amazon S3, Azure, Dropbox, Google Docs, Google Cloud Storage, OneDrive, and a few other providers. However, I prefer to have duplicity back up to the onsite location and then use s3cmd to sync the onsite backup to cloud storage, as sketched below.
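For reference, a direct-to-cloud duplicity run looks roughly like this. This is a minimal sketch, not the setup used in this tutorial: the bucket name and GPG key ID are placeholders, and the AWS credentials are read from the environment.

```
# Hypothetical direct-to-S3 backup; bucket name and key ID are placeholders
export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
duplicity --encrypt-key YOUR_GPG_KEY_ID /var/www s3+http://my_bucket_name/backup_subdir
```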
NOTE: This tutorial was created for the Ubuntu platform (14.04 to be specific), but it can easily be adapted for any other Linux distribution.
Let’s get started with the real stuff!
Preparing the Server Backup Environment
- Install the required packages to implement the solution.
```
sudo apt-get install s3cmd duplicity python-magic
```

- To make things easy, we will be using the bash wrapper for duplicity developed and maintained by zertrin (https://github.com/zertrin/duplicity-backup). Download and extract the duplicity-backup files to a local directory.
```
wget https://github.com/zertrin/duplicity-backup/archive/master.zip
unzip master.zip
```

- Unzipping the downloaded file extracts everything into the duplicity-backup-master directory. Copy the script and configuration file to the system directories.
```
cp duplicity-backup-master/duplicity-backup.sh /usr/bin
chmod +x /usr/bin/duplicity-backup.sh
cp duplicity-backup-master/duplicity-backup.conf.example /etc/duplicity-backup.conf
```

- Edit the configuration file /etc/duplicity-backup.conf according to your needs and setup. Below is the configuration file I am using.
```
#!/bin/bash
ROOT="/"
DEST="file:///backups/"
INCLIST=( "/home/webmaster/db" \
          "/etc/apache2" \
          "/var/www/" \
          "/var/lib/mysql" \
)
GPG_ENC_KEY="1L23SID2"
GPG_SIGN_KEY="IOQ1A2HL"
STATIC_OPTIONS="--full-if-older-than 14D"
CLEAN_UP_TYPE="remove-all-but-n-full"
CLEAN_UP_VARIABLE="4"
REMOVE_INCREMENTALS_OLDER_THAN="4"
LOGDIR="/home/webmaster/logs"
LOG_FILE="duplicity-`date +%Y-%m-%d_%H-%M`.txt"
LOG_FILE_OWNER="webmaster:webmaster"
REMOVE_LOGS_OLDER_THAN='30' # (days) uncomment to activate
VERBOSITY="-v3"
EMAIL_TO="webmaster@dhillonblog.com"
EMAIL_FROM="noreply@dhillonblog.com"
EMAIL_SUBJECT="Automated backup alerts - LAMP Server"
MAIL="mailx" # default command for Linux mail
```

- I have created the /backups directory, which is the mount point for an NFS share. You can use a USB mount point or even a secondary disk for this. A sketch of the NFS setup follows below.
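For completeness, here is roughly how the two prerequisites referenced in that config can be set up. The NFS server name and export path below are placeholders for your own environment, and the GPG key IDs in the config must match keys that actually exist on the machine.

```
# Hypothetical NFS mount for the /backups destination (server and export path are placeholders)
sudo mkdir -p /backups
echo "nfs-server.example.com:/export/backups /backups nfs defaults 0 0" | sudo tee -a /etc/fstab
sudo mount /backups

# Find the short key IDs to use for GPG_ENC_KEY and GPG_SIGN_KEY
gpg --list-keys
```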
MySQL Server Backup
Even though I back up the /var/lib/mysql directory, where the physical MySQL database files are stored, restoring a database from those physical files is not an easy process and is not the recommended way (especially if you have to restore a selective database on another server).
I use the following script to back up all the databases using the mysqldump utility, and I run it before running duplicity-backup.sh. Copy the following code and save it as /usr/bin/dbbackup.sh.
```
#!/bin/bash
MYSQL_BIN="$(which mysql)"
MYSQLDUMP_BIN="$(which mysqldump)"
BACKUP_DIR=/home/webmaster/db
MYSQL_USER=root
MYSQL_PASS=mypassword

# Make sure the backup directory exists
mkdir -p $BACKUP_DIR

# List all databases, skipping the header line and the internal schemas
databases=`${MYSQL_BIN} --user=${MYSQL_USER} --password=${MYSQL_PASS} -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema)"`

# Dump and compress each database into its own file
for db in $databases; do
    ${MYSQLDUMP_BIN} --force --opt --user=${MYSQL_USER} --password=${MYSQL_PASS} --databases $db | gzip -cq > "$BACKUP_DIR/$db.gz"
done
```
Set the executable permission on this script using the chmod command.
```
chmod +x /usr/bin/dbbackup.sh
```
The above script creates an individual backup file for each SQL database. You can replace the root user and password with another user that has read access to all the databases, as sketched below.
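As a sketch of that idea, a dedicated backup user could be created along these lines. The user name and password are placeholders, and the exact privilege list mysqldump needs can vary with your MySQL version and options.

```
# Hypothetical dedicated backup user; name and password are placeholders
mysql --user=root --password=mypassword -e "
  CREATE USER 'dbbackup'@'localhost' IDENTIFIED BY 'a_strong_password';
  GRANT SELECT, LOCK TABLES, SHOW VIEW, EVENT, TRIGGER ON *.* TO 'dbbackup'@'localhost';
  FLUSH PRIVILEGES;"
```

You would then point MYSQL_USER and MYSQL_PASS in /usr/bin/dbbackup.sh at this account instead of root.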
Syncing Backup files to Cloud Storage
As outlined at the start of this article, duplicity supports backing up directly to various cloud storage platforms, but in my backup plan duplicity backs up to a local NFS server and s3cmd then syncs that backup to cloud storage. This section outlines how to configure s3cmd to sync the local server backup files to the Amazon S3 platform.
- Run s3cmd in configuration mode to generate the config data that will be stored in .s3cfg.
```
s3cmd --configure
```
Follow the command-line wizard to provide your Amazon S3 credentials and bucket-related details. Once you have provided them, s3cmd runs a test to fetch the bucket list from Amazon S3. If the test is successful, you will be prompted to save the config file.
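If the target bucket does not exist yet, s3cmd can create it, and a quick listing confirms that the stored credentials work (the bucket name is a placeholder):

```
# Create the target bucket if it does not exist yet (name is a placeholder)
s3cmd mb s3://my_bucket_name
# Sanity check: list the buckets visible with the stored credentials
s3cmd ls
```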
- Once s3cmd is configured, perform a test/dry run to make sure the backup to cloud storage will work without any problems, using the following command.
```
s3cmd sync --dry-run --skip-existing --delete-removed /backups s3://my_bucket_name/backup_subdir/
```
Make sure you end the Amazon S3 bucket path with a /, otherwise you will get an error message.
Putting it all together
With all the components in place, let’s take a server backup manually and then work on automating/scheduling the backup.
- Take a backup of all the databases on the server.
```
/usr/bin/dbbackup.sh
```
This will create a backup file for each of your databases in the /home/webmaster/db directory.
- Now take a backup of the file system using duplicity.
```
sudo duplicity-backup.sh -c /etc/duplicity-backup.conf -f
```
This will take a full backup of the directories listed in the config file and store the backup in the /backups directory.
- Sync the backup to the Amazon S3 cloud using the following command.
```
s3cmd sync --skip-existing --delete-removed /backups s3://my_bucket_name/backup_subdir/
```
This command will skip any file that already exists in the Amazon S3 cloud. It will also automatically delete any backup files that have been removed from the local backup as per the duplicity configuration.
- You can validate the results by visiting the Amazon S3 bucket in your browser, or from the command line as shown below.
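For a command-line check instead of the browser, something along these lines works (assuming the same paths and bucket as above; duplicity may prompt for the GPG passphrase to read the encrypted backup metadata):

```
# Inspect the local duplicity backup chain
duplicity collection-status file:///backups

# List what actually landed in the S3 bucket
s3cmd ls s3://my_bucket_name/backup_subdir/
```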
Automating the Server Backups
Using a crontab editor, schedule the following jobs.
```
00 2 * * 1-6 dbbackup.sh && duplicity-backup.sh -c /etc/duplicity-backup.conf -b && s3cmd sync --skip-existing --delete-removed /backups s3://my_bucket_name/backup_subdir/
00 2 * * 7 dbbackup.sh && duplicity-backup.sh -c /etc/duplicity-backup.conf -f && s3cmd sync --skip-existing --delete-removed /backups s3://my_bucket_name/backup_subdir/
```
The first job is executed at 2 A.M. from Monday to Saturday and takes an incremental backup. The second job is executed every Sunday at 2 A.M. and takes a full backup. You can tweak the timings as per your requirements or backup policy.
If you have configured the mail section properly in /etc/duplicity-backup.conf, you should receive an email from your server every morning when duplicity takes the server backup.
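Finally, remember that a backup is only as good as your ability to restore it, so test your restores regularly. Below is a minimal sketch using plain duplicity against the local destination from the config above; the wrapper script also offers restore options of its own (see its --help). The restored file path is just an example.

```
# List the files contained in the latest backup (prompts for the GPG passphrase,
# or reads it from the PASSPHRASE environment variable)
duplicity list-current-files file:///backups

# Restore a single file to /tmp; the path after --file-to-restore is relative to ROOT
duplicity restore --file-to-restore var/www/index.html file:///backups /tmp/index.html
```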