Automated Server Backups

I came across a lot of weird server backup software with design and UX from the '90s, coupled with hundreds of options. So I decided to create this simple script to back up files and databases.

The Mission

  1. Weekly backups of folders and MySQL databases on an Ubuntu server
  2. Email notifications for finished backups
  3. An easy way to download the files

Steps:

  1. Access server and create script
  2. Define what and where to backup
  3. Automate with crontabs
  4. Send email notifications
  5. Download server backup

Prerequisites:

  1. SSH access
  2. tar, mysqldump, crontab and mail installed on the server
  3. Basic knowledge of working with a CLI
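
You can quickly check whether the required tools are available; on Ubuntu, mail is typically provided by the mailutils package (the package name is an assumption for Ubuntu/Debian):

$ which tar mysqldump crontab mail
$ sudo apt-get install mailutils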

1. Access server and create script

Let's log in to your server and create a script for the backups. To keep it simple we are using the root user. For a production system you should create a dedicated user with access only to those files and folders you want to include in the backup.
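
A minimal sketch of such a setup (the user name backupuser and the www-data group are assumptions; adjust them to your environment):

$ adduser backupuser
# grant read access to the web roots, e.g. via the www-data group
$ usermod -aG www-data backupuser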

Login with SSH

$ ssh root@111.222.333.444
root@111.222.333.444's password:

After successful login you might see:

Welcome to Ubuntu 16.04.2 LTS (GNU/Linux 4.4.0-109-generic x86_64)
 
 
root@host:~#
You are now in the home folder of the root user.

Create a new script file

$ vi backup.sh

To start editing mode in VIM press i. Your file should look like the following:

#!/bin/bash
echo Hello

Press Esc to leave editing mode, save the file with : + w + Enter and close with : + q + Enter. Then make the script executable and test it.

$ chmod +x backup.sh
$ ./backup.sh
Hello

If you get stuck in VIM, here are a few helpful commands:
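
  • Esc leave editing mode
  • :w + Enter save the file
  • :q + Enter quit
  • :wq + Enter save and quit
  • :q! + Enter quit without saving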

2. Define what and where to backup

The backup.sh of this repository looks larger than it is: 90% of it is echo statements that generate a useful email notification. Let's cut out all the non-essential parts and walk through the script.

What to Backup?

Define the paths to the folders and databases you want to include in the backup. Most of the time you want to back up a running application, so you should include the app sources and the database. Don't include the whole file system, or you will quickly run into disk space problems ;-)

# backup folders
backup_files="/var/www/my-website.de /var/www/wordpress /etc"
# backup databases
backup_databases="mywebsite wordpress"

We only include sources we really need in case of an emergency. We are using Git for our projects, so there is no need to include the source code in the server backups. But upload folders, databases, config files, PHP settings and SSL certificates are the important things to think of.
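
For example, a list for a typical PHP setup could look like this (all paths are assumptions and depend on your server):

backup_files="/var/www/wordpress/wp-content/uploads /etc/nginx /etc/php /etc/letsencrypt"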

Where to backup?

Define a destination folder for your backup files. The folder should be placed somewhere on your system where it is persisted. (If you don't use the root user, make sure you have access to that location, and that the user also has read access to the files you want to back up.)

# where to backup
dest="/mnt/backup"
# create the backup folder if it doesn't exist
mkdir -p $dest 
# create archive filenames 
day=$(date +%y-%m-%d) 
hostname=$(hostname -s) 
archive_file="$hostname-$day.tar" 
mysql_file="$hostname-mysql-$day.tar" 
# print start status message 
echo "Backing up $backup_files to $dest/$archive_file ..." 
echo "Backing up $backup_databases to $dest/$mysql_file ..."

Backup folder with tar

This is the magic command that actually backs up your files.

# backup the files using tar.
tar czvfP $dest/$archive_file $backup_files

Confused by czvfP?

  • -c create a new archive
  • -z filter the archive through gzip
  • -v verbosely list files processed
  • -f use archive file or device ARCHIVE
  • -P don't strip the leading / from file names (keep absolute paths)

Type $ man tar or visit the docs for more information.
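
Because the archive was created with -P, the stored paths are absolute, and restoring with -P writes the files straight back to their original locations. A hedged example of a full restore (careful, this overwrites the live files; the file name follows the pattern generated above):

$ tar xzvfP /mnt/backup/host-18-03-25.tar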

Backup MariaDB databases with mysqldump

# database dump in temp file
mysqldump --user root --routines --triggers --single-transaction --databases $backup_databases > "$dest/sql_dump.sql"
 
# pack the sql dump with tar and remove dump
tar czfP $dest/$mysql_file "$dest/sql_dump.sql" 
rm $dest/sql_dump.sql
 
# print end status message
echo "Backup SUCCESS"
 
# echo generated files
ls -lh $dest
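
The dump above assumes the MySQL root user can log in without a password (on Ubuntu this often works via the auth_socket plugin). If a password is required, don't hardcode it in the script; a common approach is a ~/.my.cnf options file, a minimal sketch:

# ~/.my.cnf (restrict with chmod 600)
[mysqldump]
user=root
password=your-password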

Save the file and test it with

$ ./backup.sh

If everything went well your backup files are now located in your defined $dest path.

$ ls -la /mnt/backup/

The full script is available in the repository linked below.

3. Automate with crontabs

With crontab we can automate the script execution. If you have never used crontab, just use one of my examples or read through the docs. In short: there is a file in which you place jobs line by line, and each job is executed according to its schedule parameters.

A job has the following structure:

* * * * * <command>
| | | | | |------------------ command to execute
| | | | |-------------------- day of the week (0-7), 0 and 7 are Sunday
| | | |---------------------- month of the year (1-12)
| | |------------------------ day of the month (1-31)
| |-------------------------- hour (0-23)
|---------------------------- minute (0-59)

Add a job

Use the command below to add or update a job. It opens the crontab file, where jobs can be added or updated.

$ crontab -e

Let's add a basic job which runs every 5 minutes.

*/5 * * * * /bin/sh backup.sh

Press ctrl + O + Enter to save the file and ctrl + X to close the editor (these are the shortcuts of nano, the default editor on Ubuntu). Relax for 5 minutes and see what happens ;)
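
On Ubuntu you can verify that cron actually executed the job by checking the syslog:

$ grep CRON /var/log/syslog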

List all crontabs

$ crontab -l

Crontab examples

Running a full backup every 5 minutes is probably not a good approach. Here are some examples you could use instead:

# every day at 3 a.m.
0 3 * * * /bin/sh backup.sh
 
# every day at 3 a.m. and 4 p.m
0 3,16 * * * /bin/sh backup.sh
 
# every sunday at 5 a.m.
0 5 * * 0 /bin/sh backup.sh
 
# every sunday and friday at 3 a.m.
0 3 * * sun,fri  /bin/sh backup.sh
 
# every 6 hours
0 */6 * * * /bin/sh backup.sh

Save log output in file

To store output in a log file:

*/1 * * * * /bin/sh backup.sh >>backup.log
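
Note that >> only captures stdout. To also log errors from the script, redirect stderr as well:

*/1 * * * * /bin/sh backup.sh >> backup.log 2>&1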

4. Send email notifications

If you want to be informed when a new backup is available, you can use mail to send yourself an email.

*/1 * * * * /bin/sh backup.sh | mail -s "NEW BACKUP - Your server" -a "Your server Backup Scheduler <backup@yourserver.de>" your@company.com

Make sure you are using the right option parameters for the mail version or implementation installed on your server.

# other mail setups use -r to set the sender
*/1 * * * * /bin/sh files/backup.sh | mail -s "[BACKUP] Your Server" -r "Your server Backup Scheduler <backup@yourserver.de>" your@company.com

5. Download backup

In our script we already gave a hint about how to download the files. You might give someone else access via an extra user, so he or she can download the files with an SFTP client or the CLI. Since the backups are stored under /mnt/backup, you can create a symlink from the user's home directory to that location.

# logged in with root@111.222.333.444
# will fail if symlink exists already
$ ln -s /mnt/backup backups 
 
# to create or update a symlink
$ ln -sf /mnt/backup backups

Log out of your server and try to download the files with one line.

# on your local machine
$ cd ~/Downloads
$ scp root@111.222.333.444:backups/host-mysql-18-03-25.tar backup

Unpack the downloaded file. Note that GNU tar strips the leading / on extraction by default, so the contents end up below your current directory:

$ tar -xvzf backup/host-mysql-18-03-25.tar

Enhancements

Delete old backup files

What to do with old backup files? You probably don't need them anymore, and if you run jobs on a daily basis you will hit the disk space limit soon. You could include an "old-file deleter" in your script. Let's say we want to delete all files which are older than 14 days.

# place at the end of backup.sh
find /mnt/backup -mtime +14 -type f -delete

  • /mnt/backup the folder to search in
  • -mtime +14 only files older than 14 days
  • -type f only files
  • -delete deletes the matches; remove it to test your find filter first (see the dry run below)
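
For example, a dry run that only lists what would be deleted:

$ find /mnt/backup -mtime +14 -type f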

Wrapping the script

Some web hosters only let you trigger jobs via a UI. If so, they won't accept a statement with ... | mail .... Wrap the whole command into a backup-cron.sh file:

/bin/sh backup.sh | mail -s "NEW BACKUP - Your server" -a "Your server Backup Scheduler <backup@yourserver.de>" your@company.com
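
Don't forget to make the wrapper executable so it can be triggered:

$ chmod +x backup-cron.sh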

In the webhoster's interface you can then just trigger the backup-cron.sh file.

Download alternatives

As long as a backup is still on the same machine, it isn't a real backup. So here are some approaches for downloading it. These commands could also be executed on a remote machine via crontab.

Download with rsync

# syncs everything from backups to local-backups
$ rsync -a -v root@111.222.333.444:backups/ local-backups/

Only download the latest file(s) with scp

# create server var
$ server=root@111.222.333.444
# folder on the server (the backups symlink in root's home)
$ from=backups
 
# folder to store backup
$ to=local-backups
 
# download only the latest 2 files (latest two, because it's the backup and the db export);
# assumes file names without spaces
$ for f in $(ssh $server "ls -t $from | head -2"); do scp $server:$from/$f $to; done

Download with SFTP

$ sftp root@111.222.333.444
get -r backups local-backups

Thoughts:

  • Currently we generate the backup file names from the current date only. If you run the script more than once a day, the previous file will be overwritten; a sketch of a fix follows below.
  • What would this script look like for a windows server backup?
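
A sketch of a fix for the first point: include the time in the generated file names, so multiple runs per day produce unique archives.

# in backup.sh, instead of day=$(date +%y-%m-%d)
day=$(date +%y-%m-%d-%H%M)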

Links:

Full script available: https://github.com/zauberware/automated-server-backups
