After migrating to a new server host, I set up a reminder in my reminders app to create a proper backup strategy. It nagged me every day, but I kept bumping it to some later date, as it's not exactly the most fulfilling task. I wrote about doing backups via Duplicity / Duply before, but that setup was always more complicated than it should have been, especially as it involved GPG for the encryption.

Today I finally sat down to get this done before something bad happens to my data. After doing some research I had to decide between restic and Borg. Both looked very promising, but in the end I settled on restic as it's written in Go rather than Python like Borg.

Goal

Scheduled backups from my server hosted at OVH to my Synology NAS running in my local network at home.

Setup and preparations

Install restic

I don't really need to explain much about how to do that, as it's all well explained in other popular guides like Jake Jarvis' blog post "Automatically Backup a Linux VPS to a Separate Cloud Storage Service". There's also very easy-to-follow official documentation, which I mostly followed for my setup.

Just install it based on the instructions for your operating system. Run restic version to verify the installation and check that you're on the latest version, then continue with the next step.

SSH configuration

You have to make sure you have a working public-key-authenticated SSH connection between your backup source and backup target.

In my case I had to create an SSH key on my server and then copy it to my NAS. This can be done with these two commands:

ssh-keygen -t ed25519 -o -a 100
ssh-copy-id -i ~/.ssh/id_ed25519.pub -p 12345 username@username.synology.me

If the SSH server on the target is running on a non-standard SSH port, make sure to set up a ~/.ssh/config file with all these parameters, as you can't set them in the restic backup command later on. I run the backup as root, so all of this is done while logged in as root. If you'd rather not run restic as root, there's a section in the documentation explaining how to avoid that.

The SSH config file could look like this; the ServerAliveInterval and ServerAliveCountMax parameters were suggested in the restic forum:

Host username.synology.me
        HostName username.synology.me
        User username
        Port 12345
        ServerAliveInterval 60
        ServerAliveCountMax 240

If you type ssh username.synology.me, it should connect and log you into your NAS without asking for a password. This has to work before moving on to the next step.
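To verify this non-interactively (which is what cron will need later), a quick sketch using BatchMode, which makes SSH fail instead of prompting for a password; the hostname is the placeholder from the config above:

```shell
# BatchMode makes ssh fail immediately instead of falling back to a
# password prompt, so this only succeeds if key authentication works.
ssh -o BatchMode=yes -o ConnectTimeout=10 username.synology.me true \
  && echo "key-based SSH login works" \
  || echo "key-based SSH login failed"
```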

Create repository (a.k.a. backup target)

This sets up the directory where the backups are going to be stored. In my case I'm backing up over SFTP to my NAS, so make sure you can log into your backup target via SSH / SFTP if you use this strategy. There are, of course, many other backup targets you can use (S3, Backblaze, DigitalOcean etc.). I tried Google Cloud Storage on my first attempt but couldn't get it to work; you might have more patience than me.

With this command we create the backup repository on the remote host via SFTP:

restic -r sftp:username@username.synology.me:/backup-remote/notmyhostname-2019 init --verbose

After running this you can double-check by logging into your NAS and making sure the directory was created. It should contain a config file and the standard directories that restic creates.

dewey@alexandria:/volume1/Archive/backup-remote/notmyhostname-2019$ ls -lah
total 4.0K
drwxrwxrwx+ 1 dewey users   66 Jul 18 21:38 .
drwxrwxrwx+ 1 dewey users   98 Jul 18 20:53 ..
-rw-------  1 dewey users  155 Jul 18 21:31 config
drwx------  1 dewey users 1.0K Jul 18 21:29 data
drwx------  1 dewey users  256 Jul 18 22:18 index
drwx------  1 dewey users  128 Jul 18 21:31 keys
drwx------  1 dewey users    0 Jul 18 22:18 locks
drwx------  1 dewey users  384 Jul 18 22:18 snapshots

Warning for Synology users: If you look closely you'll see that there's a mismatch between the path on the NAS in the ls -lah output and the path where the restic command is supposed to create the repository:

/backup-remote/notmyhostname-2019 vs. /volume1/Archive/backup-remote/notmyhostname-2019. You'd think the init command would need to use the full path:

restic -r sftp:username@username.synology.me:/volume1/Archive/backup-remote/notmyhostname-2019 init --verbose

That's what I thought too, but as I later found out and confirmed via the official restic forum (which was very helpful and active), this is a "feature" of Synology where the root directory of an SFTP user is actually the user's home directory. So what would be / for an SSH user is actually /volume1/Archive/ for an SFTP user. I also answered that question in my thread on the forum for other Synology users.
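To make the mapping concrete, here's a tiny sketch using the placeholder paths from this post (the variable names are mine, not anything Synology-specific):

```shell
# Synology chroots SFTP users into their home share, so the path restic
# uses over SFTP is resolved relative to a real directory on the volume.
SFTP_ROOT="/volume1/Archive"                   # what "/" means for the SFTP user
REPO_PATH="/backup-remote/notmyhostname-2019"  # the path given to restic -r sftp:...
echo "${SFTP_ROOT}${REPO_PATH}"                # where the repository actually lives
```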

Define includes and excludes

We need to define which files to include in or exclude from the backup. I prefer providing both an inclusion and an exclusion file, which makes it very explicit which files are supposed to be in the backup.

I created two files for that purpose:

root@notmyhostname:~/.config/restic# cat includes
/etc
/home/ubuntu
root@notmyhostname:~/.config/restic# cat excludes
/home/ubuntu/services/**/deluge-data
/home/ubuntu/.cache

How the rules work is all defined in the documentation.
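If you want to create these two files in a scriptable way, a small sketch (the paths match the examples above; CONFIG_DIR is just a convenience variable I'm introducing here):

```shell
# Write the include/exclude lists used by restic.
# CONFIG_DIR defaults to the location used in this post; override it if
# you keep your restic config elsewhere.
CONFIG_DIR="${CONFIG_DIR:-/root/.config/restic}"
mkdir -p "$CONFIG_DIR"

cat > "$CONFIG_DIR/includes" <<'EOF'
/etc
/home/ubuntu
EOF

cat > "$CONFIG_DIR/excludes" <<'EOF'
/home/ubuntu/services/**/deluge-data
/home/ubuntu/.cache
EOF
```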

First backup

Every time you run a restic command you have to provide the repository path and the password, either via environment variables or command-line flags. To make this less annoying, I'd suggest creating a file like backup.sh that exports the variables before running the actual command.

After creating the file with the following content, make it executable with chmod +x backup.sh and run it with ./backup.sh.

#!/bin/bash

export RESTIC_REPOSITORY="sftp:username@username.synology.me:/backup-remote/notmyhostname-2019"
export RESTIC_PASSWORD="changeme"


restic backup --verbose --files-from /root/.config/restic/includes --exclude-file=/root/.config/restic/excludes

This will create your first backup. If everything works, the only thing left to do is to run it via a cron job.

Automate

Add this to your /etc/crontab file and your backup will run at the given interval. I'm logging errors to a file, but you can also use a script to send an email, a push notification or whatever you prefer.

0 */12  * * *   root    /root/backup.sh 2>> /var/log/restic.log

Prune old backups

As we don't want to keep the entire backup history forever, we can clean up old backups after a while. The easiest way is to add the forget command at the end of your backup.sh:

restic forget --verbose --prune --keep-hourly 6 --keep-daily 7 --keep-weekly 4 --keep-monthly 12
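If you're unsure what a retention policy would actually delete, restic can show you first: forget supports a --dry-run flag. This sketch assumes the same RESTIC_REPOSITORY and RESTIC_PASSWORD variables exported in backup.sh:

```shell
# Preview which snapshots the retention policy would remove, without
# deleting anything. Assumes RESTIC_REPOSITORY and RESTIC_PASSWORD are
# exported as in backup.sh.
restic forget --dry-run --keep-hourly 6 --keep-daily 7 \
  --keep-weekly 4 --keep-monthly 12
```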

This should be all that's needed. To make sure everything works, perform a test restore of some example files.
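A test restore could look like the following sketch, restoring /etc from the latest snapshot into a throwaway directory (/tmp/restic-restore-test is just an example path; the environment variables from backup.sh are assumed):

```shell
# Restore /etc from the most recent snapshot into a scratch directory
# and spot-check one file against the live system.
# Assumes RESTIC_REPOSITORY and RESTIC_PASSWORD are exported as in backup.sh.
restic restore latest --target /tmp/restic-restore-test --include /etc
diff /etc/hostname /tmp/restic-restore-test/etc/hostname && echo "restore looks good"
```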

Check successful backups

By running the restic snapshots command you can double-check the backup history. As we can see here, the backups are created every 12 hours, just as we defined in the cron job.

root@notmyhostname:~# RESTIC_REPOSITORY="sftp:username@username.synology.me:/backup-remote/notmyhostname-2019" RESTIC_PASSWORD="changeme" restic snapshots
repository 070c204c opened successfully, password is correct
ID        Time                 Host           Tags        Paths
----------------------------------------------------------------------
88eb6cda  2019-07-18 19:36:01  notmyhostname              /etc

1e0dda19  2019-07-18 20:17:47  notmyhostname              /etc
                                                          /home/ubuntu

988c8b69  2019-07-18 21:37:22  notmyhostname              /etc
                                                          /home/ubuntu

c713b4ad  2019-07-19 00:00:02  notmyhostname              /etc
                                                          /home/ubuntu

651853b0  2019-07-19 12:00:01  notmyhostname              /etc
                                                          /home/ubuntu

56571ab6  2019-07-20 00:00:01  notmyhostname              /etc
                                                          /home/ubuntu

5a4b7500  2019-07-20 12:00:01  notmyhostname              /etc
                                                          /home/ubuntu
----------------------------------------------------------------------
7 snapshots
root@notmyhostname:~#

If you have any questions or suggestions, let me know via Twitter.