Managing Encrypted Backups in Linux: Part 1
Encrypted backups are great, but what if something goes wrong and you can't read your encrypted files? In this two-part series, I’ll show how to use rsync and duplicity as your belt-and-suspenders protection against data loss. Part 1 shows how to create and automate simple backups. In part 2, I'll go into more details on file selection and backing up encryption keys.
My personal backup plan uses both encrypted and unencrypted backups, because my paranoia extends to worrying about broken encryption. If something goes wrong, like a corrupted file system, good luck recovering encrypted data.
I want to always have access to my files. My risk assessment is pretty simple: The most likely cause for losing access is file system corruption or hardware failure. Other possible, but less likely, hazards are fire or theft. I don't need to encrypt files on my main PC (though in a future installment, I'll look at backup options for encrypted volumes). My most important files are encrypted and uploaded to remote servers. I'm stuck with capped mobile broadband, so I can't just encrypt and stuff everything into a remote server. So my backup scheme looks like this:
- Nightly unencrypted dump of everything to a portable hard drive.
- Nightly selective encrypted dump to a remote server.
- Continuous encrypted upload to SpiderOak of my most important files.
- Weekly rotation of unencrypted drives to my bank safe deposit box.
I rotate two unencrypted portable hard drives to my safe deposit box, so if my house ever burns down I'll lose, at most, a week's worth of files. Sometimes I dream of losing the whole lot; what do I need all that junk for? But, I save it anyway.
Simple Unencrypted Backups
Simple unencrypted backups are easy with good old rsync. Hard drives are huge and cheap, so it is feasible to back up everything on my PC every night. I use an rsync exclude file to avoid copying crud I know I'll never need, such as some dotfiles and certain directories. This is a brief example of an exclude file:
.adobe/
.dbus/
.macromedia/
.Xauthority
.xsession-errors
downloads/
Videos/
This command performs the backup -- remember here to mind your trailing slashes. A trailing slash on the source directory copies only the contents of the directory, and not the directory. No trailing slash copies the directory and its contents. The file paths in your exclude file are relative to the directory you are copying:
$ rsync -av -e "ssh -i /home/carla/.ssh/backup_rsa" --exclude-from=exclude.txt \
  /home/carla/ carla@backup:/home/carla/
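You can try the exclude and trailing-slash behavior safely before touching your real data. This throwaway demo (all file and directory names here are invented for illustration) copies a source tree into a scratch destination and shows that the excluded directories are skipped:

```shell
# Create scratch source and destination directories
src=$(mktemp -d)
dst=$(mktemp -d)
excl=$(mktemp)

# Populate the source with a file to keep and two directories to exclude
mkdir -p "$src/downloads" "$src/.adobe"
touch "$src/keepme.txt" "$src/downloads/big.iso" "$src/.adobe/cache"

# Exclude patterns are relative to the directory being copied
printf '%s\n' '.adobe/' 'downloads/' > "$excl"

# Trailing slash on $src/ copies its contents, not the directory itself
rsync -av --exclude-from="$excl" "$src/" "$dst/"

ls "$dst"    # only keepme.txt should appear
```

The same pattern scales up to the real backup command; only the source, destination, and exclude file change.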
I use passphrase-less SSH key authentication to log in to my remote server. Key authentication is not vulnerable to brute-force password attacks, and a key without a passphrase lets the backup script run unattended from cron.
The rsync command goes into a script, ~/bin/nightly-plain:
#!/bin/bash
rsync -av -e "ssh -i /home/carla/.ssh/backup_rsa" --exclude-from=exclude.txt \
  /home/carla/ carla@backup:/home/carla/
Remember to make it executable and limit read-write permissions to you only:
$ chmod 0700 nightly-plain
I added ~/bin/ permanently to my path by adding these lines to my ~/.profile:
# set PATH so it includes user's private bin if it exists
if [ -d "$HOME/bin" ] ; then
    PATH="$HOME/bin:$PATH"
fi
Putting a directory in your path means you can call scripts in that directory without having to spell out the full path. Create your personal cron job like this example, which runs every night at 11:05 PM:
$ crontab -e
05 23 * * * nightly-plain
Encrypted Backups with duplicity
duplicity goes to work 30 minutes later. Of course, you can adjust this interval to fit your own setup.
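With the 30-minute offset, the two jobs can sit side by side in one crontab. A sketch, where nightly-encrypted is a hypothetical wrapper script around the duplicity command covered below:

```
# Plain rsync backup at 11:05 PM, encrypted duplicity backup at 11:35 PM
05 23 * * * nightly-plain
35 23 * * * nightly-encrypted
```

The offset just keeps the two jobs from competing for disk and network bandwidth; adjust the times to fit your own setup.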
Ubuntu users should install duplicity from the duplicity PPA, because the version in main is old and buggy. You also need python-paramiko so you can use SCP to copy your files.
duplicity uses GPG keys, so you must create a GPG key:
$ gpg --gen-key
It is OK to accept the defaults. When you are prompted to enter your name, email address, and comment, use the comment field to give your key a useful label, such as "nightly encrypted backups." Write down your passphrase, because you will need it to restore and decrypt your files. The worst part of creating a GPG key is generating enough entropy while it is building your key. The usual way is to wiggle your mouse for a couple of minutes. An alternative is to install rng-tools to create your entropy. After installing rng-tools, open a terminal and run this command to create entropy without having to sit and wiggle your mouse:
$ sudo rngd -f -r /dev/random
Now create your GPG key in a second terminal window. When it is finished, go back to the rngd window and stop it with Ctrl+C. Return to your GPG window and view your keys with gpg --list-keys:
$ gpg --list-keys
pub   2048R/30BFE75D 2016-07-12
uid                  Carla Schroder (nightly encrypted backups) <firstname.lastname@example.org>
sub   2048R/6DFAE9E8 2016-07-12
Now you can make a trial duplicity run. This example encrypts and copies a single directory to a server on my LAN. Note the SCP syntax for the target directory; the example remote somefiles directory is /home/carla/somefiles. SCP and SSH paths are relative to the user's home directory, so you don't need to spell out the full path; if you do, duplicity will create a new directory at that path instead. Use the second part of the pub key ID, the hex string after the slash, to specify which GPG key to use:
$ duplicity --encrypt-key 30BFE75D /home/carla/somefiles \ scp://carla@backupserver/somefiles
A successful run shows a bunch of backup statistics. You can view a file list of your remote files:
$ duplicity list-current-files \ scp://carla@backupserver/somefiles
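Beyond listing files, duplicity also has a verify command, which compares the remote archive against your local files and reports any differences without restoring anything. A sketch using the same example server and paths as above (duplicity needs your passphrase to decrypt the archive for comparison):

```
$ duplicity verify scp://carla@backupserver/somefiles /home/carla/somefiles
```

A clean run ends with "Verify complete" and a count of files compared; differences are listed file by file.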
Test your ability to restore and decrypt your files by reversing the source and target directories. You need your passphrase. This example decrypts and downloads the backups to the current directory:
$ PASSPHRASE="password" duplicity \ scp://carla@backupserver/somefiles .
Or, restore a single file, which in this example is logfile. The file's path is relative to the target URL, and the directory that you restore it to does not have to exist:
$ PASSPHRASE="password" duplicity --file-to-restore logfile \ scp://carla@backupserver/somefiles logfiledir/
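Because duplicity keeps incremental history, you can also restore a file as it existed at an earlier point with the -t (--time) option. A sketch reusing the example paths, where 3D means three days ago:

```
$ PASSPHRASE="password" duplicity -t 3D --file-to-restore logfile \
  scp://carla@backupserver/somefiles logfiledir/
```

The -t option also accepts dates such as 2016-07-12, so you can pull a file from any backup still in your archive.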
If you're encrypting and backing up a single directory like the above example, you can put your duplicity command in a script, and put the script in a cron job. In part 2, I'll show you how to fine-tune your file selection.
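Such a script might look like this hypothetical ~/bin/nightly-encrypted, reusing the example key ID, paths, and hostname from above. Because backups only need your public key for encryption, no passphrase is required at backup time; you need it only to restore:

```
#!/bin/bash
# Sketch of a nightly encrypted backup script -- adjust the key ID,
# source directory, and server to your own setup.
duplicity --encrypt-key 30BFE75D /home/carla/somefiles \
    scp://carla@backupserver/somefiles
```

As with nightly-plain, make it executable with chmod 0700 before adding it to your crontab.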
SpiderOak for Continual Encrypted Backups
I use SpiderOak to encrypt and upload my most important files as I work on them. This has saved me many times from power outages and fat-fingered delete escapades. SpiderOak provides zero-knowledge offsite encrypted file storage, which means that if you lose your encryption key, you lose access to your files, and SpiderOak cannot help you. Any vendor that can recover your files can also snoop on them, get hacked, or hand them over to law enforcement, so zero-knowledge is your strongest protection.
Come back for part 2 to learn more about file selection and backing up your keys.
Learn more skills for sysadmins in the Essentials of System Administration course from The Linux Foundation.