Setting up Duplicity with GnuPG

I really enjoy the functionality of Duplicity. To install it on CentOS:
sudo yum install duplicity
If you get "No package duplicity available.", you need to enable the EPEL repository first (on CentOS 6, install the epel-release package for your release). Then try yum again.
To make a key using GnuPG:
gpg --gen-key
The defaults are fine. When the key is complete, make sure you note the key ID shown on the pub line (12345678 in the example below), because you'll pass it to duplicity:
gpg: checking the trustdb
gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
gpg: depth: 0 valid: 1 signed: 0 trust: 0-, 0q, 0n, 0m, 0f, 1u
pub 2048R/12345678 2012-01-26
.....
You might need to export the key if another user will use it. In my case, I had to create the keys with one user but another user would execute the backups.
gpg --output secret --export-secret-keys [key]
gpg --output public --export [key]
Then the other user needs to:
gpg --import /path/to/secret
gpg --import /path/to/public
You can verify the keys are there by:
gpg --list-keys
If when using the key you get these errors:
gpg: : There is no assurance this key belongs to the named user
gpg: [stdin]: sign+encrypt failed: Unusable public key
You should (as the user experiencing this error):
gpg --edit-key [key]
> trust
// decide how much to trust it
> save
Now, to actually use duplicity: it will most likely be run from cron, so a shell script works nicely. I like the way Justin Hartman did it, so there's really no need to re-invent what he did. Just ignore the AWS parts if you're not backing up there.
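As a starting point, a cron-able backup script might look something like the sketch below. Everything here is a placeholder assumption – the key ID, passphrase, source directory, and remote host are all hypothetical and need to be replaced with your own values:

```shell
#!/bin/sh
# Hypothetical nightly backup script for cron; key ID, passphrase,
# paths, and remote host below are all placeholders.
set -u

GPG_KEY="12345678"                                  # key ID from the pub line
SRC="/home/user/Documents"                          # directory to back up
DEST="scp://user@example.com//home/user/backup"     # hypothetical remote target

# duplicity reads the key passphrase from this environment variable.
export PASSPHRASE="your-gpg-passphrase"

run_backup() {
    # Incremental backup, forcing a fresh full backup every 30 days,
    # then prune everything but the last two full backup chains.
    duplicity --full-if-older-than 30D --encrypt-key "$GPG_KEY" "$SRC" "$DEST" &&
    duplicity remove-all-but-n-full 2 --force --encrypt-key "$GPG_KEY" "$DEST"
}

# Only run if duplicity is actually installed on this machine.
if command -v duplicity >/dev/null 2>&1; then
    run_backup
fi

unset PASSPHRASE
```

Dropped into /etc/cron.daily/ (or wired up in the backup user's crontab), this gives you unattended encrypted backups with automatic rotation.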

Recently, we covered syncing files to Amazon S3 using the S3 Tools. This time around, we're going to take a look at another handy tool for making backups, Duplicity. While Duplicity supports S3, it also supports a number of other services that S3 Tools do not. If the command line seems like a hassle, don't worry: the Deja-Dup front-end works really well for simple backups.

The previous tutorial, on syncing files to S3 with S3 Tools, covered what you need to know about getting files to and from S3. In general, I'd recommend sticking with S3 Tools for S3, so here I'll stick with other sync methods for Duplicity.

What other sync methods, you ask? (I was hoping you'd ask that.) Duplicity supports syncing files locally between filesystems, SSH/SCP, rsync, FTP, WebDAV, Tahoe-LAFS, and Amazon S3 to name just a few. So if you prefer to sync to a remote server that you control, then Duplicity is a great choice.

Getting started with Duplicity is easy. If you're using a major Linux distribution like Ubuntu, Linux Mint, Debian, Fedora, and so forth, you should be able to find a pre-compiled package for Duplicity in the repositories.

Using Duplicity

One of Duplicity's features is that it can encrypt backups. In fact, that's the default. You'll need a Gnu Privacy Guard (GPG) key to use encryption. To create a key, use gpg --gen-key, and follow the prompts. Note that you'll probably need to bash on the keyboard a bit to "create entropy" (haven't you always wanted to create entropy?).

If you don't want encryption, you can use the --no-encryption option.
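For example, an unencrypted backup to a locally mounted drive might look like this; the destination path /mnt/backup/documents is a hypothetical placeholder:

```shell
# Unencrypted backup of ~/Documents to a local directory.
# The destination path is a placeholder; adjust it to taste.
CMD="duplicity --no-encryption $HOME/Documents file:///mnt/backup/documents"

# Only run the command if duplicity is actually installed.
if command -v duplicity >/dev/null 2>&1; then
    $CMD
fi
```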

Let's look at how to run a simple backup. To make a backup of your Documents directory (under /home/user/Documents) to a remote server, you'd use:

duplicity ~/Documents scp://user@example.com//home/user/backup

By default, Duplicity will perform a full backup the first time it runs. If you run it again, Duplicity will then do an incremental backup, transferring only what has changed. The scp prefix indicates that you'll be using SCP. If you want to use a different mode, change the prefix. For instance, if you want to use FTP, you'd use ftp:// and so forth.
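If you ever want to force a new full backup rather than an incremental one, duplicity accepts a full action before the source and target. The remote host here is a placeholder:

```shell
# Force a full backup even if an existing backup chain would
# normally trigger an incremental one. The scp target is a placeholder.
CMD="duplicity full $HOME/Documents scp://user@example.com//home/user/backup"

# Only run the command if duplicity is actually installed.
if command -v duplicity >/dev/null 2>&1; then
    $CMD
fi
```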

To restore a backup, you'll want to use the same syntax, but reverse the order. The remote server is first, and the local directory is second - like so:

duplicity scp://user@example.com//home/user/backup ~/new-folder

In most cases you'll want to restore into a new directory rather than over the top of your original files. If you know the file you want to restore – rather than restoring an entire backup – then you can use the --file-to-restore option. Of course, this depends on actually knowing the name of the file you wish to restore.
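A single-file restore might look like the sketch below. Note that the path you give --file-to-restore is relative to the directory you backed up; the file name notes.txt and the remote host are hypothetical:

```shell
# Restore one file from the backup set into ~/restored-notes.txt.
# The relative path notes.txt and the scp target are placeholders.
CMD="duplicity --file-to-restore notes.txt \
  scp://user@example.com//home/user/backup $HOME/restored-notes.txt"

# Only run the command if duplicity is actually installed.
if command -v duplicity >/dev/null 2>&1; then
    $CMD
fi
```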

Another handy option with Duplicity is the ability to use the verify command to ensure that a backup succeeded and that the local and remote files are the same:

duplicity verify scp://user@example.com//home/user/backup ~/Documents

Note that if you used the --no-encryption option with the backups, you also need to use it with the verify command. Yes, Duplicity should just detect that the remote backup isn't encrypted, but it doesn't.

See the duplicity man page for more on its options and fine-tuning its use. There's a lot you can do with it!

Deja-Dup

As I mentioned, you can also use Deja-Dup as a front-end for Duplicity and it will pretty much take all the complexity out of using Duplicity and backing up files.

The most recent version of Deja-Dup supports SSH, Ubuntu One, WebDAV, Windows shares, and some others. It will also support Amazon S3, if you have the python-boto package installed (at least on Linux Mint; your mileage may vary on Fedora, etc.).

Deja-Dup appears simply as "Backup" on Linux Mint, so you might be surprised that it's not so easy to find once installed. Deja-Dup is pretty easy to configure. It has four tabs: the overview, storage options, folders to back up, and schedule. There are really no surprises.

The problem with Deja-Dup, though, is that it's simple but not very flexible. Scheduling, for example, allows only for daily, weekly, every two weeks, or monthly backups. The options for how long to keep backups are likewise pretty inflexible. But it's well-suited for desktop backups. If you need to do more frequent backups, or irregular backups (like twice a week, but not daily), then you might want to turn to Duplicity itself and write a script to do the backups you need.

Whatever you use – S3 Tools, Duplicity, Deja-Dup, or another tool – the important thing is to have backups and check them often. Happy backups!
