Cooking with Linux

Mirror, Mirror, of It All

Marcel Gagné

Issue #114, October 2003

Make simple backups and keep every copy of your Web or FTP site up to date with some standard tools that probably are already on your system.

François, what are you doing? When I asked you to mirror our Web sites, I did not mean that you should hold a mirror up to the screen. You can be very silly, mon ami. What I meant was that you should make a copy of our Web sites onto that other machine. François, what are you looking at? Ah, our guests have arrived! Why did you not tell me? Welcome, mes amis, to Chez Marcel, home of fine Linux fare and exceptional wines.

Speaking of wine, François! To the wine cellar, immédiatement. Please bring back the 1999 California Stag's Leap District Cabernet Sauvignon. This bold, smooth wine is the perfect mirror to today's menu. As you know, mes amis, the theme of this issue is system administration. On today's menu, we are going to sample a number of alternatives for mirroring data. The reasons for mirroring data are many. The obvious first reason is the not altogether sexy but extremely important subject of backups. Other reasons include creating mirrors of FTP sites for local network updates, such as your own RPM update repository, or mirroring Web sites for fast, off-line reading.

Many people who do regular backups are doing them to a disk on one of their other machines. Still others back up to a second disk on the same machine. Given that an extra hard drive added to a system is extremely inexpensive these days, while high-capacity tape drives can cost substantially more, it isn't that unusual to find this kind of solution in use.

Backing up from one disk to the other, or creating a mirror of your data, can be as simple as doing a recursive copy using cp. For instance, if I wanted to copy everything in my home directory to a second disk with a lot of space, I might do the following:

cp -rfupv /home/mgagne /disk2/

As you probably expect, the -r option indicates a recursive copy (all the subdirectories), and the -v tells the command to be verbose. Because I don't want to be stopped by any file that refuses to be overwritten, I add -f to force the copy; the -p ensures that permissions, ownership and timestamps are preserved as well. Finally, the -u option tells the cp command to copy a file only when it is newer than its counterpart on the destination or when the destination copy is missing altogether. This speeds up the process considerably on subsequent runs.
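If you want to reassure yourself that the mirror really is complete, a quick recursive comparison of the two trees does the trick. Following the example above, this command stays silent when the copy is up to date:

diff -r /home/mgagne /disk2/mgagne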

It all works very well, but copying from machine to machine requires a few extra steps. With your Linux system, you actually have a lot of tools at your disposal beyond the humble cp. For starters, if you want to copy or back up an entire Web site, try the wget command, originally written by Hrvoje Niksic:

wget -m http://www.websitename.dom

Starting at the top of your chosen Web site, wget walks through the entire site, saving all appropriate HTML files and images. The -m in this case means mirror, but it also encompasses several other options, specifically -r, -N, -l inf and -nr. These options tell wget to do a recursive fetch, turn on timestamps, allow for an infinite number of levels and not to remove the FTP directory .listing files, respectively.
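Should you ever want to adjust one of those behaviors individually, you can spell the options out yourself. Using the short forms listed above, this command is equivalent to the -m shorthand:

wget -r -N -l inf -nr http://www.websitename.dom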

All files on the Web site are saved in a local directory with the same name as the Web site. In the example above, that would be www.websitename.dom. Add a new file to your Web server, run the command again and only that new file is transferred, thus making the job of keeping things up to date that much faster.

wget is a great tool for its intended purpose, but that purpose is primarily Web sites. It is possible, however, to use wget to download from FTP servers as well. If you are transferring from anonymous sites, the format is almost identical to the one used to mirror a Web site:

wget -m ftp://ftp.ftpsitename.dom

If, on the other hand, you want to back up a user directory where a user name and password are required, you need to be a little fancier:

wget -m ftp://username:password@ftp.sitename.dom
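As an aside, if you would rather not have that password sitting in your shell history, wget also can pick up FTP credentials from a ~/.netrc file in your home directory. Here is a minimal sketch, reusing the placeholder names from above; be sure to make the file readable only by you with chmod 600 ~/.netrc:

machine ftp.sitename.dom
login username
password password

With that in place, you can leave the user name and password out of the URL entirely.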

Either way, this approach has a couple of downsides. First, your password still travels across the network in plain text, which may or may not be a big deal, depending on how much you trust your network. In a pinch, you could do a recursive secure copy with the scp command instead. Because scp is part of OpenSSH, you have the advantage of knowing that you are using secure, encrypted file transfers. Pretend that you want to copy your whole Web site, starting from the Apache server root. It would look something like this:

scp -rpv /var/www root@remote_host:/mnt/backupdir

The -r indicates a recursive copy, and the -p tells scp to preserve modification times, access times and modes (permissions) from the original files and directories. If you are transferring large amounts of data, you might consider using the -C option, which does compression on the fly. It can make a substantial difference in throughput.
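For instance, the same transfer with compression enabled is simply a matter of folding in the extra flag:

scp -rpvC /var/www root@remote_host:/mnt/backupdir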

Possibly the biggest problem with all these methods of mirroring data is that they can take a great deal of time. wget will download new files from an FTP server, but there is no option to keep a directory entirely in sync by deleting local files that have disappeared from the server; that's the second downside I promised you. Secure copy is nice, but it has no mechanism for transferring only the files that have changed. Making sure the data stays in sync, without transferring every single file and directory, requires a program with a bit more finesse.

The best program I know for this is probably Andrew Tridgell's rsync. Linux Journal's own Mick Bauer did a superb job of covering this package in the March and April 2003 issues of this fine magazine, so I won't go over it again other than to say you might want to look up his two-parter on the subject.
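I won't steal Mick's thunder, but just to give you the flavor of it, here is a minimal sketch along the lines of the scp example above, with the same placeholder host and paths:

# -a preserves permissions, ownership and times; --delete removes files on
# the destination that have vanished from the source, keeping the mirror in
# true sync; -e ssh carries the whole transfer over an encrypted channel.
rsync -av --delete -e ssh /var/www/ root@remote_host:/mnt/backupdir/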

ftpcopy

In many cases, that leaves us with our old friend, FTP. Well, sort of. On one side (the machine you want to mirror), you run your FTP server, whether it's ProFTPD or wu-ftpd. On the other side, you use Uwe Ohse's ftpcopy program. ftpcopy is a fast, easy-to-set-up and easy-to-use program that does a nice job of copying entire directory hierarchies, maintaining permissions and modification times as it goes. Furthermore, it keeps track of files that already have been downloaded, so the next time you run ftpcopy, it transfers only those files that have changed, making your backup faster still.

Some distributions come with ftpcopy, but for the latest version of ftpcopy, go to www.ohse.de/uwe/ftpcopy/ftpcopy.html to pick up the download. Building the package is easy and takes only a few steps:

tar -xzvf ftpcopy-0.6.2.tar.gz
cd web/ftpcopy-0.6.2
make

In the directory called command, you'll find three binaries: ftpcopy, ftpcp and ftpls. You can run them from here or copy the three files to /usr/local/bin or somewhere else in your $PATH.
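For instance, putting them somewhere central is a one-liner; /usr/local/bin is simply the classic choice:

cp command/ftpcopy command/ftpcp command/ftpls /usr/local/bin/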

Here's how it works. Let's say I wanted to mirror or back up my home directory on a remote system. A basic ftpcopy command looks something like this:

ftpcopy -u marcel -p secr3t! \
remote.hostname /home/marcel /mirdir/

The -u and -p options are obviously for my user name and (fake) password on the remote system. What follows is the name of the remote host, the path to the directory you want to copy and, finally, the local directory where this directory structure will be re-created. As the download progresses, you will see something like this:

/mirdir/scripts/backup.log: download successful
/mirdir/scripts/checkhosts.pl: download successful
/mirdir/scripts/ftplogin.msg: download successful
/mirdir/scripts/gettime.pl: download successful

If you want a little more information on your download, add the --bps option. The results then report the rate of data transfer in bytes per second.
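Using the earlier example, that would look something like this:

ftpcopy --bps -u marcel -p secr3t! \
remote.hostname /home/marcel /mirdir/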

You should consider running ftpcopy with the --help option at least once, and you should be aware of some options. For instance, -s deals with symbolic links, and -l lets you adjust the level of logging; if you want to set mirroring to run by means of a cron job, you might want to set logging to 0. Another useful option is -n. By default, if a file is deleted on the remote side, it also will be deleted locally the next time you run ftpcopy. If you truly are trying to keep systems in sync, this is what you want. To override this behavior, add -n and no deletes will occur.
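Putting the pieces together, a nightly mirror run from cron might look something like the entry below. The 2:15am schedule is arbitrary, and I am assuming here that -l 0 is how you spell a silent run, per the logging option above:

# Mirror the remote home directory every night at 2:15am, logging silenced
15 2 * * * /usr/local/bin/ftpcopy -l 0 -u marcel -p secr3t! remote.hostname /home/marcel /mirdir/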

Well, mes amis, the hour has arrived, and we must all go to our respective homes. Still, it is early enough for a final glass of wine, non? François, mon ami, if you will do the honors. In fact, make it two glasses, one to mirror the other, non? Until next time, mes amis, let us all drink to one another's health. À votre santé! Bon appétit!

Marcel Gagné (mggagne@salmar.com) lives in Mississauga, Ontario. He is the author of the newly published Moving to Linux: Kiss the Blue Screen of Death Goodbye! (ISBN 0-321-15998-5) from Addison-Wesley. His first book is the highly acclaimed Linux System Administration: A User's Guide (ISBN 0-201-71934-7). In real life, he is president of Salmar Consulting, Inc., a systems integration and network consulting firm.