A good backup routine lets you sleep well. I have spent years trying out backup tools for my server and ended up close to where I started: with a couple of standard UNIX tools.
So, I wanted a daily backup routine that archives selected content and saves it on a remote host, not incrementally but by date. The tools I use to accomplish that are SSH, cron, and tar. You already have them on your server, so let's get started. If you do not have an RSA key, open a terminal and type the following:
ssh-keygen -t rsa
Make sure that you leave the passphrase empty, as follows:
Generating public/private rsa key pair.
Enter file in which to save the key (/home/matias/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/matias/.ssh/id_rsa.
Your public key has been saved in /home/matias/.ssh/id_rsa.pub.
Now, store the public key (id_rsa.pub) on your target server; change the following command to fit your needs (user/host):
cat $HOME/.ssh/id_rsa.pub | ssh user@host "cat >> .ssh/authorized_keys"
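One thing that often trips people up here: sshd ignores the key if the remote .ssh directory or authorized_keys file is group- or world-writable. The expected permissions can be demonstrated locally (the /tmp path below is just a stand-in for the remote home directory):

```shell
# sshd refuses keys in ~/.ssh if permissions are too open.
# Demonstrate the expected modes on a throwaway directory:
mkdir -p /tmp/perm-demo/.ssh
touch /tmp/perm-demo/.ssh/authorized_keys
chmod 700 /tmp/perm-demo/.ssh                  # directory: owner only
chmod 600 /tmp/perm-demo/.ssh/authorized_keys  # file: owner read/write only
stat -c '%a %n' /tmp/perm-demo/.ssh /tmp/perm-demo/.ssh/authorized_keys
```

Run the same two chmod commands on the target server if key login does not work right away.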
Now you need a simple bash script that archives the content. Since my content is already compressed, I use tar to pack the source files into an archive named after the hostname and date, as follows:
#!/bin/bash
DATE=$(date +%Y-%m-%d)
SOURCE="/path"
TARGET="/path/$HOSTNAME-$DATE.tar"
tar -cvf - "$SOURCE" | ssh user@host "cat > $TARGET"
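You can try the archiving step without touching a remote machine: the sketch below keeps the same tar pipe but replaces the ssh stage with a plain redirect. The /tmp paths are placeholders for your real SOURCE and TARGET:

```shell
#!/bin/bash
# Local sketch of the backup step; /tmp paths stand in for real ones.
DATE=$(date +%Y-%m-%d)
SOURCE="/tmp/backup-demo/data"
TARGET="/tmp/backup-demo/$(hostname)-$DATE.tar"

# Create some sample content to archive.
mkdir -p "$SOURCE"
echo "sample" > "$SOURCE/file.txt"

# Same pipe as the real script; in production the right-hand side
# would be: ssh user@host "cat > $TARGET"
tar -cf - -C "$(dirname "$SOURCE")" "$(basename "$SOURCE")" | cat > "$TARGET"

# List the archive contents to verify it worked.
tar -tf "$TARGET"
```

The -C flag makes tar store paths relative to the parent directory instead of absolute paths, which makes restores more predictable.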
Name the file, for example, backup.sh and continue. Now put this into your crontab and choose when you want the script to run (below is nightly at 04:30):
30 4 * * * bash /home/matias/backup.sh > /dev/null 2>&1
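Since the entry above discards all output, a failing backup is easy to miss. A variant that appends output to a log file instead (the log path is just an example) makes problems visible:

```
30 4 * * * bash /home/matias/backup.sh >> /home/matias/backup.log 2>&1
```

Check the log occasionally, or after any change to the script.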
The reason I pipe the tar output to SSH is that I do not want to store the file temporarily, since it could be big. Second, I do not want to use SCP, since it is notably slower. Enjoy!
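A bonus of the pipe approach is that restores are just the same pattern in reverse: cat the archive on one end and feed it into tar on the other. A local sketch, with cat standing in for the ssh stage and /tmp paths as placeholders:

```shell
# Restore sketch: in real use the producer side would be
#   ssh user@host "cat /path/host-date.tar"
# Here we build a small archive locally and restore it the same way.
mkdir -p /tmp/restore-demo/src /tmp/restore-demo/out
echo "hello" > /tmp/restore-demo/src/file.txt

# Back up (same pipe shape as the backup script).
tar -cf - -C /tmp/restore-demo src | cat > /tmp/restore-demo/backup.tar

# Restore: pipe the archive back into tar, extracting into out/.
cat /tmp/restore-demo/backup.tar | tar -xf - -C /tmp/restore-demo/out
```

After this, /tmp/restore-demo/out/src/file.txt exists with the original contents.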