(cross posted from the HPCloud Blog. With 75% more typos!)

One of the most basic problems with systems that need to persist data is making sure you can recover those systems after a critical error. I’ve used and written backup systems for longer than I’d like to admit (for example). With the advent of cloud storage systems such as S3, moving your data offsite has become much easier, and so has recovering it from that offsite storage when you need it.

Back when I got started in this industry, a tape backup would take hours. And then you’d have to drive it to your safety deposit box and store it. When you needed to recover data, it was a drive and then hours to restore it.

The next iteration was removable hard drives. These were quicker to back up to and restore from, but the offsite portion was still onerous. That’s why I developed SyncScript: caching a local copy of the backup made sense when most restore requests were for things deleted in the last 24-48 hours.

However, with the advent of cloud storage and higher-speed internet pipes, you can keep offsite backups and get them back reasonably quickly. Since I work with OpenStack nowadays, that’s the hammer that makes sense to use.

Here is what I did to get a dead simple backup from an HP Cloud instance to HP’s Object Storage.

First, I installed the python tools. I followed the directions below:

For CentOS 6.3:

yum install python-pip
pip install python-keystoneclient
pip install python-swiftclient

For Ubuntu 13.04:

aptitude install python-pip
pip install python-keystoneclient
pip install python-swiftclient

Then I edited the .bashrc for the user performing the backup using nano ~/.bashrc to include this:

## Enable openstack client stuff
export OS_TENANT_NAME=MY-PROJECT-NAME
export OS_USERNAME=MY-USER-NAME
export OS_PASSWORD='MY-PASSWORD'
export OS_AUTH_URL=https://region-b.geo-1.identity.hpcloudsvc.com:35357/v2.0/
export OS_REGION_NAME=region-b.geo-1

This will back up to US-East. If you wanted to back up to US-West, you’d want a .bashrc that included this:

## Enable openstack client stuff
export OS_TENANT_NAME=MY-PROJECT-NAME
export OS_USERNAME=MY-USER-NAME
export OS_PASSWORD='MY-PASSWORD'
export OS_AUTH_URL=https://region-a.geo-1.identity.hpcloudsvc.com:35357/v2.0/
export OS_REGION_NAME=region-a.geo-1

After I finished updating the .bashrc, I ran source ~/.bashrc to load the changes.
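A quick way to confirm the variables actually made it into the current shell is to grep the environment (nothing here is HP-specific, just standard shell tools):

```shell
# After sourcing .bashrc, all five OS_ variables should show up here.
env | grep '^OS_'
```

If the list comes back empty, the exports didn’t load and the swift commands below will fail with authentication errors.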

I was then able to access swift.

I tested that like this:

ubuntu@test-server:~$ swift list
Server_Backups
ubuntu@test-server:~$

I had already created the Server_Backups container via the web interface, following these directions.
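If you’d rather skip the web interface, the swift client can create the container from the command line too; swift post creates the named container if it doesn’t already exist (Server_Backups is just the name I used, and the command -v guard is my own addition so the snippet fails loudly if the client is missing):

```shell
# Create the container from the CLI, then confirm it's listed
if command -v swift >/dev/null 2>&1; then
    swift post Server_Backups
    swift list
else
    echo "swift client not installed" >&2
fi
```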

Once I had my server set up to access Object Storage, I was able to use the following script, run from cron, to automate a complete backup of everything important on my server.
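As a rough sketch of the approach (this is not the script from my GitHub repo, just a minimal hypothetical version; the directory list and Server_Backups container name are examples to adjust for your own server):

```shell
#!/bin/bash
# Hypothetical backup sketch. Usage: backup.sh /etc /home /var/www
# cron does not read .bashrc, so pull in the OS_* variables here.
[ -f "$HOME/.bashrc" ] && . "$HOME/.bashrc"

# One dated, compressed archive per run
ARCHIVE="/tmp/backup-$(hostname)-$(date +%Y-%m-%d).tar.gz"

# Bundle the requested directories into the archive
tar czf "$ARCHIVE" "$@"

# Push the archive to Object Storage, then remove the local copy
if command -v swift >/dev/null 2>&1; then
    swift upload Server_Backups "$ARCHIVE" && rm -f "$ARCHIVE"
else
    echo "swift client not found; archive left at $ARCHIVE" >&2
fi
```

A crontab entry (via crontab -e) along these lines would run it nightly at 2 a.m. — the time and paths are, again, just examples:

0 2 * * * /home/ubuntu/backup.sh /etc /home /var/www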

(up-to-date source code lives on GitHub)