Backup a WordPress Amazon EC2 instance

Previously I've had some difficulty backing up my microinstance's WordPress MySQL db; running mysqldump would cause 100% CPU use until I rebooted the instance and restarted Apache and MySQL.

Why was mysqldump evil?

At least I've finally discovered the reason I'd get those nasty CPU spikes.

The culprit is, most likely, the Slimstat plugin.

This is a real-time web analytics plugin which has a lovely dashboard view showing you everything about who is or has viewed your site, how long they were there for, how they got there, etc. Addictive stuff.

However, in order to gather these stats it saves a lot of data in a new set of MySQL tables; these tables not only get really big, but are also accessed constantly.

As such, there's a problem: the overview of backing up from the command line in the WordPress Codex has a couple of mysqldump commands similar to:

[code]mysqldump --add-drop-table -h <hostname> -u <username> -p <database name>[/code]

This will try to back up everything in the named database, which would include all the Slimstat tables.
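(If all you want is a quick one-off dump without the Slimstat data, mysqldump can also exclude tables directly via its --ignore-table flag. A sketch; the Slimstat table name below is an assumption based on its default naming, so run SHOW TABLES to check yours first:)

[code]# Hypothetical: exclude Slimstat's stats table from the dump.
# "wp_slim_stats" is an assumed name - verify with SHOW TABLES first.
mysqldump --add-drop-table --ignore-table=<database name>.wp_slim_stats -h <hostname> -u <username> -p <database name>[/code]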

Solutions

There are a couple of approaches to this. I’ve now done both.

  1. Change your backup script to select the WordPress MySQL tables only
  2. Use Google Analytics instead of Slimstat

Using Google Analytics

Head over to www.google.co.uk/analytics and set up your account, then save the verification file it gives you onto your web host to prove the site is yours. Now you can get lovely stats just by placing a tracking script reference in the footer of your WordPress theme; be careful to re-add this if you change themes.

Selecting the WordPress tables

Get the table names for all WordPress tables (assuming you went with the default naming convention of a "wp_" prefix; if not, adapt to suit your convention):

[code]mysql -u<username> -p<password> --skip-column-names -e "select table_name from information_schema.TABLES where TABLE_SCHEMA = '<database name>' and TABLE_NAME like 'wp_%';"[/code]
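On a stock WordPress install that should list something like the following (your exact list will vary with the plugins you have installed):

[code]wp_commentmeta
wp_comments
wp_links
wp_options
wp_postmeta
wp_posts
wp_term_relationships
wp_term_taxonomy
wp_terms
wp_usermeta
wp_users[/code]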

Pipe it into mysqldump:

[code]mysql -u<username> -p<password> --skip-column-names -e "select table_name from information_schema.TABLES where TABLE_SCHEMA = '<database name>' and TABLE_NAME like 'wp_%';" | xargs mysqldump --add-drop-table -u<username> -p<password> <database name>[/code]

Then send that output to a file:

[code]mysql -u<username> -p<password> --skip-column-names -e "select table_name from information_schema.TABLES where TABLE_SCHEMA = '<database name>' and TABLE_NAME like 'wp_%';" | xargs mysqldump --add-drop-table -u<username> -p<password> <database name> > <backup file name>[/code]

In summary

Using this info I can now create a backup script that actually works; it will generate a single gzipped tar of all my website's content files (including the WordPress code) and the database backup:

[code]#!/bin/bash

# Set the date format, filename, the directory where backup files will be placed, and the directory to be archived.
NOW=$(date +"%Y-%m-%d-%H%M")
FILE="mywebsitebackup.$NOW.tar"
BACKUP_DIR="/home/ec2-user/_backup"
WWW_DIR="/var/www"

# MySQL database credentials
DB_USER="root"
DB_PASS="enteryourpwdhere"
DB_NAME="wordpress"
DB_FILE="mydbbackup.$NOW.sql"

# dump the WordPress tables
mysql -u$DB_USER -p$DB_PASS --skip-column-names -e "select table_name from information_schema.TABLES where TABLE_SCHEMA = '$DB_NAME' and TABLE_NAME like 'wp_%';" | xargs mysqldump --add-drop-table -u$DB_USER -p$DB_PASS $DB_NAME > $BACKUP_DIR/$DB_FILE

# archive the website files
tar -cvf $BACKUP_DIR/$FILE $WWW_DIR

# append the db backup to the archive
tar --append --file=$BACKUP_DIR/$FILE $BACKUP_DIR/$DB_FILE

# remove the db backup
rm $BACKUP_DIR/$DB_FILE

# compress the archive
gzip -9 $BACKUP_DIR/$FILE
[/code]
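Before relying on it, it's worth knowing the restore path too. A minimal sketch of the reverse, assuming the paths and credentials used by the backup script above (both filenames are just examples of what the script generates):

[code]#!/bin/bash

# Minimal restore sketch - assumes the paths and credentials from the backup script above.
# Both filenames are examples; substitute the timestamp of the backup you're restoring.
BACKUP="/home/ec2-user/_backup/mywebsitebackup.2013-01-01-0300.tar.gz"
DB_DUMP="/home/ec2-user/_backup/mydbbackup.2013-01-01-0300.sql"

# tar stripped the leading "/" when the archive was created, so extracting
# from / puts /var/www and the .sql dump back in their original locations
tar -xzvf $BACKUP -C /

# load the dump back in; the database itself must already exist
mysql -uroot -penteryourpwdhere wordpress < $DB_DUMP[/code]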

Save the backup script above, make it executable, and if you like you can add a cron task to create a backup on a regular basis (there's an example crontab entry after this list). A couple of things to watch out for:

  1. These files can be big, so they will use up a good chunk of your Elastic Block Store volume, which costs a few dollars more than a teeny EC2 instance
  2. Creating the archive can be a bit processor intensive sometimes, which may stress your poor little microinstance
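If you do schedule it, a crontab entry like the following runs the script nightly at 3am (the script path here is an assumption; point it at wherever you saved yours):

[code]# m h dom mon dow command
0 3 * * * /home/ec2-user/backup.sh[/code]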

Coming up

Using this backup, I'll automate the creation of a duplicate blog and script the entire process using Chef and Vagrant!

Backup Backup Backup

Whilst out at a bar in Camden recently, a friend started hunting around our area with an increasingly panic-stricken look on his face. Turns out his personal laptop had been taken from under our seats. Six years of his personal dev work, lost.

The response from every technical person he told? "Should've backed up. Also, you've got Prey on there, right?.. No?.. Ah.."

Not exactly the most compassionate response, but for those of us with a half-decent backup solution, all you need to worry about is getting the insurance paid out on the hardware itself.

My backup solution is pretty minimal; the only things I care about are photos and videos, documents (work and home), and coding fun.

First up – Cloud

Photos

My photos are usually taken on my Samsung Galaxy S3, so they exist in several places almost by default. The SG3 is an Android device, so my photos are automatically synced up to my Google Drive using Google+ Instant Upload.

I have a Dropbox account and the Android Dropbox app, so they're also being synced to Dropbox; I have Dropbox on my laptops at home and on my work PC, so the photos are downloaded onto those devices too.

I have the Jungledisk agent on all of my PCs, which also backs up those photos to my Jungledisk account. It's going to be pretty tricky to lose them all.

Documents

Documents are handled by both Dropbox and Windows Live Mesh; as soon as I close a document on my office PC it's synced via Live Mesh to SkyDrive and then downloaded again to my home PC.

Again, this is also backed up to Jungledisk as separate PC backups.

Coding fun

This also uses Windows Live Mesh, and Dropbox, and Jungledisk, but the projects I really like are also pushed up to GitHub.

Pricey, right?

The cost of this security? Not much at all.

  • Live Mesh comes with Windows 7.

  • Dropbox is free, and if you buy an SG3 you get 50GB free for a year, plus 500MB free for every 500MB of photos you back up to Dropbox.

  • Jungledisk costs me around £10 a month. Given that I also use it to automatically back up everything in my and my wife's laptops' "My Documents" folders to a secure cloud-based solution, including all the iTunes, iPhoto, etc. folders, I think that's money well spent; obviously some of that is also on iCloud too!

Physical? Sure!

I'm not relying on internet access to get my files back either. I have both local-network and USB-attached backup solutions.

External HDD

This tiny 500GB Western Digital My Passport attaches to whichever machine I'm working on at the time and constantly backs up my entire D drive (non-OS) securely and with rollback capabilities.

NAS

I also have everything in key folders on each laptop within my home network being copied over to a 1TB LaCie LaCinema HD HDMI Media Centre/NAS, which is an amazing bit of kit in its own right. You can now get a 2TB version for about £160.

It Works For Me ©

I’m not saying this is what everyone should do, as requirements will obviously differ. But this works for me, and the last time one of my laptops crashed and burned I was quite happy at the chance to get a shiny new one, not having lost any important data at all.

Applications

No need to back up any applications, since it's so easy to rebuild an entire Windows system using things like Chocolatey, WebPI, and Ninite.
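For instance, once Chocolatey itself is installed, reinstalling a typical toolset is a one-liner (the package list here is purely illustrative):

[code]# hypothetical example: reinstall common apps in one go with Chocolatey
choco install googlechrome 7zip notepadplusplus git -y[/code]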