
Backing up a WordPress Database to Amazon S3

Yes, there are plugins, but this is easy.


In this post we are going to look at a simple way to back up your WordPress database to Amazon S3.  You will find that the procedure is simple and robust, and almost all of the code is straightforward.

Why Not Use a Plugin?

I can tell what you are thinking.  You are saying to yourself that you have seen a multitude of plugins that do something similar, so why would you write your own code to do the same thing?

My answer is that with this method you control all of the code.  With a plugin, I find that if something breaks it either goes unnoticed, is difficult to debug, or requires waiting for a patch from the developer.  I am also not terribly fond of having little insight into what is happening with my data, or of lacking fine-grained control over when backups are taken.  Backing up your WordPress database to Amazon S3 is so simple that I like being in total control of the process.

Why Amazon S3?

Amazon S3 is ideal for this type of data storage because it is cheap (5GB for free), reliable, cloud-based, and fast, and Amazon provides a great PHP library that is easy to interface with.  In addition, you can browse your old databases online or create a semi-automated solution for restoring from them.  Downloading a dump and restoring it with the mysql command-line client is a simple option too.

This Is Easy!

It may seem like there are a lot of steps needed to get this going, but I have been overly descriptive for the sake of clarity.  The whole process should take you less than 20 minutes to get up and running, and after you have done it once or twice it will become second nature and a pleasure to do.

I encourage you to at least take a look at the process below and decide whether this method is right for you.  You may also have tips on how to improve the code and I would love to hear your comments.

1.  Create an AWS Account and S3 Credentials

It should be obvious that for this to work you will need an Amazon AWS account.  This is free and straightforward.  We will briefly cover the steps below.

If you don't already have an AWS account you can visit http://aws.amazon.com and click the 'Sign Up' button.  Fill in your information and verify that you can log in.  If you can't get past this step I am not sure you are ready for the internet, and you should practice signing up for various sites until you get this down pat.

2.  Visit the AWS Management Console

If you aren't already there, please go to http://aws.amazon.com and under 'My Account/Console' select 'AWS Management Console' as shown below.  You may have to log in if you haven't already.

Login to Amazon AWS

3.  Click on S3 to visit the S3 Console Home.

Again, this is straightforward but I have included a screenshot below in case you can't find it.

Navigate to the S3 Console on AWS

4. Create a Bucket That Will Store Your Database Backups

In S3, a 'Bucket' is basically a file folder that will contain all of your backups.  In our script we will be gzipping our database dumps so that each backup is contained within a single, compressed file.  Do you see that big blue button that says 'Create Bucket'?  Give that a little click so that we can create your bucket.

Create a Bucket on Amazon AWS

Good job.  Now you will need to enter the name you want for your bucket and the Region you want it in.  I tend to pick 'US Standard' because it allows characters like periods in bucket names, and I tend to use those; I believe the other Regions don't allow that.  It shouldn't matter too much which Region you pick as long as it is in the same general area as you.  The worst that would happen is that your uploads and downloads take marginally longer.

I tend to name my buckets something like {site-name}-bucket.  The reason for the '-bucket' suffix is that it makes the variable's purpose clear in the script we will write.  It will be a nice little reminder: your brain will say, "Oh, that's the name of the bucket on S3."

Fill in the Bucket Details on AWS

Click create and Amazon will create your brand new S3 bucket.  Now, you should see a screen similar to the following that will show you that the bucket has been created.

Verify Your Bucket has Been Created on AWS

5. Create Access Keys for AWS

For your server and script to communicate with AWS you will need to create access keys using IAM (it's simple).  Luckily, Amazon has a short step-by-step guide on how to do this.  It should take less than a minute.

Visit the About Access Keys page on Amazon and follow the steps under 'Create Access Keys'.

Two important notes: 'Select Policy Template' should be set to Amazon S3 Full Access, and you will need to download the credentials file.

Use IAM to Create Security Credentials

Open the downloaded CSV and you will see that your Key and Secret are stored there.  We will need those in a bit.

6. Set Up the Backups Folder on Your Server

Now for the fun stuff.  Log in to your server and navigate to your WordPress wp-content folder.  Usually, I create a directory in this folder called /backups/ to store both the code needed and the backups themselves.  Really, you could put this anywhere on your server.

I tend to do it this way because my Git repo begins at /wp-content and I like the backup code to be saved within the repo.  I also gitignore /backups/*.gz so my gzipped databases are not in the repo itself.  Saving the *.gz files in the repo could be a backup solution in itself, if you wanted one and the backups were small.
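The ignore rule can live in a .gitignore at the root of the repo (wp-content in my setup; adjust the path if yours differs):

```
# /wp-content/.gitignore
backups/*.gz
```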

$ cd /path-to-your-wordpress/wp-content/
$ mkdir ./backups/

In this folder we want to create an index.php file to serve a blank page if someone tries to navigate to it.  This is the same as what WordPress does in the wp-content and other folders.  This file will just contain an opening PHP tag and an optional comment.

$ cd /path-to-your-wordpress/wp-content/backups/
$ vim index.php

Now, add the following code to that file and write and exit back to the command line.

<?php
// Silence is Golden

Perfect. That should help guard against someone trying to snoop on this directory.

Index.php for Empty Directory

7. Download the AWS PHP SDK (Wrapper)

Now we are ready to download the AWS PHP Wrapper that provides a simple code-based interface to S3.  The simplest way to do this is to use Composer.  For those of you who are familiar with NodeJS, npm, and package.json, this will feel familiar.  Basically, Composer is a dependency manager for PHP.

Amazon has a straightforward step-by-step guide on how to do this.  Please visit the Amazon Installing via Composer page and follow Steps One, Two and Three.  Please make sure that you are in your newly created /wp-content/backups/ folder when you do this.

The basic steps that are outlined there are:

  1. Create a composer.json file to tell Composer which dependencies you require.
  2. Run a curl request to install Composer.
  3. Use Composer to install the Amazon AWS PHP SDK.

This is all pretty simple and straightforward.
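For reference, a minimal composer.json for this setup might look like the following.  The version constraint is an assumption (the script below uses the version 2 style S3Client::factory call); check Amazon's guide for the current one:

```json
{
    "require": {
        "aws/aws-sdk-php": "2.*"
    }
}
```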

8. Make the Script for Performing Our Backups

Create a file named 'database-backup.php' in your /wp-content/backups/ directory.  This is where you will store all of the code that actually performs the backup.  I have included a Gist below with the code and comments that should guide you through how this works.

Please note that you will need to add your $bucket_name, $admin_email, $access_key, and $secret_key to get the code to work correctly.

Also, I have included a catch for if something goes wrong with the upload.  Basically, if the upload fails then you will receive an email that tells you that the script failed and what went wrong.  If you want to go crazy with this you could add more exception handlers at each step.  In my experience I have yet to see this fail, and I have something similar running on many sites that back up every hour of every day with large databases.

<?php
/* OPEN A STD OUT STREAM, THIS IS BETTER THAN ECHOING
 * BECAUSE THERE IS NO BUFFERING
 * *********************************************************************/
$stdout = fopen( 'php://stdout', 'w' );

// Starting Processing Designator
fwrite( $stdout, "========== BACKUP ==========\n" );

/* Use the Autoload from Composer 
 * *********************************************************************/
fwrite( $stdout, "  Autoload..." );
  require_once( "./vendor/autoload.php" );
fwrite( $stdout, "complete\n" );

/* Load wp-config (DB CREDENTIALS)
 * *********************************************************************/
fwrite( $stdout, "  Loading wp-config..." );
  $parse_uri = explode( 'wp-content', __FILE__ );
  require_once( $parse_uri[0] . 'wp-config.php' );
fwrite( $stdout, "complete\n" );

/* CREATE A TIME STAMPED FILENAME AND THE FULL PATH TO THAT FILE
 * *********************************************************************/
fwrite( $stdout, "  File Setup..." );
  $backupfile = DB_NAME . '-' . date("Y-m-d--H-i-s") . '.sql.gz';
  $backupdir = dirname(__FILE__);
  $backupfile_fullpath = $backupdir . '/' . $backupfile;
  /* EDIT THESE VARIABLES BELOW: $bucket_name and $admin_email */
  $bucket_name = 'your-bucket-name'; //like ryanfrankel.com-bucket that we created
  $admin_email = 'your-email-address';
fwrite( $stdout, "complete\n" );

/* MYSQL DUMP THE DATABASE AND GZIP (YOU CAN COMPRESS HOW YOU WANT)
 * *********************************************************************/
fwrite( $stdout, "  MySQL Dump..." );
  $command = "-u " . DB_USER . " --password='" . DB_PASSWORD . "' " . DB_NAME;
  system( "mysqldump $command | gzip > $backupfile_fullpath" );
fwrite( $stdout, "complete\n" );

/* CREATE AN S3 CONNECTION (CLIENT)
 * *********************************************************************/
fwrite( $stdout, "  Creating S3 Client..." );
	use Aws\S3\S3Client;
	/* ADD YOUR ACCESS AND SECRET KEY HERE */
	$access_key = "your-access-key";
	$secret_key = "your-secret-key";
	$s3 = S3Client::factory( array(
		'key' => $access_key,
		'secret' => $secret_key
	) );
fwrite( $stdout, "complete\n" );

/* WRITE THE OBJECT (FILE) TO S3
 * *********************************************************************/
fwrite( $stdout, "  Writing $backupfile to S3..." );
// putObject throws an exception on failure so we can use try/catch.
// If there is some sort of error, send an email alerting us that there
// is an issue.  You could add this to the steps above if you wanted.
try {
	// This is the function that puts the file on S3
	$response = $s3->putObject( array(
		'Bucket' => $bucket_name,
		'Key' => $backupfile,
		'SourceFile' => $backupfile_fullpath
	) );
	fwrite( $stdout, "complete\n" );
} catch( Exception $e ) {
	fwrite( $stdout, "exception error...\n\n" );
	// Get the error message
	$error_message = $e->getMessage();

	// email on error
	fwrite( $stdout, "    Emailing Admin..." );
		$email_message = "BACKUP FAILURE\n==========\n$backupfile\n$error_message\n==========\n";
		$email_result = mail( $admin_email, 'RyanFrankel.com: BACKUP FAILURE', $email_message );
	fwrite( $stdout, "complete\n" );
}
  
/* DONE, CLOSE THE STD OUT
 * *********************************************************************/
fwrite( $stdout, "========== BACKUP COMPLETE ==========\n\n" );
fclose( $stdout );
?>

For the sake of clarity, I have the $access_key and $secret_key in this file.  Ideally, you would store them in another file outside the root of your site.  If you do decide to keep the code like this, please do not save it to a public git repo.  Your keys would be exposed, and anyone could create a client to talk to your S3 account.
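One way to keep the keys out of the script is a tiny file outside the web root that defines them, required from database-backup.php.  The path below is purely hypothetical; use whatever private location your host gives you:

```
<?php
// /home/youruser/private/s3-keys.php -- hypothetical location outside the web root
$access_key = "your-access-key";
$secret_key = "your-secret-key";
```

In database-backup.php you would then replace the two key lines with require_once( '/home/youruser/private/s3-keys.php' );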

9. Run the Database Backup and Check if Everything Is Working

Believe it or not, that code is all we really need to back up our WordPress database to Amazon S3.  If you have any questions about how this code works please just ask in the comments below.

Now, let's run this file and make sure that everything is working.

$ cd /path-to-your-wordpress/wp-content/backups/
$ php database-backup.php

You should see an output similar to the one below.
The Output of the WordPress Backup Script
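Based on the fwrite calls in the script, the output will read roughly as follows (the file name shown is an example; yours will reflect your database name and the timestamp of the run):

```
========== BACKUP ==========
  Autoload...complete
  Loading wp-config...complete
  File Setup...complete
  MySQL Dump...complete
  Creating S3 Client...complete
  Writing mydb-2014-06-01--12-30-00.sql.gz to S3...complete
========== BACKUP COMPLETE ==========
```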

If you get something similar and don't receive an error or an email then you can check your /backups/ directory to make sure that the .gz was created.

Verify that WordPress Database Backup was Created

Also, it is a good idea (and rewarding) to check Amazon S3 to make sure that the file was sent and received.

Verify your WordPress Backup is on Amazon S3

Boom!  Now our database snapshot is on S3 and downloadable from anywhere at anytime in the event of a disaster.

10.  Set Up a Cron to Take a Database Backup

I kind of glossed over the fact that in step 9 you only had to run one command to back up your database.  That's pretty sweet.  Now you can set up a cron job that will run this command as often as you want.  For most smaller-scale sites you can safely run this once or twice a day.  For a larger site that has a lot of moving parts, content creators, or backend processing you can run it more often.  On important sites I run this every hour; there have been quite a few times that having that sort of granularity was a lifesaver.  For something like this site I will probably have it running twice a day.

To set up a cron job you just need to edit your crontab and add the command.  Below I will set it up to run twice a day, at 12:30 PM and 11:30 PM.

Open Crontab:

$ crontab -e

Add the cron that will run this script

# m h dom mon dow command
30 12,23 * * * cd /path-to-your-wordpress/wp-content/backups && php database-backup.php

Ok, now we are all set and we should have backups running every day at those times.
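If you want a record of each run, you can also redirect the script's stdout (and stderr) into a log file.  The log path here is just an example:

```
# m h dom mon dow command
30 12,23 * * * cd /path-to-your-wordpress/wp-content/backups && php database-backup.php >> backup.log 2>&1
```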

Conclusion

Hopefully you learned something from this post, or it at least gave you insight into how this would be done.  You can use S3 for all sorts of fun stuff.  You can also create a script that will download and restore your databases from S3.  This can be REALLY useful if you use a separate VPS for development, or in a crisis.  I will make another post about restoring your database from a backup at some point, but all you really have to do is pipe the dump back into mysql and you are done.
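A restore really is just decompressing the dump and feeding it back into mysql.  Here is a sketch that uses a stand-in dump file so the gzip pipeline can be seen end to end; in practice the .gz would be one downloaded from your S3 bucket, and DB_USER/DB_PASSWORD/DB_NAME are placeholders for your real credentials:

```shell
# Create a stand-in dump so the pipeline can be demonstrated;
# in practice this file comes from your S3 bucket.
printf 'CREATE TABLE wp_posts (ID INT);\n' > mydb-demo.sql
gzip -f mydb-demo.sql                      # produces mydb-demo.sql.gz

# Decompress the backup back into plain SQL
gunzip -c mydb-demo.sql.gz > restore.sql

# Feed it to MySQL (placeholder credentials; uncomment to run for real)
# mysql -u DB_USER --password='DB_PASSWORD' DB_NAME < restore.sql

cat restore.sql
```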

Some things to watch out for:

  1. Every now and again you will want to delete the backups out of the /backups/ folder.  I don't delete them automatically because I like the fail-safe if S3 has issues.
  2. You may also want to delete your old database backups out of S3 every now and again.  Amazon charges by how much data is stored there, and these can accumulate over time.

So, that is pretty much it for this post.  If you have any questions please leave a comment below and if you liked this article you could share it on your favorite social network.  I am having a blast writing these intermediate/advanced WordPress tutorials and hopefully people are finding them useful or interesting.

