Over the past few weeks we have become very interested in the Amazon S3 backup solution, mainly because it is extremely cheap.

I tried many different solutions before finding one that works for us. You may find it doesn't work for you, but give it a try and see.

First off, to get started it is a good idea to take a look at the Amazon S3 class by Undesigned. This is by far the easiest class to use in my experience. I have also used Amazon's S3 SDK, which is also good but a bit overkill for our needs.

OK, so browse through the Amazon S3 class and get to grips with the ways to communicate with Amazon S3. Below are some examples from the website.

//Example usage

$s3 = new S3('accessKey', 'secretKey');
$s3->putBucket('bucket', S3::ACL_PUBLIC_READ);
$s3->putObjectFile('file.doc', 'bucket', 'docs/file.doc', S3::ACL_PUBLIC_READ);
$s3->deleteObject('bucket', 'docs/file.doc');

// Writing and reading resources:

# Upload an object from a resource (requires bytesize):
$s3->putObject($s3->inputResource(fopen($file, 'rb'), filesize($file)), $bucketName, $uploadName, S3::ACL_PUBLIC_READ);
# Download an object to a resource:
$s3->getObject($bucketName, $uploadName, fopen($savefile, 'wb'));
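The same class can also list what is already in a bucket, which is handy for checking that your backups actually arrived. A minimal sketch, assuming placeholder credentials and bucket name (this needs a live S3 account to run):

```php
<?php
// Sketch only: list the contents of a bucket with the Undesigned S3 class.
// 'accessKey', 'secretKey' and 'bucket' are placeholders for your own values.
require_once('S3.php');

$s3 = new S3('accessKey', 'secretKey');

// getBucket() returns an array of objects keyed by name
$contents = $s3->getBucket('bucket');
foreach ($contents as $object) {
    echo $object['name'] . ' (' . $object['size'] . " bytes)\n";
}
```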

First off I needed to do a full backup of all my website files and folders, then compress and save the archive to my backup/files directory in the root.

I looked at a few PHP classes to do this, but found by far the easiest way for me was to just run this bit of code, which I found in this great article, Cron Job Backup.

<?php
// Uncomment the line below and run this file in your browser to find your server root.
//print_r($_SERVER['DOCUMENT_ROOT']);
$date = date("F-j-Y-g-ia");
// -z gzips the archive, so name it .tar.gz
echo exec("cd /home/your server root/public_html/backup/files; tar -cvpzf ibc-$date.tar.gz /home/your server root/public_html"); ?>
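Before wiring this into PHP it is worth trying the tar command by hand. Here is a safe sketch of the same command using temporary directories in place of the real paths (which are placeholders in the script above anyway):

```shell
# Stand-ins for the real paths so this can be run anywhere without risk.
SITE=$(mktemp -d)      # stands in for /home/.../public_html
BACKUPS=$(mktemp -d)   # stands in for /home/.../public_html/backup/files
echo "<?php // demo ?>" > "$SITE/index.php"
DATE=$(date +%F)
# -c create, -p preserve permissions, -z gzip, -f output file;
# -C changes into the site directory so the archive has relative paths.
tar -cpzf "$BACKUPS/ibc-$DATE.tar.gz" -C "$SITE" .
ls -lh "$BACKUPS"
```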

As you can see, it backs the files up to /backup/files, so make sure you have this folder structure in place.
After running this file you should see a backup archive inside your /backup/files folder.


Great, now to create a file to upload the backup to Amazon S3. Here is the code I used.

<?php

require_once('classes/S3.php');

// Enter your Amazon S3 credentials
$s3 = new S3('Your Key', 'Your Secret Key');

$baseurl = $_SERVER['DOCUMENT_ROOT'] . "/backup/files"; // backups are saved to the files directory

if ($handle = opendir('./files/')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            // Upload to your bucket (create it in your Amazon S3 account first)
            if ($s3->putObjectFile("files/$file", "Your Bucket", "folder/$file", S3::ACL_PUBLIC_READ)) {
                echo "<strong>We successfully uploaded your file.</strong>";
                // This will delete the file from your server after upload
                if (file_exists($baseurl . '/' . $file)) {
                    unlink($baseurl . '/' . $file);
                }
            } else {
                echo "<strong>Something went wrong while uploading your file... sorry.</strong>";
            }
        }
    }
    closedir($handle);
}

?>

All that is left to do is create cron jobs to run these files. I'm with ICUK hosting, and they have a very simple interface for this.

So I set up two cron jobs: one to run bk.php at 1:00am and one to run upload.php at 1:30am, giving it enough time to back up the website files and folders first. 30 minutes is plenty for me.
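If your host gives you raw crontab access rather than a web interface, the two jobs might look something like this (the paths, username and PHP binary location are assumptions; adjust them to your server):

```shell
# m  h dom mon dow  command
0   1  *   *   *    /usr/bin/php /home/youruser/public_html/backup/bk.php
30  1  *   *   *    /usr/bin/php /home/youruser/public_html/backup/upload.php
```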

Here are all the files. Just extract the backup folder, upload it to your root via FTP, and change the settings to suit your needs. Any issues, contact us.

Warning Read This

OK, just a word of warning. This works fine on our hosting (ICUK), but I have had some issues with other hosts where the script is not allowed to delete the file after it is uploaded to Amazon. The process then becomes really heavy on the server, as the backups double up each time, and your host will not be happy. I will leave this up because I know it works for me, but please modify, change and test before using it on your own hosting ;).
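One way to catch that doubling-up problem early is to check whether the delete actually succeeded instead of assuming it did. A small sketch of how the cleanup step in upload.php could be hardened ($baseurl and $file are the variables from that script):

```php
<?php
// Sketch of a safer cleanup step: only stay quiet when the local copy
// was actually removed, so failed deletes are visible in the error log.
$path = $baseurl . '/' . $file;
if (file_exists($path) && !@unlink($path)) {
    // Log it so the backups don't silently pile up on the server.
    error_log("S3 backup: could not delete local file $path");
}
```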

