In a previous post about backing up EC2 MySQL to an Amazon S3 bucket, we covered dumping MySQL datasets, compressing them and uploading them to S3. After a few weeks of test-driving the shell script, I came up with a new version that checks, fixes and optimizes all tables before generating the dump. This is pretty important, as mysqldump will fail at whatever step causes an error (data corruption, crashed tables, etc.), leaving you with a corrupt archive in your S3 bucket. Here’s the script:
<pre>#!/bin/bash
filename=mysql.`date +%Y-%m-%d`.sql.gz

echo Checking, Fixing and Optimizing all tables
# note: no space between -p and the password
mysqlcheck -u username -ppassword --auto-repair --check --optimize --all-databases

echo Generating MySQL Dump: ${filename}
mysqldump -u username -ppassword --all-databases | gzip -c9 > /tmp/${filename}

echo Uploading ${filename} to S3 bucket
php /ebs/data/s3-php/upload.php ${filename}

echo Removing local ${filename}
rm -f /tmp/${filename}

echo Complete</pre>

There you go. If you remember my previous example, I stored the temporary backup file on Amazon EBS (Elastic Block Store), which is not quite appropriate. Amazon charges for EBS storage, reads and writes, so why pay the extra cost? Dump everything into your temp folder on EC2 and remove it afterwards. Don't forget to update the $local_dir setting in your upload.php script accordingly.

Also, as a personal note, and for those who haven't figured out how to upload data archives to S3, here's another version of the script which takes your public_html (www, htdocs, etc.) directory, archives and compresses it, and uploads it to an Amazon S3 bucket:

<pre>#!/bin/bash
filename=data.`date +%Y-%m-%d`.tar.gz

echo Collecting data
tar -czf /tmp/${filename} /ebs/home/yourusername/www

echo Uploading ${filename} to S3 bucket
php /ebs/data/s3-php/upload.php ${filename}

echo Removing local ${filename}
rm -f /tmp/${filename}

echo Complete</pre>
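By the way, if you want either of these backups to run unattended, cron is the easiest way to schedule them. Here's a minimal sketch, assuming you saved the script as /ebs/data/backup.sh (the path and schedule are made up, adjust them to your setup):

<pre># Edit your crontab with: crontab -e
# Run the backup script every night at 3:30 AM and append its output to a log
30 3 * * * /bin/bash /ebs/data/backup.sh >> /var/log/backup.log 2>&1</pre>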
Oh, and have you noticed? Amazon has changed the design a little bit, and woah! They finally display the Access Secret without a trailing space character! Congrats Amazon, it only took you a few months.
How do you restore the public_html directory from the tar file?
Something like

tar -xzf archive.tar.gz

run next to your public_html directory.
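To spell that out for the archive created by the script above (a sketch; the filename below is just an example): tar stores the path without the leading slash, so extracting from the root directory puts the files back in their original location.

<pre># Fetch the backup from your S3 bucket first (with s3cmd, the AWS CLI,
# or however you normally download from S3), then extract it.
filename=data.2010-06-01.tar.gz

# The archive stores ebs/home/yourusername/www (no leading slash), so
# extracting with -C / restores it to /ebs/home/yourusername/www
tar -xzf /tmp/${filename} -C /</pre>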