Backup all MySQL Databases to S3

Here is a simple script that backs up all accessible databases and sends them to an S3 bucket with the date as a prefix.

username="xxx"
password="xxx"
backuppath="/home/backup/"
s3bucket="xxx"
s3prefix=$(date "+%Y/%m/%d")
s3path="xxx"

# Amazon Keys
export AWS_ACCESS_KEY_ID=xxx
export AWS_SECRET_ACCESS_KEY=xxx

# Make the backup path if it is not there
if [ ! -d "$backuppath" ]
then
    mkdir -p "$backuppath"
fi

# Get a list of all databases
databases=$(mysql -N -u"$username" -p"$password" <<< "show databases" | grep -Ev "^log" | xargs)

# Loop through each database
for database in $databases
do
    echo "$database"

    # delete SQL file as this will be incomplete
    if [ -f "$backuppath$database.sql" ]
    then
        rm -f "$backuppath$database.sql"
    fi

    # delete OLD copy as this will be overwritten
    if [ -f "$backuppath$database.sql.gz.old" ]
    then
        rm -f "$backuppath$database.sql.gz.old"
    fi

    # make LAST copy into OLD
    if [ -f "$backuppath$database.sql.gz" ]
    then
        mv "$backuppath$database.sql.gz" "$backuppath$database.sql.gz.old"
    fi

    # dump the NEW copy
    mysqldump -u"$username" -p"$password" "$database" > "$backuppath$database.sql"
    gzip --force "$backuppath$database.sql"

    # Send to S3
    s3cmd put -v "$s3bucket:$s3prefix/$database.sql.gz" "$backuppath$database.sql.gz"

done

echo "done"
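The database-list line is worth understanding in isolation: `mysql -N` prints bare rows, `grep -Ev "^log"` drops any database whose name starts with `log`, and `xargs` flattens the result onto one line. The filter can be tried against sample input (the database names below are invented for illustration):

```shell
#!/bin/bash
# Simulate "show databases" output and apply the script's filter.
# These database names are made up for illustration.
databases=$(printf '%s\n' app_main app_stats logs log_archive mysql \
    | grep -Ev "^log" | xargs)
echo "$databases"
# prints: app_main app_stats mysql
```

Note that system databases such as `mysql` and `information_schema` pass through this filter, so they are dumped along with everything else.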

We have another script that can back up each table of a given database; it is used for large projects where individual tables can be massive.
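A per-table variant can be sketched as below. To keep the sketch runnable without a live server, it only builds the `mysqldump` command for each table; in the real script the table list would come from `mysql -N ... <<< "show tables"` and the commands would be executed rather than printed. The database name, path, and function name here are assumptions, not the production script.

```shell
#!/bin/bash
# Sketch: one dump file per table (names are assumptions).
database="mydb"
backuppath="/home/backup/"

# Print the mysqldump command that would be run for each table passed in.
# In practice the list would come from: mysql -N ... <<< "show tables"
per_table_commands() {
    local table
    for table in "$@"
    do
        echo "mysqldump $database $table > $backuppath$database.$table.sql"
    done
}

per_table_commands users orders sessions
```

Dumping table by table keeps individual files small and lets one huge table be restored without touching the rest of the database.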
