Jesse Lawson

Software engineering, artificial intelligence, writing, and open-source tools

Code Snippets, Nginx, Server Admin, Tutorials

How to get Automatic Nightly Backups on Nginx

Here's a copy of my automatic nightly backup script, which cycles through a list of clients, dumps each client's MySQL database, zips their WordPress files, and pushes the backups out to Amazon S3 while keeping a daily.zip backup on our local system.

In `crontab -e`, I added the following entry:

```
# Our backup script. Nightly at 5:20 AM.
20 5 * * * /var/tools/do-daily-backups.sh
```
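
For cron to actually run it, the script needs to be executable. A quick sanity check before trusting the schedule (assuming the path above):

```bash
# Make the backup script executable, then run it once by hand
chmod +x /var/tools/do-daily-backups.sh
/var/tools/do-daily-backups.sh
```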

I then created a file called `/var/tools/do-daily-backups.sh`, which looks like this:

```bash
#!/bin/bash

# This automated backup script is designed for use with the DashingWP file system.
# Author: Jesse Lawson
# (12-Jan-2014 lawsonry) This backup system only backs up Venus server clients,
# discovered by listing the hosted-site directories in /var/www on Venus

echo -e "Beginning backup of Venus server..."

# (12-Jan-2014 lawsonry) The first thing we'll do is contact the remote server and get
# a list of all the directories in /var/www. We do this by calling ls /var/www, then
# looping through the results and checking that the dir name ends in ".dashingwp.com"
# -- that means it's a hosted site and requires backups.

# (12-Jan-2014 lawsonry) Create an array to store the client folders
list_of_clients=()

# (12-Jan-2014 lawsonry) Get the list of directories from Venus
for item in $(ssh root@12.345.67.890 ls /var/www)
do
    # Check if item is a dashingwp client site
    if [[ $item = *.dashingwp.com ]]; then
        echo -e "Found client \"$item\""
        list_of_clients+=("$item")
    fi
done
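
# Note: the unquoted $(ssh ...) expansion above is word-split by the shell, so
# this loop assumes no directory names contain spaces -- safe here, since client
# dirs are domain names.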

# (12-Jan-2014 lawsonry) We are calling a separate loop because I want to clearly
# differentiate between ssh calls
# (12-Jan-2014 lawsonry) We'll get rid of this ssh loop because we're about to rsync
# from a bunch of other ones.

echo -e "Found ${#list_of_clients[@]} clients."
echo -e "Filtering out clients to new array..."

for client in "${list_of_clients[@]}"
do
    # (12-Jan-2014 lawsonry) Set internal name for backups. Cut string delimited by
    # periods (-d .) and only retrieve the first part (-f 1)
    internal_name=$(echo $client | cut -d . -f 1)
    NOW=$(date +"%d")
    FILE="$internal_name.daily_$NOW.zip"
    BACKUP_DIR="/var/www-backups/$internal_name"
    WWW_DIR="/var/www/$internal_name.dashingwp.com/"

<pre><code>echo -e &quot;Backing up $client&quot;

echo -e &quot;--&amp;gt; Pulling directory from Venus...&quot;

# (12-Jan-2014 lawsonry) Purposefully exclude those giant wp-snapshots just in case people are using them. In the future, we could probably
# exclude all backup dir taxonomies to ensure that people who use backup plugins even though we tell them not to
# are not having those backups included in our rsync transfers and zipped backups (they'll take up way too much room)
rsync -avzhe ssh --exclude='*wp-snapshots*' root@12.345.67.890:/var/www/$client /var/www
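    # (rsync flags: -a archive mode, -v verbose, -z compress in transit,
    # -h human-readable output, -e ssh to use ssh as the transport)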

    # (12-Jan-2014 lawsonry) Now that we've rsync'd the hosted site over to Mercury, a
    # "fresh" copy of the site exists. This means that the bulk of CPU power on Venus
    # during backups is taken up during the initial rsync (and any future rsyncs where
    # there is a lot of information to exchange). Assuming everyone installs and adds
    # all the files they're going to add on day one, Venus should only experience high
    # CPU load during rsync while transferring the data to Mercury for the first time.
    # After that, rsync will only transfer the changed files, making future rsyncs for
    # each client site very fast and taking the CPU load off of Venus very quickly.
    #   After this rsync, the only other load Venus will incur is a mysqldump (below).

    echo -e "--> Remotely dumping $client database..."

    # (12-Jan-2014 lawsonry) Navigate to folder and back up database to a file in wp-content
    cd /var/www/$client

    # (12-Jan-2014 lawsonry) Backup database to wp-content folder
    # (12-Jan-2014 lawsonry) Extract db variables from config file

    echo -e "--> --> Extracting database credentials..."

    DB_NAME=$(grep DB_NAME wp-config.php | cut -d \' -f 4)
    DB_USER=$(grep DB_USER wp-config.php | cut -d \' -f 4)
    DB_PASS=$(grep DB_PASSWORD wp-config.php | cut -d \' -f 4)
    DB_FILE="mysql.sql"

    echo -e "--> --> DB_USER=$DB_USER"
    echo -e "--> --> DB_PASS=$DB_PASS"
    echo -e "--> --> DB_NAME=$DB_NAME"
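
    # Note: this extraction assumes each define('DB_*', '...') in wp-config.php
    # sits on one line and uses single quotes; multi-line or double-quoted
    # defines would break the cut.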

    echo -e "--> --> Creating temporary backup directory..."

    # mkdir if it doesn't exist. It's only temporary
    mkdir -p $BACKUP_DIR

    echo -e "--> --> Commencing remote mysqldump... "
    # Dump database to mysql.sql in wp-content folder
    ssh root@192.168.131.21 "mysqldump -u$DB_USER -p$DB_PASS $DB_NAME" > /var/www/$client/htdocs/wp-content/mysql.sql
    #(mysqldump -h192.168.131.21 -u$DB_USER -p$DB_PASS $DB_NAME > "/var/www/${client}/htdocs/wp-content/${DB_FILE}") &
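
    # Note: -p$DB_PASS on the command line leaves the password visible in the
    # database host's process list while the dump runs; a ~/.my.cnf there would
    # avoid that.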

    echo -e "--> --> Copying wp-config to wp-content directory... "
    # Copy over the wp-config.php file so that we can just zip the htdocs folder
    cp wp-config.php htdocs/wp-config.php

    # Go into htdocs so the zip goes straight to the directory
    cd /var/www/$client/htdocs

    # Specifically exclude wp-snapshots because SOMEONE keeps using that plugin.
    # Shouldn't have to exclude backups, but here it is just in case: --exclude=*wp-snapshots*
    echo -e "--> --> Updating $client's snapshot (daily.zip)..."

    # (12-Jan-2014 lawsonry) The --filesync arg synchronizes our rsync'd directory with
    # a 'daily.zip' file that exists at /var/www-backups/<internal_name>/daily.zip.
    # filesync is explained at http://www.info-zip.org/mans/zip.html. Basically, it
    # checks for changes in our local (Mercury) copy of the site, and then only modifies
    # daily.zip to account for changes in that local copy.
    zip -9 -r --filesync --quiet $BACKUP_DIR/daily.zip .

    # Note here we're zipping the contents of htdocs to omit the log folder
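
    # Note: --filesync also drops entries from daily.zip whose files no longer
    # exist locally, so deletions are tracked as well as changes.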

    # Now that we've updated our daily snapshot, we need to copy it to a daily_# file
    # so we can upload it.
    echo -e "--> --> Duplicating snapshot to $FILE..."

    cp $BACKUP_DIR/daily.zip $BACKUP_DIR/$FILE

    # Push to S3
    echo -e "--> --> Pushing $FILE to Amazon S3..."
    s3cmd put $BACKUP_DIR/$FILE s3://dashingwp/snapshots/$internal_name/$FILE
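    # Note: s3cmd reads its credentials from ~/.s3cfg (created once with
    # s3cmd --configure), so no keys need to live in this script.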

    # Remove the dated zip now that it's on S3; only daily.zip stays local
    # (see daily work journal, 9 Jan 2014)
    echo -e "--> --> Removing $FILE..."

    rm -rfv $BACKUP_DIR/$FILE

    # The following data stays on Mercury:
    # /var/www/<client> -> This is a nightly snapshot of the sites on Venus
    # /var/www-backups/<internal_name>/daily.zip -> This is a nightly zip archive of the above snapshot
    # Note that we're keeping our local (Mercury) copy of the rsync'd data because it
    # lessens the load on Venus during backups, and allows us to use zip --filesync to
    # greatly speed up zipping operations.

    echo -e "<-- <-- $internal_name backup is complete."

    # Write to backup log file
    #TIMESTAMP=$(date +'%Y-%m-%d %H:%M:%S|%N')
    #echo "Daily #$NOW backup for $internal_name completed at $TIMESTAMP\n" >> /var/www-backups/log/daily-backups.log

done

echo -e "Finished backups."

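Restoring a client from one of these snapshots is essentially the backup in reverse. Here's a minimal sketch under the same layout assumptions as above (the snapshot holds the contents of htdocs, including the copied-in wp-config.php and wp-content/mysql.sql); "example" is a hypothetical client name:

```bash
#!/bin/bash
# Minimal restore sketch for one client. Assumes the same paths and hosts as the
# backup script; "example" is a placeholder internal name.
internal_name="example"
htdocs="/var/www/$internal_name.dashingwp.com/htdocs"

# Unpack the local snapshot back into the site's htdocs directory
unzip -o /var/www-backups/$internal_name/daily.zip -d $htdocs

# Read the credentials the same way the backup script does
cd $htdocs
DB_NAME=$(grep DB_NAME wp-config.php | cut -d \' -f 4)
DB_USER=$(grep DB_USER wp-config.php | cut -d \' -f 4)
DB_PASS=$(grep DB_PASSWORD wp-config.php | cut -d \' -f 4)

# Re-import the dump over ssh, mirroring how the dump was taken
ssh root@192.168.131.21 "mysql -u$DB_USER -p$DB_PASS $DB_NAME" < wp-content/mysql.sql
```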