Simple Load Balancing Using Nginx

While the concept of load balancing has been around for a while, using Nginx to do it is fairly new to me. Other common load balancers in use today are LVS, HAProxy, Perlbal, and Pound. In this example, I am using three (ve) servers from Media Temple, each running Ubuntu 9.10. To get started, log into the server you want to set up as the load balancer and install Nginx:

aptitude install nginx

Then we'll need to create a new Nginx default virtual host, /etc/nginx/sites-available/default. In the two server directives inside the upstream backend block, be sure to put the IP addresses or hostnames of the servers you are balancing:

upstream backend  {
  server 123.123.123.123;
  server 123.123.123.124;
}

server {
  location / {
    proxy_pass  http://backend;
  }
}

This is the simplest configuration possible. After you've completed the proxy configuration, test and restart Nginx:

nginx -t
/etc/init.d/nginx restart

At this point, the load balancer will distribute requests evenly across the upstream servers in round-robin fashion. The really nice thing is that if one of the upstream servers stops responding, the load balancer will automagically stop routing requests to it. So although the configuration still lists the unavailable server, Nginx sees that it is down and routes traffic to the remaining upstream servers. If all upstreams are down, Nginx halts the proxy entirely and simply returns an HTTP 502 error. This also makes it a very useful tool for balancing Mongrels for those running Rails.
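Nginx's failure detection can also be tuned per backend. As a hedged sketch (the thresholds here are values I picked for illustration, not part of the setup above), the max_fails and fail_timeout parameters control when a server is considered down:

```nginx
upstream backend  {
  # After 3 failed attempts within 30 seconds, mark the server as
  # unavailable for 30 seconds before trying it again.
  server 123.123.123.123 max_fails=3 fail_timeout=30s;
  server 123.123.123.124 max_fails=3 fail_timeout=30s;
}
```

With these settings, three failed connections within the 30-second window take a backend out of rotation for 30 seconds, after which Nginx quietly tries it again.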

It's also worth noting that you can add as many backend nodes as you want; I'm just using two as an example. This makes a (ve) server an ideal choice: spin up a new (ve) and simply add it to the upstream pool. There are a few other configuration options, such as the ability to add weight to backends to force an uneven load distribution. You can also use proxy_set_header to make sure the user's real IP address is logged on the backends instead of the load balancer's:

upstream backend  {
  server john.tjstein.com;
  server paul.tjstein.com;
  server ringo.tjstein.com;
  server george.tjstein.com weight=30; # George was always my favorite
}

server {
  server_name www.tjstein.com tjstein.com;
  location / {
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_pass  http://backend;
  }
}
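Setting the headers is only half the job: the upstream web server has to be told to log them. As a hedged sketch (assuming an Apache backend, which the original setup doesn't specify, and a log format name of my own), you could log the proxied client IP like this:

```apache
# Log the client IP passed along by Nginx (X-Real-IP header)
# instead of the load balancer's address.
LogFormat "%{X-Real-IP}i %l %u %t \"%r\" %>s %b" proxied
CustomLog /var/log/apache2/access.log proxied
```

Nginx backends can do the equivalent with a custom log_format that references $http_x_real_ip.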

Nginx provides many other configuration directives, so I would recommend checking the official Nginx documentation on the NginxHttpUpstreamModule for more information. In the next post, we'll look at using file synchronization tools like Unison and rsync for data replication between backends.

Fixing Unsupported Protocol Error with PECL

After setting up PHP 5.2.10, I wanted to install a few additional modules using PECL. I found soon after that I was unable to do so, receiving the following error:

pear.php.net is using a unsupported protocol - This should never happen. install failed

The problem seems to stem from corrupted PEAR channel data in PHP 5.2.9 and 5.2.10. To fix it, just back up and update the channels:

mv /usr/local/lib/php/.channels /usr/local/lib/php/.channels.bak
pear update-channels

After that, you should be good to go.

Media Temple (gs) Grid Service Domain & MySQL Backups

Having been on many shared hosting platforms before, I am fortunate enough to have a few (gs) Grid Services with (mt) Media Temple. Although no native backup tool currently exists for the (gs), I've created some simple scripts to back up the domain directories and MySQL databases. Both are very simple bash scripts that can be set up as cron jobs from within the AccountCenter. The first backup we'll create is for the domain directories:

  1. If you haven't done so already, enable SSH from the (mt) Media Temple AccountCenter.

  2. Once you've logged in via SSH, navigate to the /data directory. Keep in mind you'll need to replace 00000 with your site number for this to work. You can find your site number in the Server Guide > Access Domain section of the AccountCenter.

     cd /home/00000/data

  3. Run the following commands to create and set permissions on the backup directory:

     mkdir backup
     chmod 777 backup

  4. Create a file called backup.sh and make it executable:

     touch backup.sh
     chmod +x backup.sh

  5. Open the file in a text editor such as vim and paste in the following:
#!/bin/bash
today=$(date '+%d_%m_%y')
echo "* Performing domain backup..."
tar czf /home/00000/data/backup/example.com_"$today".tar.gz -C / home/00000/domains/example.com
# Remove backups older than 7 days:
MaxFileAge=7
find /home/00000/data/backup/ -name '*.gz' -type f -mtime +$MaxFileAge -exec rm -f {} \;
echo "* Backed up..."

This really simple script creates a tar.gz backup of the example.com directory, stamps it with the current date, and places it in the backup folder we just created. It will also prune any backup older than 7 days; this number can be adjusted by tweaking the MaxFileAge value. Before saving the file, be sure to make the necessary adjustments for your site number after the /home directory. You'll also need to change example.com to the actual domain you'll be backing up. Before adding it as a cron job, run the script via SSH -- the output should look like this:
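If you want to sanity-check the pruning logic before trusting it with real backups, the find expression can be exercised against dummy files first. A throwaway sketch (the scratch directory and file names are just for this test, not part of the script):

```shell
#!/bin/bash
# Create a scratch directory with one "old" and one "new" backup file
dir=$(mktemp -d)
touch "$dir/old_backup.tar.gz"
touch "$dir/new_backup.tar.gz"
# Backdate the old file by 10 days so its mtime exceeds MaxFileAge
touch -d "10 days ago" "$dir/old_backup.tar.gz"

# Same prune expression as in backup.sh
MaxFileAge=7
find "$dir" -name '*.gz' -type f -mtime +$MaxFileAge -exec rm -f {} \;

ls "$dir"   # only new_backup.tar.gz should remain
rm -rf "$dir"
```

Note that -mtime +7 matches files whose age, counted in whole 24-hour periods, is greater than 7 -- so a backup made exactly 7 days ago survives one more day.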

example.com@n10:/home/00000/data$ ./backup.sh
* Performing domain backup...
* Backed up...

Depending on the size of the domain directory, the script may pause after the first message while everything is compressed. You can now verify that the new backup is in the /home/00000/data/backup/ directory, stamped with today's date: example.com_25_10_09.tar.gz. Once you've confirmed the backup works, add the backup.sh file as a cron job within the AccountCenter by following this KnowledgeBase article. You can schedule the cron job as frequently as you would like.
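For reference, if your environment lets you edit the crontab directly rather than through the AccountCenter UI, the equivalent entry is a one-liner. A hedged example (the daily 3 a.m. schedule and log path are my own choices):

```shell
# Run the domain backup every day at 3:00 AM, appending output to a log
0 3 * * * /home/00000/data/backup.sh >> /home/00000/data/backup/backup.log 2>&1
```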

Now, you've probably got at least one MySQL database that should be backed up as well. This can be accomplished just as easily.

SSH back in, navigate to the /data directory, and create a new bash script file:

touch db-backup.sh
chmod +x db-backup.sh

Now, open the file in a text editor or vim and paste in the following:

#!/bin/sh
#############################
SQLHOST="internal-db.s00000.gridserver.com"
SQLDB="db00000_dbname"
SQLUSER="db00000"
SQLPASS="db_user_password"
SQLFILE="db00000_example.com_$(date '+%d_%m_%y').sql"
LOCALBACKUPDIR="/home/00000/data/backup"
#############################
echo "* Performing SQL dump..."
cd $LOCALBACKUPDIR
mysqldump -h $SQLHOST --add-drop-table --user="$SQLUSER" --password="$SQLPASS" $SQLDB > $SQLFILE
# Remove backups older than 7 days:
MaxFileAge=7
find /home/00000/data/backup/ -name '*.sql' -type f -mtime +$MaxFileAge -exec rm -f {} \;
echo "* Backed up..."
exit 0

This bash script uses mysqldump to dump a MySQL database, stamps the dump with the current date, and places it in the backup folder we created before. It will also prune any database backups older than 7 days. Before saving the file, be sure to make the necessary adjustments for your site number and database credentials -- all of this can be found under the Manage Databases section of the AccountCenter. Before adding it as a cron job, run the script via SSH -- the output should look like this:

example.com@n10:/home/00000/data$ ./db-backup.sh
* Performing SQL dump...
* Backed up...

Once you've confirmed the script completes without errors, add it as a cron job so you'll never have to worry about losing your MySQL data again!
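When you do eventually need to restore, the newest dump is usually the one you want. A hedged sketch for locating it (the restore line is commented out because the hostname and credentials shown are placeholders from the script above):

```shell
#!/bin/bash
# Find the most recently modified .sql dump in the backup directory
LOCALBACKUPDIR="/home/00000/data/backup"
latest=$(find "$LOCALBACKUPDIR" -name '*.sql' -type f -printf '%T@ %p\n' \
          | sort -n | tail -1 | cut -d' ' -f2-)
echo "Latest dump: $latest"
# Restore it (fill in your real host, user, and database first):
# mysql -h internal-db.s00000.gridserver.com -u db00000 -p db00000_dbname < "$latest"
```

This sorts dumps by modification time rather than by filename, since the DD_MM_YY date stamp used by the backup script doesn't sort chronologically as a string.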

Ditching Apple Mail for Postbox

Until recently, I've been using Apple Mail for all of my email needs. Although we typically loathe Apple Mail for its overall bugginess in my workplace, I've been apathetic and used it for nearly two years. Doing so has earned me some abuse from my coworkers, all of whom swear by Thunderbird. My hesitance to switch was based purely on my willful ignorance of other applications.

Meet Postbox.

Derived from Thunderbird and Mozilla Labs, Postbox is a cross-platform e-mail client that brings some of Gmail's great features to the desktop. Features like tagging, topics, conversation view, tabs, and smart search made it an ideal replacement for Apple Mail. Setting up new accounts was also easy; if you're familiar with Thunderbird, you should feel right at home. Within an hour of installation, I had set up four IMAP accounts (two work, two personal), set up mail filters, and synchronized additional IMAP folders.

I'm still running beta 10; however, the newest version is said to have significantly decreased CPU and memory usage. AOL Mail account support (ugh) has been added as well. The full list of bug fixes and improvements can be found here.