
Dynamic DNS client for Namecheap using bash & cron

In addition to running this website, I also run a home server. For convenience, I point a subdomain of cmetcalfe.ca at it so that even though it's connected using a dynamic IP (one that actually seems to change fairly frequently), I can access it from anywhere.

As a bit of background, the domain for this website is registered and managed through Namecheap. While they do provide a recommended DDNS client for keeping a domain's DNS updated, it only runs on Windows.

Instead, after enabling DDNS for the domain and reading Namecheap's article on using the browser to update DDNS, I came up with the following dns-update script.

#!/bin/sh

# Abort if anything goes wrong (negates the need for error-checking)
set -e

# Resolve a hostname to an IP address (uses drill instead of dig)
resolve() {
    #dig "$1" @resolver1.opendns.com +short 2> /dev/null
    # Drop comment/blank lines from drill's output, keep the answer line for
    # the queried name, and pull the IP from the fifth (tab-separated) field
    line=$(drill "$1" @resolver1.opendns.com 2> /dev/null | sed '/;;.*$/d;/^\s*$/d' | grep "$1")
    echo "$line" | head -1 | cut -f5
}

dns=$(resolve <subdomain>.cmetcalfe.ca)
curr=$(resolve myip.opendns.com)
if [ "$dns" != "$curr" ]; then
    if curl -s "https://dynamicdns.park-your-domain.com/update?host=<subdomain>&domain=cmetcalfe.ca&password=<my passkey>" | grep -q "<ErrCount>0</ErrCount>"; then
        echo "Server DNS record updated ($dns -> $curr)"
    else
        echo "Server DNS record update FAILED (tried $dns -> $curr)"
    fi
fi

The script checks whether the IP returned by a DNS query for the subdomain matches the server's current public IP (as reported by an OpenDNS resolver). If they don't match, it sends a request to update the DNS record. The echo commands are there just to leave some record of the IP changing; maybe I'll do some analysis of it at some point.

To run the script every 30 minutes and redirect any output from it to the syslog, the following crontab entry can be used:

*/30 * * * * /path/to/dns-update | /usr/bin/logger -t dns-update
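
To check that the output is actually making it into the log, something like the following should work on a systemd machine (the journalctl invocation is my assumption, not part of the original setup):

journalctl -t dns-update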

With the script automatically running every 30 minutes I can now be confident that my subdomain will always be pointing at my home server whenever I need access to it.

Note

A previous version of this article used curl -sf http://curlmyip.com to find the server's current IP address. However, after curlmyip went down for a few days, I decided to take the advice in this StackExchange answer and use OpenDNS instead.
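
For a quick manual check, the same OpenDNS lookup can be run as a one-off from a shell; with dig installed it would look something like this (equivalent to the commented-out line in the script above):

dig +short myip.opendns.com @resolver1.opendns.com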


Changes to the blog and website

I used to run this site on two different platforms. There was a static landing page hosted on my server, and a blog hosted on Blogger.

While the landing page was reasonably nice-looking (screenshot here), the blog looked terrible. Free Blogger templates are generally not much to look at, and the one I used was no exception.

More importantly though, the Blogger composer made it really hard to produce nice-looking articles. For plain text posts it wasn't bad (aside from the crazy HTML it generated), but inserting code snippets or doing any kind of advanced formatting was incredibly frustrating.

A few weeks ago I finally decided to deal with the situation. My general plan was to find a static site generator that allowed for writing posts in Markdown syntax and use it to create a website that would host some static content as well as the blog.

Static sites are exactly what they sound like: just static HTML files for a webserver to serve to clients. No fancy frameworks, databases, or server-side processing involved. This is advantageous primarily because it makes the site blazing fast and very light on server resources.

After looking at a few static site generators, I decided to go with Pelican. Pelican is a Python-based static site generator that uses the awesome Jinja2 templating library and understands content written in a number of formats, including Markdown. It's extremely easy to set up and requires only a single command to regenerate the entire site.
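
As a rough illustration of how simple regeneration is, rebuilding the whole site boils down to a single command along these lines (the config and directory names here are Pelican's defaults, not necessarily what this site uses):

pelican content -o output -s pelicanconf.py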

Transferring content from the old blog into the *.md files that Pelican reads was also really simple. Pelican includes a tool called pelican-import that allows for reading in data from a variety of sources. I used this tool to pull all my previous posts down from the RSS feed of the old blog.
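
For reference, importing from an RSS feed looks roughly like this (the feed URL is a placeholder):

pelican-import --feed https://example.blogspot.com/feeds/posts/default -m markdown -o content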

After fixing up the imported data (the import tool is good, but not perfect), I started looking into templates.

Like with Blogger themes, I was having a hard time finding anything I liked, until I came across a template called svbhack. It had a nice page layout and the general aesthetic was good, but needed a fair bit of tweaking. I forked the repository (+1 for open-source) and over the next few weeks used my limited HTML and CSS knowledge to transform it into the one you see today.

There are still a bunch of things I want to change/fix/add, but at this point I feel that the site is ready to be released. The code for the template and the site is all freely available on GitHub.

If you have any comments or suggestions I'd love to hear them!


How to download an entire Google Site

When using Google Sites, there is currently no way to make a backup of your site or download it so you can host it on another server.

The command below uses a tool called wget to spider through a website and download all the public files to the local computer. Unix users will most likely have wget installed already (if not, it's available via your preferred package manager), while Windows users can get it from here.

Once wget is installed, run it with the following parameters:

# Downloads all public pages on a Google Site

wget -e robots=off -m -k -K -E -rH -Dsites.google.com http://sites.google.com/a/domain/site/

This tells wget to ignore the robots.txt file (-e robots=off), mirror the site (-m), convert links so they work locally (-k) while keeping the original files (-K), add .html extensions where needed (-E), and recurse across hosts while staying within sites.google.com (-rH -Dsites.google.com). Note that pages that aren't linked to from anywhere on the site won't be downloaded.

This technique also works for websites other than those hosted on Google Sites.
