Script posts

GitHub repo backup script

For some time, I’ve been wanting to set up a backup for my GitHub repos. Technically they are all backed up by my local copies, which are in turn backed up when I back up my local computer. However, I wanted something that was sure to have everything from all the repos (all branches, tags, etc.) and could be set up to run continuously on a yet-to-be-created backup server. I have created a bash script to do this for me.
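The full script is in the post; the core of the approach is a mirror clone per repo, which keeps every branch and tag rather than just a working checkout. A minimal sketch, assuming the GitHub API for listing repos, with placeholder user name and backup path:

    #!/usr/bin/env bash
    # Rough sketch: mirror-clone every public repo for a user into a
    # backup directory. GITHUB_USER and BACKUP_DIR are placeholders.
    set -euo pipefail

    GITHUB_USER="example-user"
    BACKUP_DIR="$HOME/backups/github"

    mkdir -p "$BACKUP_DIR"

    # List clone URLs via the GitHub API (first 100 public repos).
    curl -s "https://api.github.com/users/$GITHUB_USER/repos?per_page=100" |
        grep -o '"clone_url": *"[^"]*"' |
        cut -d '"' -f 4 |
        while read -r url; do
            name=$(basename "$url" .git)
            if [ -d "$BACKUP_DIR/$name.git" ]; then
                # Already mirrored: fetch all refs (branches, tags, etc.).
                git -C "$BACKUP_DIR/$name.git" remote update --prune
            else
                # --mirror grabs every branch and tag, not just a checkout.
                git clone --mirror "$url" "$BACKUP_DIR/$name.git"
            fi
        done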

Continue reading post "Github repo backup script"

Rsync and dealing with “some files vanished” warning

I use rsync for backups, site deployments, and other purposes where I need to sync two folders. It took a little while to figure out, but it has been great for those purposes since. Every once in a while, though, I run into issues with it. Recently, I set up an rsync script to back up most of the files on my entire computer. Since the backup takes a while and the computer is actively in use during it, files can change before rsync gets to them. This can lead to warnings like “rsync warning: some files vanished before they could be transferred”. Even though this is only a warning, and the sync works perfectly fine, it returns a non-zero exit code. This caused my script to stop, and thus the rest of the backup activity didn’t finish.

I looked for an option or simple solution to allow it to go on without complaining.
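It turns out rsync reports this particular warning with its own exit code, 24 (“partial transfer due to vanished source files”), so one approach is to treat just that code as success and let everything else still fail. A minimal sketch, with placeholder paths:

    #!/usr/bin/env bash
    # Sketch: treat rsync's "some files vanished" warning (exit code 24)
    # as success so the rest of the backup script keeps running.
    set -euo pipefail

    backup_rsync() {
        local rc=0
        rsync -a --delete "$1" "$2" || rc=$?
        # 24 means some source files vanished mid-transfer, which is
        # expected on a live system; anything else is a real error.
        if [ "$rc" -ne 0 ] && [ "$rc" -ne 24 ]; then
            return "$rc"
        fi
        return 0
    }

    backup_rsync /home/ /mnt/backup/home/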

Continue reading post "Rsync and dealing with “some files vanished” warning"

Finding short TLDs

I’ve been looking for a short domain to potentially use for permashortlinks. For a domain to be usefully short, it must have both a short TLD and a short SLD. Three characters each would make for seven total characters (including the period); much more than that and it starts to lose its usefulness. There are no one-character TLDs (though they’d be great for permashortlinks), and two-character TLDs are reserved for country codes. I’m a bit reluctant to use a code for a country I don’t live in, and the one I do live in disallows whois privacy; I’m not keen to decide that my address, phone number, and email address will be “perma”nently available for all to see (assuming I keep the permanent promise of permashortlinks). So three-character TLDs are where I’ve been doing most of my looking.

There are a number of good lists of available TLDs. IndieWebCamp has a list of options with a brief blurb on each one’s fitness and possible problems, though it only covers country-code domains. United Domains has a list of current TLDs and their prices, plus soon-to-be-available TLDs, with a page for each that gives some information about the TLD and marketing-speak thoughts on uses. Name.com has a list with per-TLD pages as well, though often more brief. It’s hard to parse these lists to find just the short ones, though.

I found two plain-text lists of TLDs (IANA’s and publicsuffix’s), which got me thinking that I could parse them to find just the ones with three characters. I wrote a script in PHP and modified it to handle any number of characters.
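The PHP script itself is in the full post; the same length-filtering idea can be sketched in a few lines of shell against IANA’s plain-text list:

    #!/usr/bin/env bash
    # Sketch: print all TLDs of a given length from IANA's list.
    # Usage: ./short-tlds.sh 3
    len="${1:-3}"

    curl -s https://data.iana.org/TLD/tlds-alpha-by-domain.txt |
        grep -v '^#' |
        awk -v n="$len" 'length($0) == n { print tolower($0) }'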

Continue reading post "Finding short TLD’s"

Check request compression savings

Gzip compression is almost universally recommended as a basic step toward improving site performance. It trades a little extra processing on the server and client for a significantly smaller transfer size on most text responses. In Apache, this is done with mod_deflate (see the H5BP config for an example of how to set it up).

A while back, I was setting gzip up on my server, and wanted a simple way to verify that it was working and check how much transfer was saved. One simple way to verify it is working is with curl on the command line. If you run curl -I -H 'Accept-Encoding: gzip,deflate' example.com and see the header Content-Encoding: gzip, compression is working. To test the transfer savings, I wrote a simple script using PHP’s curl library. It makes a request with and without the Accept-Encoding: gzip,deflate header, and compares the transfer data info provided by curl_getinfo().
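The same comparison the PHP script makes can also be sketched directly in shell using curl’s --write-out variable size_download, requesting the URL once without and once with the header:

    #!/usr/bin/env bash
    # Sketch: compare transfer sizes for a URL with and without gzip.
    # Usage: ./gzip-savings.sh http://example.com/
    url="${1:?usage: $0 <url>}"

    plain=$(curl -so /dev/null -w '%{size_download}' "$url")
    gzipped=$(curl -so /dev/null -w '%{size_download}' \
        -H 'Accept-Encoding: gzip,deflate' "$url")

    echo "uncompressed: $plain bytes"
    echo "compressed:   $gzipped bytes"
    echo "savings:      $(( (plain - gzipped) * 100 / plain ))%"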

Continue reading post "Check request compression savings"

Upgrading my Awstats setup

I don’t monitor analytics for my personal sites very often, aside from my blogs, for which I use wordpress.com’s analytics. I do have three open-source analytics programs set up for my main sites though: Piwik, OWA, and Awstats. Awstats is the one I’ve tended to look at the least, probably because its interface isn’t as nice as the others’ and it doesn’t have as much data about visits. However, it is the only one that looks at actual server logs, so it should be the most accurate about basic visit information. The other two use JavaScript (one has an image fallback), so there’s the potential for them to miss visits.

I have my Awstats set up as I described in 2010: I keep the configuration and the data separate from the install to make updates easier. However, it had been so long since I upgraded that I forgot how it was set up and fumbled a little before finding that article and figuring out what had to be done. To make it easier next time, I created a simple little script to handle the upgrade for me.
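The script itself is in the full post; as a rough sketch of the shape, given the config-and-data-outside-the-install setup described above (the paths and download URL here are hypothetical and would need checking against Awstats’ current release location):

    #!/usr/bin/env bash
    # Sketch of the upgrade flow: fetch a new Awstats release, unpack it
    # alongside the old one, and repoint a symlink. The separate config
    # and data directories are untouched. Paths and URL are placeholders.
    set -euo pipefail

    version="${1:?usage: $0 <version>}"   # e.g. 7.8
    base="$HOME/sites/awstats"            # hypothetical install root

    cd "$base"
    curl -sLo "awstats-$version.tar.gz" \
        "https://prdownloads.sourceforge.net/awstats/awstats-$version.tar.gz"
    tar -xzf "awstats-$version.tar.gz"

    # The web server and cron jobs point at "current", so swapping the
    # symlink completes the upgrade.
    ln -sfn "awstats-$version" current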

Continue reading post "Upgrading my Awstats setup"