Posts from 2005–2010 temporarily unavailable.

On my Debian machines I run stunnel to create a secure connection to my e-mail provider’s SMTP gateway. Postfix sends mail through that TLS tunnel. Recently I stopped receiving e-mail from rss2email, and I discovered tonight that the reason was that the tunnel had caved in on the machine rss2email was running on. Unfortunately, some mail was permanently discarded from the postfix queue, because it turns out that postfix will by default keep mail in the queue for a maximum of only 5 days. Since the connection to the gateway was down, postfix couldn’t return the mail to its sender (i.e. me).
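For the curious, the tunnel is just stunnel running in client mode, forwarding a local port to the provider’s TLS-wrapped SMTP port. The following is only a sketch; the gateway hostname and the port numbers are placeholders, not my provider’s real details:

```
; /etc/stunnel/stunnel.conf -- sketch with placeholder values
client = yes

[smtp-gateway]
; local port that postfix relays through
accept = 127.0.0.1:11125
; placeholder for the provider's SMTP-over-TLS gateway
connect = mail.example.org:465
```

Postfix then relays through the local end of the tunnel with `relayhost = [127.0.0.1]:11125` in main.cf.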

Fortunately, I’m not smart enough to have any log rotation going on, so I could easily find the messages that were lost:

grep "status=expired, returned to sender" /var/log/mail.log \
    | awk '{print $6}' \
    | while read id; do grep "$id" -m1 /var/log/mail.log; done

The first grep determines the queue IDs of the messages that expired, and then the second grep finds the first entry in the mail log for each of those messages, which provides the time the message was sent. Replacing -m1 with -m4 gave me the Message-IDs and the intended recipients of the messages. This allowed me to restore them from backups, or, for those I had tried to send myself, to bounce them from my sent mail folder.

To prevent this from happening again, I’ve extended the maximum lifetime of messages in the queue from 5 days to 10:

postconf -e maximal_queue_lifetime=10d

I’ve incorporated a check for clogged mail queues on my machines into my weekly backup routine.
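That check is nothing fancier than counting entries in the queue and complaining when there are too many; a minimal sketch, where the threshold of 20 is my own arbitrary choice:

```shell
#!/bin/sh
# Warn if the local mail queue looks clogged.  Queue IDs begin with a
# hex digit in the first column of mailq output, so counting those
# lines counts the queued messages.
count=$(mailq 2>/dev/null | grep -c '^[0-9A-F]')
if [ "${count:-0}" -gt 20 ]; then
    echo "warning: $count messages sitting in the mail queue on $(hostname)" >&2
fi
```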

Posted Sun 17 Apr 2016 05:19:38 UTC

Making a big difference to my neck and back pain after just a week!

Posted Fri 25 Mar 2016 16:20:13 UTC

Last night I got back from spending around 5 days in the Bay Area for Spring Break. I stayed in a hostel in downtown SF for three nights and then I stayed with a friend who is doing a PhD at Stanford. When initially planning this trip my aim was just to visit somewhere interesting on the west coast of the continental United States. I chose the Bay Area partly because I wanted to get my PGP key signed by some Debian Developers, and that area has a high concentration of DDs, and partly because I wanted to see my friend at Stanford. But in the end I liked San Francisco a lot more than I expected to, and am very glad that I had an opportunity to visit.

The first thing that I liked was how easy it seemed to be to find people interested in the same kind of tech stuff that I am. I spent my first afternoon in the city exploring the famous Mission district, and at one point while sitting in the original Philz Coffee I found that the person sitting next to me was running Debian on her laptop and blogs about data privacy. We had a discussion about how viable OpenPGP is as a component of a technically unsophisticated user’s attempts to stay safe online. Later that same day while riding the subway train, someone next to me fired up Emacs on their laptop. And over the course of my trip I met five Debian Developers doing all sorts of different kinds of work both in and outside of Debian, and some Debian users including one of Stanford’s UNIX sysadmins. This is a far cry from my day-to-day life down in the Sonoran Desert where new releases of iOS are all anyone seems to be interested in.

Perhaps I should have expected this before my trip, but I think I had assumed that most of the work being done in San Francisco was writing web apps, so I was pleased to find people working on the same kind of things that I am currently putting time into. And in saying the above, I don’t mean to demean the interests of the people around me in Arizona for a moment (nor those writing web apps; I’d like to learn how to write good ones at some point). I’m very grateful to be able to discuss my philosophical interests with the other graduate students. It’s just that I miss being able to discuss tech stuff. I guess you can’t have everything you want!

One particularly encouraging meeting I had was with a Debian Developer employed by Google and working on Git. While my maths background sets me up with the right thinking skills to write programs, I don’t have the knowledge typically gained from an education in computer science that enables one to work on the most interesting software. In particular, low-level programming in C is something that I had thought it wouldn’t be possible for me to get started with. So it was encouraging to meet the DD working on Git at Google because his situation was similar: his undergraduate background is in maths, he was able to teach himself to code in C, and he is now working on an exciting project at a company that is hard to get hired by. I don’t mean that doing exactly what he’s doing is something that I am aiming for, just that it is very encouraging to know the field is more open to me than I had thought. I was also reminded of how fortunate I am to have the Internet to learn from and projects like Debian to get involved with.

Moving on from tech, I enjoyed the streets of San Francisco, and the Stanford campus. San Francisco is fantastically multicultural, though with clear class and wealth divisions. A very few minutes’ walk from the Twitter headquarters with its “tech bros”, as the maths PhD students I met at Stanford call them, are legions of the un- and barely-employed passing their time on the concrete. I enjoyed riding one of the old cable cars through the aesthetically revealing and stark combination of a west coast grid system laid over some very steep hills. I was fortunate to be able to walk across the Golden Gate Bridge on a perfectly clear and mist-free day.

Meeting people involved with Debian and meeting my old friend at Stanford had me reflecting on and questioning my life in the desert even more than usual. I try to remind myself that there is an end date in sight and I will regret spending my time here just thinking about leaving. I sometimes worry that I could easily find myself moving to the big city—London, San Francisco or elsewhere—and letting myself be carried by the imagined self-importance of that, sidelining and procrastinating things that I should prize more highly. I should remember that the world of writing software in big cities isn’t going away and my time in the desert is an opportunity to prepare myself better for that, building my resistance to being swept away by the tides of fashion.

Posted Sun 20 Mar 2016 19:57:04 UTC

There are three of us living here, and the plan is to switch all three of us from wireless to wired ethernet Internet connections by transporting cabling through the ventilation system using this remote-controlled car.

Here’s an action shot we took. My housemate put on his “Gaming” playlist and it reached a very suitable crescendo as we guided the car through its final journey, pulling the last piece of cotton which we would then use to pull the final cable through. The timing of the music couldn’t have been better. It reflected the risk of losing control of the car as we moved the controller around to enable the car to pick up the radio signals. A further risk was caused by the car only being able to turn right, not left, and there not being enough room in the air duct for it to make a full-circle rotation. Our backup plan was to try tossing a ball tied to a piece of string down the duct.

Unfortunately I’m still on a wireless connection because to reach my room we’d have had to send the cable via the furnace/air conditioning unit which we decided against.

Does a SATA-to-USB converter, for moving data off my old laptop hard drive which I had to replace this week, need to be this complicated?

Posted Sun 14 Feb 2016 18:44:11 UTC

As I understand it, having a GitHub profile as a portfolio has become an essential element in applying for entry-level computer programming jobs—insightfully, a friend of mine draws a comparison with the rise of unpaid internships in other fields. Something about GitHub that gets in the way of maintaining a presentable portfolio is that forks of other people’s repositories made just to submit a pull request can crowd out repositories showcasing one’s work. Sometimes pull requests can take months to be responded to by upstream maintainers, leaving unimpressive repositories sitting around on one’s profile for all that time.

The following Python script, clean-github-pr, forks a repository and then sets various attributes of it to make it as obvious as GitHub allows that it’s just a temporary fork made in order to submit a pull request. Invoke it like this:

$ clean-github-pr upstream-owner/repo-to-fork

You will need the PyGitHub Python library, which on a Debian Stretch system can be installed with apt-get install python-github.


#!/usr/bin/python

# clean-github-pr --- Create tidy repositories for pull requests
# Copyright (C) 2016  Sean Whitton
#
# clean-github-pr is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# clean-github-pr is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with clean-github-pr.  If not, see <http://www.gnu.org/licenses/>.

import github

import sys
import time
import tempfile
import shutil
import subprocess
import os

CREDS_FILE = os.getenv("HOME") + "/.cache/clean-github-pr-creds"

def main():
    # check arguments
    if len(sys.argv) != 2:
        print sys.argv[0] + ": usage: " + sys.argv[0] + " USER/REPO"
        sys.exit(1)

    # check creds file
    try:
        f = open(CREDS_FILE, 'r')
    except IOError:
        print sys.argv[0] + ": please put your github username and password, separated by a colon, in the file ~/.cache/clean-github-pr-creds"
        sys.exit(1)

    # just to be sure
    os.chmod(CREDS_FILE, 0600)

    # make the fork
    creds = f.readline()
    username = creds.split(":")[0]
    pword = creds.split(":")[1].strip()

    g = github.Github(username, pword)
    u = g.get_user()

    source = sys.argv[1]
    fork = sys.argv[1].split("/")[1]
    print "forking repo " + source
    u.create_fork(g.get_repo(source))

    # wait for github to finish creating the fork
    while True:
        try:
            r = u.get_repo(fork)
        except github.UnknownObjectException:
            print "still waiting"
            time.sleep(5)
        else:
            break

    # set up & push github branch
    user_work_dir = os.getcwd()
    work_area = tempfile.mkdtemp()
    os.chdir(work_area)
    subprocess.call(["git", "clone", "https://github.com/" + username + "/" + fork])
    os.chdir(work_area + "/" + fork)
    subprocess.call(["git", "checkout", "--orphan", "github"])
    subprocess.call(["git", "rm", "-rf", "."])
    with open("README.md", 'w') as readme:
        readme.write("This repository is just a fork made in order to submit a pull request; please ignore.")
    subprocess.call(["git", "add", "README.md"])
    subprocess.call(["git", "commit", "-m", "fork for a pull request; please ignore"])
    subprocess.call(["git", "push", "origin", "github"])
    os.chdir(user_work_dir)
    shutil.rmtree(work_area)

    # set clean repository settings
    r.edit(fork,
           description="Fork for a pull request; please ignore",
           has_issues=False,
           has_wiki=False,
           has_downloads=False,
           default_branch="github")

if __name__ == "__main__":
    main()
If you have any suggestions for clean-github-pr, please send me a patch or a pull request against the version in my dotfiles repository.

Posted Sun 31 Jan 2016 23:15:18 UTC

Over the past two months or so I have become a contributor to the Debian Project. This is something that I’ve wanted to do for a while. Firstly, just because I’ve got so much out of Debian over the last five or six years—both as a day-to-day operating system and a place to learn about computing—and I wanted to contribute something back. And secondly, in following the work of Joey Hess for the past three or four years I’ve come to share various technical and social values with Debian. Of course, I’ve long valued the project of making it possible for people to run their computers entirely on Free Software, but more recently I’ve come to appreciate how Debian’s mature technical and social infrastructure makes it possible for a large number of people to work together to produce and maintain high quality packages. The end result is that the work of making a powerful software package work well with other packages on a Debian system is carried out by one person or a small team, and then however many users want to make use of that software need only apt-get it. It’s hard to get the systems and processes that make this possible right, especially without a team being paid full-time to set it all up. Debian has managed it on the backs of volunteers. That’s something I want to be a part of.

So far, most of my efforts have been confined to packaging addons for the Emacs text editor and the Firefox web browser. Debian has common frameworks for packaging these and lots of scripts that make it pretty easy to produce new packages (I did one yesterday in about 30 minutes). It’s valuable to package these addons because there are a great many advantages for a user in obtaining them from their local Debian mirror rather than downloading them from the de facto Emacs addons repository or the Mozilla addons site. Users know that trusted Debian project volunteers have reviewed the software—I cannot yet upload my packages to the Debian archive by myself—and the whole Debian infrastructure for reporting and classifying bugs can be brought to bear. The quality assurance standards built into these processes are higher than your average addon author’s, not that I mean to suggest anything about authors of the particular addons I’ve packaged so far. And automating the installation of such addons is easy as there are all sorts of tools to automate installations of Debian systems and package sets.

I hope that I can expand my work beyond packaging Emacs and Firefox addons in the future. It’s been great, though, to build my general knowledge of the Debian toolchain and the project’s social organisation while working on something that is both relatively simple and valuable to package. Now I said at the beginning of this post that it was following the work of Joey Hess that brought me to Debian development. One thing that worries me about becoming involved in more contentious parts of the Debian project is the dysfunction that he saw in the Debian decision-making process, dysfunction which eventually led to his resignation from the project in 2014. I hope that I can avoid getting quagmired and demotivated.

Posted Thu 28 Jan 2016 19:00:28 UTC

I saw Star Wars: The Force Awakens this evening. I was disappointed. I think that’s because for me, Star Wars is all about the Jedi, and the ethical struggles inside individuals. All the old films invite us to consider the extent to which the Jedi embody virtue. Luke saves the day by ignoring the advice of the old masters not to be hasty, but on the other hand their advice to trust in his link to the Force and turn off the targeting computer turns out to be vital. The Anakin Skywalker trilogy considered whether the Jedi Order was ossified and arrogant, admittedly not very well. The two Knights of the Old Republic video games did better by exploring the Sith and the possibility of “Grey Jedi”.

This first film of a new trilogy didn’t have anything like this, aside from the interesting suggestion that a need for belonging might be soothed by nurturing one’s connection to the Force. We just got set up for episode eight. The action didn’t matter because it wasn’t part of a story about how people should be.

Posted Sat 26 Dec 2015 23:08:56 UTC


It turns out that the Emacs package management system, package.el, doesn’t perform SSL certificate verification without some fairly involved wrangling. My Emacs configuration is something that I want to be able to clone and run on systems where it might be a real pain to perform the wrangling needed to ensure packages may be downloaded securely over encrypted HTTP.

Another issue with downloading packages from MELPA, the most popular repository for package.el, is that some packages are pulled into that repository from the EmacsWiki over unencrypted HTTP.

A further problem with MELPA is that it moves very fast, and new versions of packages that are not compatible with each other or perhaps your configuration means that you can find yourself with a broken editor in the middle of trying to get work done. To deal with this issue there is MELPA Stable, which contains hopefully-stable releases of packages that are more likely to be compatible with other packages. The problem is that many packages are in MELPA but not MELPA Stable because the author has not tagged any releases, and of the packages that are in MELPA Stable, many require newer versions of their dependencies than the versions of those dependencies available in MELPA Stable.

In short, package.el and MELPA are not dpkg, apt and the Debian Stable archive. Hopefully someday they will be. But for the moment, I don’t want to manage my Emacs packages this way.

Managing packages as git subtrees

An alternative is to manage package repositories as git subtrees. Assuming that your ~/.emacs.d/ is kept in a git repository, we can run

$ cd ~/.emacs.d
$ git subtree add --squash -P pkg/magit https://github.com/magit/magit.git 2.3.0

and then Magit becomes available in ~/.emacs.d/pkg/magit. The following lisp will add all the dirs ~/.emacs.d/pkg/* and ~/.emacs.d/pkg/*/lisp to your load-path; you can modify this by changing the variable globs:

;;;; ---- package management ----

;; be sure not to load stale bytecode-compiled lisp
(setq load-prefer-newer t)

;; this is where all subtree packages are
(defconst emacs-pkg-dir (concat user-emacs-directory "pkg"))

;; load up f, and its dependencies s and dash, so we can use `f-glob'
;; and `f-join'
(dolist (pkg '("f.el" "dash.el" "s.el"))
  (add-to-list 'load-path (concat emacs-pkg-dir "/" pkg)))
(require 'f) (require 's) (require 'dash)

;; helper function
(defun expand-all-globs (root globs)
  (let ((do-glob (lambda (glob) (f-glob (f-join root glob)))))
    (apply 'nconc (mapcar do-glob globs))))

;; now add all my pkg lisp directories
(let* ((globs '("*" "*/lisp"))
       (dirs (expand-all-globs emacs-pkg-dir globs)))
  (dolist (dir dirs)
    (when (file-directory-p dir)
      (add-to-list 'load-path dir))))

;; finally put my own site-lisp at the front of `load-path'
(add-to-list 'load-path (concat user-emacs-directory "site-lisp"))

;; we will use use-package to load everything else
(require 'use-package)

When you want to update to a new version of a package,

$ cd ~/.emacs.d
$ git subtree pull --squash -P pkg/magit https://github.com/magit/magit.git 2.3.1


  • This commits the source code of Magit to your ~/.emacs.d/ git repository. So when you clone your config to a new machine, all your packages will already be there and Emacs won’t have to download them (potentially insecurely).
  • There’s no dependency management. You’ll have to add subtrees for every dependency. At present, if you don’t update your packages often, this is not too onerous.
  • You should run C-u 0 M-x byte-recompile-directory ~/.emacs.d/pkg RET periodically (normally, package.el would do this for you).

Shell script

Here is a shell script to reduce typing in adding and updating subtrees. It also logs git repository clone URIs and versions fetched to a file ~/.emacs.d/pkg/subtrees so that you can find the URI to use when you want to do an update:

$ cat ~/.emacs.d/pkg/subtrees
2.3.1
v0.6.1

Use it like this:

$ emacs-pkg-subtree add https://github.com/magit/magit.git 2.3.0
$ emacs-pkg-subtree pull https://github.com/magit/magit.git 2.3.1

#!/bin/sh

# emacs-pkg-subtree --- manage Emacs packages as git subtrees in your dotfiles git repo

# Author/maintainer    : Sean Whitton <spwhitton //ANTI-SPAM \\>
# Instructions for use :

# Copyright (C) 2015  Sean Whitton.  Released under the GNU GPL 3.


set -e

op="$1"
uri="$2"
ref="$3"

DEST="$HOME/.emacs.d/pkg"

if [ "$3" = "" ]; then
    echo "$(basename $0): usage: $(basename $0) add|pull git_clone_uri ref" >&2
    exit 1
fi

cd $DEST

repo="$(basename $2)"
pkg="${repo%.git}"
prefix="pkg/$pkg"
top="$(git rev-parse --show-toplevel)"

cd $top
clean="$(git status --porcelain)"
if [ ! -z "$clean" ]; then
    echo "commit first" >&2
    exit 1
fi

if [ "$op" = "add" ]; then
    if [ ! -e "$DEST/$pkg" ]; then
        git subtree add --squash --prefix $prefix $uri $ref
        echo "$uri $ref" >> $DEST/subtrees
        git add $DEST/subtrees
        git commit -m "updated Emacs packages record"
    else
        echo "you already have a subtree by that name" >&2
        exit 1
    fi
elif [ "$op" = "pull" ]; then
    git subtree pull --squash --prefix $prefix $uri $ref
    sed -i -e "s|^${uri} .*$|${uri} ${ref}|" $DEST/subtrees
    git add $DEST/subtrees
    git commit -m "updated Emacs packages record"
else
    echo "$(basename $0): usage: $(basename $0) add|pull git_clone_uri ref" >&2
    exit 1
fi
Posted Thu 26 Nov 2015 23:47:27 UTC

Something that I really enjoy is making old hardware far more useful than new, off-the-shelf laptops and workstations by means of carefully configured installations of Debian Stable. To me there is great aesthetic value in making a system useful in this way, that puts to shame the aesthetic values the computer industry would have us adopt: faster, shinier, newer.

Here is my desk here in Tucson. I arrived with my 2009 laptop, which you can see on the right. I bought this Amazon Basics keyboard for a few dollars, and my housemate gave me use of the mouse and monitor. The pillows and pile of books are a new attempt to improve the ergonomic situation. I can get so much done at this workstation!

When I video call my girlfriend over in South Korea, we like to watch films together and sometimes play Minecraft. And even for simply calling, watching videos and playing Minecraft, she continually runs into problems with her fancy laptop, which is only a couple of years old and has hardware much more capable than mine. Why is it that Windows installations slow down so dramatically over time? I’ve been told that it’s partly a result of the NTFS file system not scaling well. She’s planning to take it to a Samsung shop where they’ll soup it up for her. Of course, it’ll still be less useful than my Debian installation is.

Back in Europe, my grandfather is using an ancient laptop from the mid-2000s to read and write e-mails and browse the web. On that machine I set up Debian with the LXDE desktop environment. He collects his e-mail by the POP3 protocol, and reads it in Mozilla Thunderbird. In the UK, my mother is using my desktop workstation that I built for myself back in 2008 when I played a lot of video games. I added a second hard drive for her and installed Debian with the Cinnamon desktop environment. She connects into her university’s LAN using a remote desktop client.

The most fun thing about this is the software I use to manage these configurations: Propellor. My computer in Tucson, the ancient laptop deep in the Limousin, and the workstation in my parents’ house are all configured to periodically pull from a central git repository which contains a statement of how they should all be configured (stated in fairly terse Haskell). (The host of this central git repo also updates its own configuration in the same way. And this host can be a VPS provided by any standard VPS provider, because its configuration as a git host can be very quickly applied to a fresh host.) I cryptographically sign commits to the git repo, and once this signature is verified, new configuration directives are applied. Here is an (edited) sample of the full configuration:

-- #### Sean's desktop workstation

zephyrSda :: Host
zephyrSda = spwWorkstation "" "i386"
    & "/etc/timezone" `File.hasContent` ["Europe/London"]

    -- Default to booting from sdb.
    & "/etc/default/grub" `File.containsLine` "GRUB_DEFAULT=2"
    `onChange` cmdProperty "update-grub" []
    `describe` "GRUB default entry /dev/sdb"

    -- SSH in only from LAN
    & "/etc/hosts.allow" `File.containsLine` "sshd:"
    `describe` "ssh login permitted from LAN"
    & "/etc/hosts.allow" `File.containsLine` "sshd: localhost"
    `describe` "ssh login permitted from localhost"
    & "/etc/hosts.deny" `File.containsLine` "sshd: ALL"
    `describe` "ssh login denied from elsewhere"

-- #### Sean's desktop workstation, secondary HDD

zephyrSdb :: Host
zephyrSdb = spwMachine "zephyr.abbeydaled.local" (Stable "jessie") "i386"
    & "/etc/timezone" `File.hasContent` ["Europe/London"]
    & User.accountFor (User "hschra")
    & LightDM.autoLogin (User "hschra")
    & userStandardGroups (User "hschra")

-- #### Sean's laptop

artemis :: Host
artemis = spwWorkstation "" "i386"
    & "/etc/timezone" `File.hasContent` ["America/Phoenix"]

-- #### Old laptop deep in the Limousin
-- Under Wheezy, quentin was known as 'quentinou'

quentin :: Host
quentin = spwMachine "" (Stable "jessie") "i386"
    & "/etc/timezone" `File.hasContent` ["Europe/Paris"]
    & User.accountFor (User "pawhitton")
    & userStandardGroups (User "pawhitton")
    & LightDM.autoLogin (User "pawhitton")
    & quietGrub
    & Apt.installed ["icedove"]

A few years ago I made an effort to dismantle my little empire of computers all talking to each other because I was sinking too much time into maintaining and fiddling with their configurations. I moved all my stuff to a hosting provider as part of this effort. Now I’ve gone back the other way, but with a new purpose in mind. It’s about the lessons that can be learnt and taught by doing more with less in the face of contemporary consumerism.

Posted Tue 24 Nov 2015 18:48:43 UTC


My girlfriend took this photo of a striking tree in Seoul yesterday.

Posted Tue 17 Nov 2015 22:40:11 UTC