
On Friday night I attended a talk by Sherry Turkle called “Reclaiming Conversation: The Power of Talk in a Digital Age”. Here are my notes.


Turkle is an anthropologist who interviews people from different generations about their communication habits. She has observed cross-generational changes thanks to (a) the proliferation of instant messaging apps such as WhatsApp and Facebook Messenger; and (b) fast web searching from smartphones.

Her main concern is that conversation is being trivialised. Consider six or seven college students eating a meal together. Turkle’s research has shown that the etiquette among such a group has shifted such that so long as at least three people are engaged in conversation, others at the table feel comfortable turning their attention to their smartphones. But then the topics of verbal conversation will tend away from serious issues – you wouldn’t talk about your mother’s recent death if anyone at the table was texting.

There are also studies that purport to show that the visibility of someone’s smartphone causes them to take a conversation less seriously. The hypothesis is that the smartphone is a reminder of all the other places they could be, instead of with the person they are with.

A related cause of the trivialisation of conversation is that people are far less willing to make themselves emotionally vulnerable by talking about serious matters. People have a high degree of control over the interactions that take place electronically (they can think about their reply for much longer, for example). Texting is not open-ended in the way a face-to-face conversation is. People are unwilling to give up this control, so they choose texting over talking.

What is the upshot of these two respects in which conversation is being trivialised? Firstly, there are psycho-social effects on individuals, because people are missing out on opportunities to build relationships. But secondly, there are political effects. Disagreeing about politics immediately makes a conversation quite serious, and people just aren’t having those conversations. This contributes to polarisation.

Note that this is quite distinct from the problems of fake news and the bubbling effects of search engine algorithms, including Facebook’s news feed. It would be much easier to tackle fake news if people talked about it with people around them who would be likely to disagree with them.

Turkle understands connection as a capacity for solitude and also for conversation. The drip feed of information from the Internet prevents us from using our capacity for solitude. But then we fail to develop a sense of self. Then when we finally do meet other people in real life, we can’t hear them because we just use them to try to establish a sense of self.

Turkle wants us to be more aware of the effects that our smartphones can have on conversations. People very rarely take their phone out during a conversation because they want to escape from that conversation. Instead, they think that the phone will contribute to that conversation, by sharing some photos, or looking up some information online. But once the phone has come out, the conversation almost always takes a turn for the worse. If we were more aware of this, we would have access to deeper interactions.

A further respect in which the importance of conversation is being downplayed is in the relationships between teachers and students. Students would prefer to get answers by e-mail than build a relationship with their professors, but of course they are expecting far too much of e-mail, which can’t teach them in the way interpersonal contact can.

All the above is, as I said, cross-generational. Something that is unique to millennials and below is that we seek validation for the way that we feel using social media. A millennial is not sure how they feel until they send a text or make a broadcast (this makes them awfully dependent on others). Older generations feel something, and then seek out social interaction (presumably to share, but not in the social media sense of ‘share’).

What does Turkle think we can do about all this? She had one positive suggestion and one negative suggestion. In response to student or colleague e-mails asking for something that ought to be discussed face-to-face, reply “I’m thinking.” And you’ll find they come to you. She doesn’t want anyone to write “empathy apps” in response to her findings. For once, more tech is definitely not the answer.

Turkle also made reference to the study reported here and here and here.

Posted Tue 07 Feb 2017 02:52:57 UTC

The Sorcerer’s Code

This is a well-written biographical piece on Stallman.

Posted Tue 31 Jan 2017 16:37:31 UTC

There have been two long threads on the debian-devel mailing list about the representation of the changes to upstream source code made by Debian maintainers. Here are a few notes for my own reference.

I spent a lot of time defending the workflow I described in dgit-maint-merge(7) (which was inspired by this blog post). However, I came to be convinced that there is a case for a manually curated series of patches for certain classes of package. It will depend on how upstream uses git (rebasing or merging) and on whether the Debian delta from upstream is significant and/or long-standing. I still think that we should be using dgit-maint-merge(7) for leaf or near-leaf packages, because it saves so much volunteer time that can be better spent on other things.

When upstream does use a merging workflow, one advantage of the dgit-maint-merge(7) workflow is that Debian’s packaging is just another branch of development.

Now consider packages where we do want a manually curated patch series. It is very hard to represent such a series in git. The only natural way to do it is to continually rebase the patch series against an upstream branch, but public branches that get rebased are not a good idea. The solution that many people have adopted is to represent their patch series as a folder full of .diff files, and then use gbp pq to convert this into a rebasing branch. This branch is not shared. It is edited, rebased, and then converted back to the folder of .diff files, the changes to which are then committed to git.
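The round trip that gbp pq automates can be sketched with plain git on a throwaway repository. This shows the mechanism, not what gbp pq literally runs; the names and file contents are made up:

```shell
#!/bin/sh
# Illustrative round trip: folder of patch files <-> temporary rebasing branch.
set -e
tmp=$(mktemp -d)
trap 'rm -rf "$tmp"' EXIT
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.org
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.org
git init -q -b master "$tmp/pkg"
cd "$tmp/pkg"
echo 'int main(void) { return 0; }' > hello.c
git add hello.c
git commit -q -m 'upstream source'

# Make a Debian change on a temporary rebasing branch ...
git checkout -q -b patch-queue
sed -i 's/return 0/return 42/' hello.c
git commit -q -am 'Fix exit status'

# ... and export it as a folder full of patch files, which is what
# actually gets committed to the packaging branch.
mkdir -p debian/patches
git format-patch -o debian/patches master..patch-queue >/dev/null
git checkout -q master
git add debian/patches
git commit -q -m 'update patch series'

# Later, reconstruct a fresh rebasing branch from the files in order
# to edit the series, then export it again as above.
git checkout -q -b patch-queue2 master
git am -q debian/patches/*.patch
```

The patch-queue branches are never pushed; only the contents of debian/patches are shared.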

One of the advantages of dgit is that there now exists an official, non-rebasing git history of uploads to the archive. It would be nice if we could represent curated patch series as branches in the dgit repos, rather than as folders full of .diff files. But as I just described, this is very hard. However, Ian Jackson has the beginnings of a workflow that just might fit the bill.

Posted Mon 09 Jan 2017 20:14:52 UTC

Burkeman: Why time management is ruining our lives

Over the past semester I’ve been trying to convince one graduate student and one professor in my department to use Inbox Zero to get a better handle on their e-mail inboxes. The goal is not to be more productive. The two of them get far more academic work done than I do. However, both of them are far more stressed than I am. And in the case of the graduate student, I have to add items to my own to-do list to chase up e-mails that I’ve sent him, which only spreads this stress and tension around.

The graduate student sent me this essay by Oliver Burkeman about how these techniques can backfire, creating more stress, tension and anxiety. It seems to me that this happens when we think of these techniques as having anything to do with productivity. Often people will say “use this technique and you’ll be less stressed, more productive, and even more productive because you’re less stressed.” Why not just say “use this technique and you’ll be less anxious and stressed”? This is a refusal to treat lower anxiety as merely a means to some further end. People can autonomously set their own ends, and they’ll probably do a better job of this when they’re less anxious. Someone offering a technique to help with their sense of being overwhelmed need not tell them what to do with their new calm.

It might be argued that this response to Burkeman fails to address the huge sense of obligation that an e-mail inbox can generate. Perhaps the only sane response to this infinite to-do list is to let it pile up. If we follow a technique like Inbox Zero, don’t we invest our inbox with more importance than it has? As in a lot of areas of life, the issue is that the e-mails that will advance truly valuable projects and relationships, projects of both ourselves and of others, are mixed in with reams of stuff that doesn’t matter. We face this situation whenever we go into a supermarket, or wonder what to do during an upcoming vacation. In all these situations, we have a responsibility to learn how to filter the important stuff out, just as we have a responsibility to avoid reading celebrity gossip columns when we are scanning through a newspaper. Inbox Zero is a technique to do that filtering in the case of e-mail. Just letting our inbox pile up is an abdication of responsibility, rather than an intelligent response to a piece of technology that most of the world abuses.

Posted Sat 31 Dec 2016 08:34:29 UTC

Programming by poking: why MIT stopped teaching SICP

Perhaps there is a case for CS programs keeping pace with workplace technological changes (in addition to developments in the academic field of CS), but it seems sad to deprive undergrads of deeper knowledge about language design.

Posted Sun 18 Dec 2016 21:34:15 UTC

A new postdoc arrived at our department this semester, and after learning that he uses GNU/Linux for all his computing, I invited him along to TFUG. During some of our meetings people asked “how could I do X on my GNU/Linux desktop?” and, jokingly, the postdoc would respond “the answer to your question is ‘do you really need to do that?’” Sometimes the more experienced GNU/Linux users at the table would respond to questions by suggesting that the user should simply give up on doing X, and the postdoc would slap his thigh and laugh and say “see? I told you that’s the answer!”

The phenomenon here is that people who have at some point made a commitment to at least try to use GNU/Linux for all their computing quickly find that they have come to value using GNU/Linux more than they value engaging in certain activities that only work well/at all under a proprietary operating system. I think that this is because they get used to being treated with respect by their computer. And indeed, one of the reasons I’ve almost entirely given up on computer gaming is that computer games are non-free software. “Are you sure you need to do that?” starts sounding like a genuine question rather than simply a polite way of saying that what someone wants to do can’t be achieved.

I suggest that this is a blessing in disguise. Most of the things you can only do under a proprietary operating system are things you would do well to swap for other activities. I’m not suggesting that switching to GNU/Linux is a good way to give up on the entertainment industry; it’s a good way of moderating your engagement with it. Rather than logging onto Netflix, you might instead pop in a DVD of a movie. You can still engage with contemporary popular culture, but the technical barriers give you an opportunity to moderate your consumption: once you’ve finished watching the movie, the software won’t try to get you to watch something else by calculating what you’re most likely to assent to watching next based on what you’ve watched before. This behaviour of the Netflix software is just another example of non-free software working against its user’s interests: watching a movie is good for you, but binge-watching a TV series probably isn’t. In cases like this, living in the world of Free Software makes it easier to engage with media healthily.

Posted Wed 28 Sep 2016 15:25:09 UTC
Posted Wed 21 Sep 2016 00:02:42 UTC

When it rains in Tucson, people are able to take an unusually carefree attitude towards it. Although the storm is dramatic, and the amount of water means that the streets turn to rivers, everyone knows that it will be over in a few hours and the heat will return (and indeed, that’s why drain provision is so paltry).

In other words, despite the arresting thunderclaps, the weather is not threatening. By contrast, when there is a storm in Britain, one feels a faint primordial fear that one won’t be able to find shelter after the storm, in the cold and sodden woods and fields. Here, that threat just isn’t present. I think that’s what makes us feel so free to move around in the rain.

I rode my bike back from the gym in my $5 plastic shoes. The rain hitting my body was cold, but the water splashing up my legs and feet was warm thanks to the surface of the road—except for one area where the road was steep enough that the running water had already taken away all lingering heat.

Posted Wed 17 Aug 2016 03:28:13 UTC

I maintain Debian packages for several projects which are hosted on GitHub. I have a master packaging branch containing both upstream’s code, and my debian/ subdirectory containing the packaging control files. When upstream makes a new release, I simply merge their release tag into master: git merge 1.2.3 (after reviewing the diff!).
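The mechanics here are plain git merging. A self-contained sketch using throwaway repositories follows; the names, tags, and file contents are all illustrative, not a real package:

```shell
#!/bin/sh
# Sketch of the merge-based packaging workflow, on throwaway repos.
set -e
tmp=$(mktemp -d)
trap 'rm -rf "$tmp"' EXIT
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.org
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.org

# A toy "upstream" repository that tags releases.
git init -q -b master "$tmp/upstream"
cd "$tmp/upstream"
echo hello > README
git add README
git commit -q -m 'initial release'
git tag 1.2.3

# The packaging branch: upstream's code plus a debian/ directory.
git clone -q "$tmp/upstream" "$tmp/pkg"
cd "$tmp/pkg"
git remote add upstream "$tmp/upstream"   # normally a GitHub URL
mkdir debian
echo 'Source: demo' > debian/control
git add debian
git commit -q -m 'add Debian packaging'

# Upstream makes a new release ...
cd "$tmp/upstream"
echo world >> README
git commit -q -am 'a fix'
git tag 1.2.4

# ... and we review the diff, then merge the release tag into master.
cd "$tmp/pkg"
git fetch -q upstream --tags
git diff master 1.2.4 -- . ':!debian'     # review the incoming changes
git merge -q -m 'Merge upstream release 1.2.4' 1.2.4
```

After the merge, master contains both the new upstream code and the debian/ directory, ready for the next upload.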

Packaging things for Debian turns out to be a great way to find small bugs that need to be fixed, and I end up forwarding a lot of patches upstream. Since the projects are on GitHub, that means forking the repo and submitting pull requests. So I end up with three remotes:

origin
    the Debian git server
upstream
    upstream’s GitHub repo from which I’m getting the release tags
fork
    my GitHub fork of upstream’s repo, where I’m pushing bugfix branches

I can easily push individual branches to particular remotes. For example, I might say git push -u fork fix-gcc-6. However, it is also useful to have a command that pushes everything to the places it should be: pushes bugfix branches to fork, my master packaging branch to origin, and definitely doesn’t try to push anything to upstream (recently an upstream project gave me push access because I was sending so many patches, and then got a bit annoyed when I pushed a series of Debian release tags to their GitHub repo by mistake).

I spent quite a lot of time reading git-config(1) and git-push(1), and came to the conclusion that there is no combination of git settings and a push command that do the right thing in all cases. Candidates, and why they’re insufficient:

git push --all
    I thought about using this with the remote.pushDefault and branch.*.pushRemote configuration options. The problem is that git push --all pushes to only one remote, which it selects by looking at the current branch. If I ran this command for every remote, it would push everything everywhere.
git push <remote> : for each remote
    This is the “matching push strategy”. It will push all branches that already exist on the remote with the same name. So I thought about running this for each remote. The problem is that I typically have different master branches on different remotes. The fork and upstream remotes have upstream’s master branch, and the origin remote has my packaging branch.

I wrote a perl script implementing git push-all, which does the right thing. As you will see from the description at the top of the script, it uses remote.pushDefault and branch.*.pushRemote to determine where it should push, falling back to pushing to the remote the branch is tracking. It won’t push a branch when all three of these are unspecified, and more generally, it won’t create new remote branches except in the case where the branch-specific setting branch.*.pushRemote has been specified. Magit makes it easy to set remote.pushDefault and branch.*.pushRemote.
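As a concrete sketch of the configuration the script consults, using the remotes and the bugfix branch from the example above (the URLs are made up):

```shell
#!/bin/sh
# Per-repository push configuration, set up in a throwaway repo.
set -e
tmp=$(mktemp -d)
trap 'rm -rf "$tmp"' EXIT
git init -q -b master "$tmp/pkg"
cd "$tmp/pkg"
git remote add origin https://git.example.org/pkg.git
git remote add fork https://github.com/example/project.git

# By default, everything is pushed to origin ...
git config remote.pushDefault origin

# ... but this bugfix branch is explicitly pushed to my fork.
git checkout -q -b fix-gcc-6
git config branch.fix-gcc-6.pushRemote fork
```

With this in place, git push-all would send fix-gcc-6 to fork and branches that already exist on origin to origin, and would never create branches on upstream.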

I have this in my ~/.mrconfig:

git_push = git push-all

so that I can just run mr push to ensure that all of my work has been sent where it needs to be (see myrepos).

#!/usr/bin/perl

# git-push-all -- intelligently push most branches

# Copyright (C) 2016 Sean Whitton
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or (at
# your option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

# Prerequisites:

# The Git::Wrapper, Config::GitLike, and List::MoreUtils perl
# libraries.  On a Debian system,
#     apt-get install libgit-wrapper-perl libconfig-gitlike-perl \
#         liblist-moreutils-perl

# Description:

# This script will try to push all your branches to the places they
# should be pushed, with --follow-tags.  Specifically, for each branch,
#
# 1. If branch.pushRemote is set, push it there
#
# 2. Otherwise, if remote.pushDefault is set, push it there
#
# 3. Otherwise, if it is tracking a remote branch, push it there
#
# 4. Otherwise, exit non-zero.
#
# If a branch is tracking a remote that you cannot push to, be sure to
# set at least one of branch.pushRemote and remote.pushDefault.

use strict;
use warnings;
no warnings "experimental::smartmatch";

use Git::Wrapper;
use Config::GitLike;
use List::MoreUtils qw{ uniq apply };

my $git = Git::Wrapper->new(".");
my $config = Config::GitLike->new( confname => 'config' );
$config->load_file('.git/config');

my @branches = apply { s/[ \*]//g } $git->branch;
my @allBranches = apply { s/[ \*]//g } $git->branch({ all => 1 });
my $pushDefault = $config->get( key => "remote.pushDefault" );

my %pushes;

foreach my $branch ( @branches ) {
    my $pushRemote = $config->get( key => "branch.$branch.pushRemote" );
    my $tracking = $config->get( key => "branch.$branch.remote" );

    if ( defined $pushRemote ) {
        print "I: pushing $branch to $pushRemote (its pushRemote)\n";
        push @{ $pushes{$pushRemote} }, $branch;
    # don't push unless it already exists on the remote: this script
    # avoids creating branches
    } elsif ( defined $pushDefault
              && "remotes/$pushDefault/$branch" ~~ @allBranches ) {
        print "I: pushing $branch to $pushDefault (the remote.pushDefault)\n";
        push @{ $pushes{$pushDefault} }, $branch;
    } elsif ( !defined $pushDefault && defined $tracking ) {
        print "I: pushing $branch to $tracking (probably to its tracking branch)\n";
        push @{ $pushes{$tracking} }, $branch;
    } else {
        die "E: couldn't find anywhere to push $branch";
    }
}

foreach my $remote ( keys %pushes ) {
    my @branches = @{ $pushes{$remote} };
    system "git push --follow-tags $remote @branches";
    exit 1 if ( $? != 0 );
}
Posted Sat 06 Aug 2016 22:45:16 UTC

This doesn’t appear to cover the other kind of comment-moderation problem: that where overmoderation and attachment to poster identity leads to an environment of stifling conventionalism.

Photography communities in particular (e.g. flickr, instagram, 500px) are vulnerable to turning into circlejerks where no-one is willing to say what they mean for fear of appearing the negative nancy (no pun intended) and where high post-count contributors’ poorly-supported opinions become elevated above said views’ merits. In such communities the typical discussion is at the level of tepid platitude: “good exposure!”, “nice depth of field!”, or “cool HDR!”. On the other end of the scale there’s the imageboard style of community where anonymity is the norm, feedback is uncompromisingly harsh, and uselessly opaque criticism appears such on its face; unsuited to the overly sensitive but hideously valuable to the advancing novice.

Ordinary web forums, with tools oriented towards a punitive “he said the n-word! delete his account and everything he’s posted! persona non grata, in damnatio memoriae!” school of moderation, strongly tend to the former.

ksandstr on LWN

Posted Sat 30 Jul 2016 23:16:34 UTC