Handling High Email Volume with sup

Over the last year, the openstack-dev mailing list has averaged
2500 messages every month. Staying on top of that much email can be
challenging, especially with some of the consumer-grade email clients
available today. I’ve recently upgraded my email setup to use sup, a
terminal-based mail client that is helping me process the mailing
list, and even keep up with gerrit at the same time.

The Story Gets Worse Before It Gets Better

Last summer I moved all of my email processing from Google’s GMail
service to an account under my own domain hosted by FastMail. I liked
GMail’s web interface, but I had grown tired of some incompatibilities
with backup tools and other services. FastMail’s standard IMAP servers
fixed those issues, and I have been happy with the service.

This fall, however, after upgrading my laptop to Yosemite I started
seeing issues with mailing list threads being improperly combined so
that completely unrelated conversations all appear to be part of the
same thread. I traced the problem, thanks to several other Yosemite
users I know, to issues with Mail.app, the MUA that ships with
OS X. At first the problem was isolated to just a few threads. Then it
expanded to a lot of my mailing lists, and finally it started
affecting my inbox. There doesn’t seem to be any way to prevent Mail
from getting confused, or to fix it once it is.

I decided I needed a new mail client, and after reviewing the options
I decided I wasn’t going to find one that met all of my needs. The
problem first showed up with mailing lists, and while not all of the
poorly-threaded messages were from lists, all of the threads involved did
include list messages. I tend to treat mailing lists differently than
my regular mail anyway, automatically filing the messages into folders
with mail filters and then reading them in batches. So I decided to set
up a second email client just for reading mailing lists.

Looking At Options

One option I considered was going back to using GMail for the mailing
lists. I haven’t completely ruled this out, but I like having all of
my mail in one account so I don’t have to remember which one to use
when sending messages. I also don’t want messages sent directly to me
to end up in a never-never land by dropping them in an inbox I don’t
check frequently.

I used the FastMail web UI for a few months while I researched, and
it’s not terrible, but I felt I could be doing better. The key binding
support isn’t complete, so I do have to mix keyboard and mouse
commands. That left me looking for other desktop clients.

Several community members recommended mutt, but it looked more complex
than I needed. If I were going to move all of my email to a new
client, mutt would be a better option, but since I am only reading
mailing lists this way it felt like overkill.

A couple of people I talked to used sup, which is a terminal-based
app that has the same “archive and forget” behavior as GMail. I like
that workflow, and the relative simplicity of the most basic setup, so
I spent some time this week configuring it to run on my laptop. After
a couple of false starts, mostly because I’m on OS X and the most
recent instructions are for Linux, I have it working and have been
using it for a few days. I’m already happier with the email workflow,
so I’m documenting the steps I went through to set it up in case
someone else wants to give it a try.

Set up offlineimap

The first step for most of the terminal-based clients is to install a
program to sync all of your email from the mail server to local
files. Desktop clients like Mail.app have this feature built in, but
most terminal apps don’t. Offlineimap is a good solution for syncing
in both directions, so that messages you read locally are marked as
read on the server as well.

Homebrew includes a package, so the first step is to install the code
through brew:

$ brew update
$ brew install offlineimap

The next step is to set up the configuration file. The documentation
on the project site gives a complete description of all of the
options, including setting up accounts and repositories. I’ll just
focus on a few of the unusual values I changed.

I already have the mail server configured to sort messages into
mailboxes based on the mailing list id, all under an OpenStack
parent folder. I also have some OpenStack mail archive folders there
for messages that didn’t go through a mailing list. I only want those
folders synced, so I set folderfilter to include those and my sent
mail box.

folderfilter = lambda foldername: foldername.startswith('INBOX.OpenStack') or foldername in ['INBOX.Sent Items']

I had trouble adding the resulting maildirs to sup because of the
spaces in the names, so I also added a name translation function to
nametrans to remove the spaces. While I was at it, I removed the
INBOX. prefix.

nametrans = lambda foldername: re.sub('^INBOX.', '', foldername).replace(' ', '_')
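
For context, here is roughly how those two settings fit into a minimal
~/.offlineimaprc. The account name, local path, host, and user shown
here are placeholders rather than my actual configuration:

[general]
accounts = FastMail

[Account FastMail]
localrepository = FastMail-Local
remoterepository = FastMail-Remote

[Repository FastMail-Local]
type = Maildir
localfolders = ~/Mail

[Repository FastMail-Remote]
type = IMAP
remotehost = imap.example.com
remoteuser = me@example.com
ssl = yes
folderfilter = lambda foldername: foldername.startswith('INBOX.OpenStack') or foldername in ['INBOX.Sent Items']
nametrans = lambda foldername: re.sub('^INBOX.', '', foldername).replace(' ', '_')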

The result is a copy of each of my mailboxes saved under ~/Mail in
maildir format. Using maildir instead of mbox means sup and
offlineimap can modify different messages in the same mailbox without
conflicting, and syncing is faster than it would be with mbox.

The next step was to run offlineimap to have it download all of my
current messages. This took a while, since I have about 6 months of
list traffic archived (another nice feature of FastMail is the
auto-purge setting on each mailbox, which means I don’t have to delete
old list messages myself).
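
Running the initial sync is just a matter of invoking the command with
no arguments and waiting:

$ offlineimap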

Doing the initial sync manually will help uncover connection issues or
other configuration problems, but you don’t want to do it by hand
every time you run it. Instead, you want to set up a cron or launchctl
job to sync your mail regularly. The Homebrew package for offlineimap
includes instructions for setting up a launchctl job, so that’s what
I did.

$ ln -sfv /usr/local/opt/offline-imap/*.plist ~/Library/LaunchAgents
$ launchctl load ~/Library/LaunchAgents/homebrew.mxcl.offline-imap.plist

Set up msmtp

The IMAP protocol is useful for reading messages and accessing
mailboxes on a server, but to send messages you need an SMTP client. The
sup wiki explains how to configure msmtp, so that’s what I went with.

$ brew install msmtp

FastMail uses authentication for its SMTP servers, so only customers
can use them to send mail directly. msmtp was happy to read my
credentials from the keychain, so I didn’t have to include them in
clear text in its configuration file. The only tricky part of setting
that up was that the msmtp docs were not quite right about how to
create the keychain entry: the account field of the keychain entry
should match the user setting from the msmtp configuration file.
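
For reference, the resulting ~/.msmtprc ends up looking something like
this sketch. The host, port, and addresses are placeholders, and there
is no password line because msmtp falls back to the keychain entry:

defaults
auth           on
tls            on
# certificate bundle location varies by system
tls_trust_file /etc/ssl/cert.pem

account        fastmail
host           smtp.example.com
port           587
from           me@example.com
user           me@example.com

account default : fastmail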

After configuring, I used this command to test:

$ echo test | Mail -s "test on $(date)" my-address@here

I did find that the mail server would deliver some of the messages to
individual addresses, but not to the mailing list. I’m not sure if the
list’s spam filter was blocking them, or if it was happening at
FastMail. I found that setting maildomain to my domain fixed the
problem.

maildomain     doughellmann.com

Set up sup

sup is written in Ruby, so I had to install Ruby and a few other
related tools before it would work. I’m not that familiar with any of
the tools, so I followed instructions I found in the sup
documentation for installing on OS X. It’s possible there is a
simpler way to do this part.

First, install a recent version of ncurses with wide character
support:

$ brew tap homebrew/dupes
$ brew install ncurses
$ brew doctor
$ brew link ncurses --force

Next, following https://rvm.io/rvm/install, install rvm and use it
to install ruby:

$ curl -sSL https://get.rvm.io | bash -s stable --ruby
$ rvm install 2.2.0
$ rvm use 2.2.0

Now it’s possible to install the gems needed by sup. Installing them
all together failed, but installing a couple of the dependencies first
and then installing sup worked fine.

$ gem install xapian-ruby
$ gem install gpgme
$ gem install sup

At this point all of the software is installed, so the next step is to
configure sup to tell it where your email is and how to work with
it. The sup-config command offers a step-by-step setup guide. I
used that for most of the settings, but did not add the mail sources
(more on why in a bit).

$ sup-config

After the basic configuration was done, I edited
~/.sup/config.yaml so that sup would save my outgoing mail to the
Sent_Items mailbox managed by offlineimap and so it would update the
flags on messages (to indicate they had been read, for example) in the
source maildirs. I also updated the sendmail command for the account
to point to msmtp.

:sent_source: maildir:/Users/dhellmann/Mail/Sent_Items
:sync_back_to_maildir: true
:accounts:
  :default:
    :name: Doug Hellmann
    :email: doug@doughellmann.com
    :sendmail: "msmtp -t --read-envelope-from"
    :signature: "/Users/dhellmann/.signature"
    :gpgkey: ''

Next, I used sup-add to add the mail being downloaded by
offlineimap. First I created the source for the sent folder, using
a special option to tell sup not to add messages from that folder to
my inbox:

$ sup-add --archive maildir:/Users/dhellmann/Mail/Sent_Items

Then I added the other mailboxes without the --archive flag,
giving some of them a flag to automatically add labels to the messages
in that mailbox. For example, the mailbox with all of the messages
from gerrit was added with:

$ sup-add -l code-review maildir:/Users/dhellmann/Mail/OpenStack.Code_Review

After all of the mailboxes were added, I ran sup-sync-back-maildir
so that sup would sync the status of messages back to the maildir
copies, letting offlineimap update their status on the mail server.

$ sup-sync-back-maildir -u

And finally I imported all of the messages into sup’s index using
sup-sync. Because all of the messages were read and archived
already, I used the --read and --archive options to set them
that way in the index as well (otherwise I would have had something
like 25,000 unread threads in my inbox).

$ sup-sync --read --archive

Now running sup gives a view of the unprocessed messages. I can
navigate through them using vi keystrokes, kill uninteresting threads,
archive read messages, and reply as needed. See the sup home page for
screenshots and documentation about navigation commands.

Results

Even though all of the messages are separated into different folders
on the mail server, sup shows them all as part of one inbox. This lets
me pay attention to bug report emails and gerrit notifications again,
which I had been ignoring because of the volume. Since they’re
trickling in a few at a time, and interleaved between other messages,
I’m finding it easier to keep up with them. If that changes, I may
drop those folders from my offlineimap configuration and go back to
looking for updates online separately.

Astronomy Picture of the Day 3.4

The AstronomyPictureOfTheDay desktop wallpaper updater is a Mac OS
X application that downloads the latest image from the Astronomy
Picture of the Day site
and sets it as your desktop background. It
can be run by hand, or installed to run automatically through iCal.

What’s New in this Release?

This version fixes a problem with APOD pages that link to multiple
images so that only images on the APOD website are downloaded.

Download

Visit the project page to download the latest version.

Astronomy Picture of the Day 3.3

The AstronomyPictureOfTheDay desktop wallpaper updater is a Mac OS
X application that downloads the latest image from the Astronomy
Picture of the Day site
and sets it as your desktop background. It
can be run by hand, or installed to run automatically through iCal.

What’s New in this Release?

This version fixes a problem with APOD pages that link to multiple
images so that only the main image on the page is downloaded.

Download

Visit the project page to download the latest version.

iTunes 9.1 Smart Playlist Order Fix

Today I reset my 30GB video iPod and all of the smart playlists broke.
Some were sorted in the wrong order (by name), some had no contents, and
some were missing entirely.

I found a few references to the same problem, but none of those
solutions worked for me. What did seem to work was to
set iTunes to sync all copies of all podcasts (the default was to sync
only unplayed).

you gotta love backwards compatibility

A friend of mine recently found an old floppy disk created under OS 8
or 9 in the early 1990’s. There was a letter on the disk that she wanted
copied off, but she doesn’t have a Mac any more.

No problem, I figured. I did a little research and found hfsutils,
and thought all I would need to do was stick the disk in my Linux box and
grab the files. No such luck, since I don’t have a floppy drive in any
of my systems.

After a little more thought, I remembered the old rawwrite.exe
utility for creating bootable floppies under DOS. Sure enough, there’s a
rawread.exe and I was able to make an image of the floppy disk on a
Windows box.

Although the next step was going to be to copy that image file to a
Linux system to try to mount it, I decided to try to open it on my Mac
(running Leopard) first, just for grins. I renamed the file to end in
.img, and it mounted right up. All of her files were there, and Finder
even preserved the layout of the icons in the folders.

Of course, the files themselves were created with some version of
WordPerfect that no longer exists, so our data recovery efforts only
went so far as to get the text of the letter without its formatting. I’m
still impressed that a modern Mac that doesn’t even have a floppy drive
could open the old disk image to begin with.

Automatically back up thumb drives on your Mac

I have a couple of different thumb drives that I use as portable
working devices. The data on them is important, so I wanted to back them
up. Today I worked out how to copy the contents of the USB drive to a
folder on my hard drive every time the USB drive is inserted into the
computer.

The two technologies I used to accomplish this are Folder Actions
and AppleScript. The first step is to use Script Editor to save the
folder action script as
~/Library/Scripts/Folder Action Scripts/SyncThumbDrive.scpt.

Next, enable folder actions using the “Folder Actions Setup”
application and add that script as an action on the “/Volumes”
folder. Once you have done that, any time a file or directory is
added to “/Volumes” the script will be invoked. Since a new entry is
added automatically for each mounted volume, this amounts to
triggering the script every time a volume is mounted.

The script looks for a destination directory
~/Documents/ThumbDrives/$VOLUME where $VOLUME is the name of
the drive inserted. You need to create the directory before inserting
the drive because the script uses the presence of the directory as
confirmation that it should copy the updates to the files on the thumb
drive over to the hard drive.
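
The logic in that script boils down to something like this shell
sketch (the volume name is a placeholder, and the real thing is an
AppleScript folder action rather than a shell script):

VOLUME="MyThumbDrive"
DEST="$HOME/Documents/ThumbDrives/$VOLUME"
if [ -d "$DEST" ]; then
    # copy new and changed files; nothing is ever deleted
    rsync -a "/Volumes/$VOLUME/" "$DEST/"
    # signal completion (the real script plays the configured alert sound)
    afplay /System/Library/Sounds/Glass.aiff
fi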

After you create the destination directory, insert the thumb drive.
When the backup is complete, the computer will play your configured
alert sound.

No files are ever deleted, so if you have removed old files from the
hard drive they will re-appear until you remove them from the thumb
drive.

installing GNU gettext for use with Python on OS X

I’ve been working on my blog post about Python’s gettext module for
the past couple of mornings, and ran into a snag. The documentation
claims that the Python source distribution includes all the tools you’ll
need, but when I got to the point where I wanted to write examples of
internationalizing plural strings, pygettext.py wasn’t working.

It worked great for extracting individual string messages up to that
point, but refused to extract messages wrapped in the ungettext() call,
even when I used what seemed to be the appropriate command line options.
I ended up installing the GNU gettext tools and using xgettext instead.
Installation took me longer than I expected, so I’m documenting the
process I went through here.

Fink or MacPorts: Not so much.

Since OS X doesn’t ship with a version of gettext by default, and
there doesn’t seem to be one in the Xcode package (Apple has their own
internationalization tools), I needed to find a copy elsewhere.

Ages ago I had installed fink as an “easy” way to grab copies of
these sorts of utilities. However, it seems that somewhere along the
line the version of fink I have stopped working (probably due to
upgrading to 10.5, I haven’t tried using fink or FinkCommander directly
in some time). After wiping and reinstalling, I was pleased to find a
slightly old version (0.14.5) of gettext installed as part of the
default set of packages. Unfortunately, xgettext wasn’t included in the
package at all.

Next, I grabbed a copy of MacPorts, a competitor to fink. Although
I’ve been warned off of MacPorts by a few people I trust, others have
had no problems with it. Installation was fairly easy, and as with fink
it installed everything to its own directory tree (under
/opt/local). Once I had the port program installed, the next step
was to run:

$ sudo port install gettext

It downloaded several dependencies, patched the source, compiled
everything, and installed it. Voila! Well, not so much.

Even though the most current version of gettext (0.17) was installed,
and the documentation clearly described the included Python language
support, the binary refused to recognize any language other than C.

Scratch that.

Compiling from Source: Partial Success

Since MacPorts had to download the package and compile it anyway, I
decided to go ahead and do that on my own. I was a little wary because I
wasn’t exactly sure what port was doing in its “patching” step, but I
thought I would give it a try anyway. I snagged the most recent tarball
from the gettext site and ran the usual

$ ./configure
$ make

That gave me a binary for xgettext inside the source directory, and
testing it against my Python source yielded the results I wanted. A
simple .pot file was extracted with the original message strings and
placeholders for singular and plural translations.
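
An invocation along these lines does the extraction (the file names
and message strings here are made up; the --keyword option tells
xgettext which arguments of ungettext() hold the singular and plural
forms):

$ xgettext --language=Python --keyword=_ --keyword=ungettext:1,2 -o messages.pot example.py

The plural entries in the resulting .pot file include both forms, with
empty placeholders for the translations:

msgid "%(num)d file deleted"
msgid_plural "%(num)d files deleted"
msgstr[0] ""
msgstr[1] ""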

Next, I thought I’d get clever and install the results into the
virtualenv I use for working on PyMOTW. Re-configuring with --prefix
set to $VIRTUAL_ENV, rebuilding, then running make install copied
the binaries and a bunch of associated data files right where I expected
them to be. And the binary only recognized the C language.

After a little more fighting, I did manage to get it working by
installing with a prefix of $VIRTUAL_ENV/gettext and adding
$VIRTUAL_ENV/gettext/bin to my PATH. I’m not sure if the problem was
solved by clearing out an older, bad version of xgettext from elsewhere
in my path or if using $VIRTUAL_ENV as the prefix somehow confused the
install script.
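
In other words, the sequence that finally worked was roughly:

$ ./configure --prefix=$VIRTUAL_ENV/gettext
$ make
$ make install
$ export PATH=$VIRTUAL_ENV/gettext/bin:$PATH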

Conclusion: I think I understand why internationalization is
frequently the last feature dealt with in a project.

Python Documentation Power-User Tip

If you find yourself referencing the Python standard library
documentation a lot while you’re programming, you should set up a
keyword bookmark in Firefox. I haven’t seen this feature talked about
very much, so maybe everyone just knows about it, but I find it saves me
a ton of time so I wanted to share.

Keyword Bookmarks:

Keyword bookmarks are just like regular bookmarks, but have a short
identifying word associated with them. Instead of hunting through your
bookmark list, you can just type the word into the Firefox URL field at
the top of your window.

Here’s a regular bookmark to the module index of the standard library:

[screenshot: standard bookmark]

If I add “modules” to the keywords field, like this:

[screenshot: keyword bookmark]

then when I type “modules” into the URL field, Firefox takes me
to http://docs.python.org/lib/modindex.html. No more hunting around in
my bookmarks!

Smart Keyword:

Adding the keyword is only the first step. It’s also easy to set up a
smart keyword (a keyword bookmark that takes an argument) and then
provide that argument when you use the keyword. It’s almost like having
a command line for the web right in your browser. Here’s how you do it:

  1. Bookmark a sample page, such as http://docs.python.org/lib/module-compiler.html.
  2. Edit the properties for the bookmark.
  3. Add a keyword, such as “pydoc”.
  4. Replace “compiler” with “%s”:

[screenshot: keyword bookmark with argument]

  5. Save the changes.

Now when you type something like “pydoc compiler” in the URL
bar, the browser will go directly to the doc page for that module.
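
For this example, the bookmark’s location field ends up holding a URL
template like http://docs.python.org/lib/module-%s.html, with the
“%s” standing in for whatever you type after the keyword.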

Quicksilver:

If you are on a Mac, Firefox keyword bookmarks also work with
Quicksilver.

Regular keyword bookmarks show up in Quicksilver searches, so you can
type Cmd-Space, “modules”, Return and Firefox opens the module
index. If you use the “pydoc” keyword, Quicksilver will prompt you
for the argument before launching the browser. So using the bookmark we
created above, a documentation lookup is:

Cmd-Space, “pydoc”,

[screenshot: Quicksilver keyword bookmark]

Return, “compiler”,

[screenshot: Quicksilver keyword bookmark argument]

Return, and wait for the new browser window to show the
documentation.

Django with PostgreSQL on Mac OS X Leopard

Previously, I discussed the steps I went through to get PostgreSQL
working on Tiger. This weekend I upgraded my system to a new MacBook
Pro running Leopard.

PPC -> x86

Although the Migration Assistant copied the version of PostgreSQL I
had previously installed to the new machine, the results didn’t work
because the service would not start correctly. I ended up reinstalling
using the Unified Installer from PostgreSQL for Mac, and the server
still wouldn’t start. I deleted the old database and re-initialized it
(thanks to hints in some instructions from Russell Brooks) and that
took care of the problem. I’m not sure if there was any way for me to
convert the data, but I didn’t have anything important in the database
that I couldn’t re-create, so it was fine to start from scratch.

Xcode

The next step was to install Xcode 3. That was easy, with the package
installer from Apple.

psycopg

And then I went back to battle with my old nemesis psycopg. I should
probably have taken Steve Holden’s advice in the comments on my earlier
post and just used psycopg2 instead. I didn’t because that would have
meant upgrading my production server, too, since the whole point of
using PostgreSQL instead of SQLite in development is that the back-end
adapters in django produce different SQL for the same QuerySet.

To configure psycopg, I had to set LDFLAGS to include the directory
with the crt1.10.5.0 library. It’s installed in what looks like it
should be a standard library directory for the Xcode gcc, but ld
couldn’t find it during the “create an executable” test.

Then when running make I had the same problem I had seen under
PPC:

$ make
gcc  -DNDEBUG -g -O3  -I/Users/dhellmann/Devel/AthensDocket/bin/../include/python2.5
-I/Users/dhellmann/Devel/AthensDocket/bin/../lib/python2.5/config -DPACKAGE_NAME="psycopg"
-DPACKAGE_TARNAME="psycopg" -DPACKAGE_VERSION="1.1.21" -DPACKAGE_STRING="psycopg 1.1.21"
-DPACKAGE_BUGREPORT="psycopg@lists.initd.org" -DHAVE_LIBCRYPTO=1 -DHAVE_ASPRINTF=1
-I/Library/PostgreSQL8/include -I/Library/PostgreSQL8/include/postgresql/server
-I../egenix-mx-base-3.0.0/mx/DateTime/mxDateTime -DHAVE_PQFREENOTIFY -DNDEBUG -D_REENTRANT
-D_GNU_SOURCE -DPOSTGRESQL_MAJOR=8 -DPOSTGRESQL_MINOR=2   -c ././module.c -o ./module.o
In file included from ././module.c:33:
././module.h:30:20: error: Python.h: No such file or directory

I thought this was related to using virtualenv, but it didn’t work
correctly outside of the virtualenv setting this time (for some reason,
it did on the PPC laptop). It turns out the error message was correct
and configure/gcc just couldn’t tell where the Python headers were. The
configure command that let me compile was:

$ CPPFLAGS=-I/Library/Frameworks/Python.framework/Headers \
> LDFLAGS=-L/Developer/SDKs/MacOSX10.5.sdk/usr/lib \
> ./configure --with-postgres-libraries=/Library/PostgreSQL8/lib \
> --with-postgres-includes=/Library/PostgreSQL8/include \
> --with-mxdatetime-includes=../egenix-mx-base-3.0.0/mx/DateTime/mxDateTime

Then, oddly enough, when I did my make install step, the
psycopg.so was copied to $VIRTUAL_ENV/bin instead of
$VIRTUAL_ENV/lib/python2.5/site-packages. That was easy enough to
solve by moving the file manually, and then I was able to import the
psycopg module.
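
The manual fix amounts to something like this (adjust the paths for
your own virtualenv and Python version):

$ mv $VIRTUAL_ENV/bin/psycopg.so $VIRTUAL_ENV/lib/python2.5/site-packages/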

So, after about an hour, I’m back to being able to develop with django
and PostgreSQL on OS X Leopard. Maybe now I can start enjoying some of
the new features!

Django with PostgreSQL on Mac OS X Tiger

Most of the time, when I have worked on django projects, I’ve used the
SQLite backend for development on my PowerBook and deployed to
PostgreSQL on my Linux server. For the project I’m working on right now,
though, that turned into an issue when some of the queries that ran fine
on my dev system didn’t work at all on the production box. Apparently
the backend code responsible for assembling the SQL query strings was
producing different text for SQLite and PostgreSQL. So I could avoid
similar issues in the future, I set out to install PostgreSQL on my
laptop today.

Installing PostgreSQL itself turned out to be very easy indeed. I
downloaded the universal installer from Andy Satori’s “PostgreSQL for
Mac” site. Some of the GUI clients included don’t work because I’m
on a PPC PowerBook instead of an x86 MacBook or MacBook Pro, but that’s
OK. I can use the CLI tools, which work fine.

The next thing I needed to do was set up psycopg. That turned out
to be a bit of an issue, since initd.org is having some sort of server
problem on their site. I was eventually able to download the tarball
with the sources for psycopg 1.1.21.

In order to compile psycopg, I also needed mxDateTime from eGenix.
They offer several pre-compiled packages, but none would install for me.
Working from the source for 3.0.0, I was able to compile it myself via
“python setup.py install” into my virtualenv sandbox.

Back in the psycopg build directory, I was able to use these
instructions, but had to hack around a bit to get the mxDateTime
headers in a place that matched the psycopg build expectations. I tried
several variations of path names into the mx source tree, but eventually
gave up and copied them all to one directory:

$ cd egenix-mx-base-3.0.0
$ mkdir include
$ find . -name '*.h' -exec cp {} `pwd`/include/ \;

I then tried to configure psycopg with:

$ ./configure --with-postgres-libraries=/Library/PostgreSQL8/lib --with-postgres-includes=/Library/PostgreSQL8/include --with-mxdatetime-includes=../egenix-mx-base-3.0.0/include/

That failed to find the Python.h header until I ran configure outside
of my virtualenv environment, using the copy of Python 2.5 I had
installed ages ago from python.org. Obviously your path to the mx
includes may vary, but that installer package for the PostgreSQL server
will put everything in /Library/PostgreSQL8.

Once I had configure running, I ran make (still outside of my
virtualenv). The build succeeded, and then I went over to the shell
running my virtual environment to install from there (via a simple “make
install”).

The end result of all of that was PostgreSQL 8.2.5 installed globally,
and the mx 3.0.0 and psycopg 1.1.21 packages installed only in my
virtual environment.

After a quick createdb and an edit to my settings.py file, I was
able to sync up my dev server against the new database and get back to
work. I suspect, but can’t verify, that I would have had fewer issues if
I was on an x86 Mac of some sort or running Leopard, since many of these
packages seem to have moved on ahead of my platform. The whole thing
took just over an hour, most of which was me fumbling around trying to
find compatible versions of the source for the various pieces since it
has been so long since I’ve compiled any of this stuff.