Uptonian Thoughts

My Favorite Albums of 2015

· ·

This list is a little late, but when a contender for a top album of the year comes out on December 18, I like to wait until the actual new year to pare down the list. There’s no way that I could choose a “traditional” top ten. I listened to so much amazing music this year that even after culling the list to things released in 2015, I still had far too many.

I managed to come up with a healthy list of “top” albums and a few more “honorable mentions,” but really everything on this list is a favorite thing of mine from this year. This isn’t a “best of” list or anything of the sort. The albums aren’t in any order whatsoever and there are purposefully no numbers next to the list that follows. I tried to include a song and write a little blurb about my picks but really they are my favorite albums of the past year and are meant to be enjoyed as such.

Favorite Albums

  • Panopticon — Autumn Eternal

    I discovered this one late, just like Panopticon’s previous release, while reading Stereogum’s list of the top metal albums of 2015. There are some great records there, so go check that out if the heavier records on the rest of my list interest you.

    The album’s namesake track is a standout, but really the entire album is meant to be enjoyed at once. The Southern infusions apparent on Austin Lunn’s previous releases as Panopticon have mostly been replaced with colder, darker Midwestern roots – he moved from Kentucky to Minnesota recently – and that results in a heavier, colder sound.

  • Baroness — Purple

    The fact that this album was just released less than two weeks ago doesn’t mean it’s not a contender for best of the year. The songwriting is top-notch and the riffage is heavier than ever. It’s so cool to see these guys back touring and writing music after their much-talked-about bus accident in 2012. The energy and emotion (and positivity!) on display on this record are clearly a direct result of the aftermath of that experience.

  • Cult Leader — Lightless Walk

    From the moment the first note hits, you know this record is going to be brutal. That detuning chord makes your stomach drop, too, and “Great I Am” bursts through the subterranean cavern it just dug. It only gets more gut-wrenching from there. It’s not all just a bash-your-head-into-the-wall-of-sound experience though; there’s a dynamic on this record that makes the restrained parts more beautiful and the chaotic parts uglier.

  • Kowloon Walled City — Grievances

    Slow, slower, slowest. I love that the production of this album is a first-class instrument. This conversation between guitarist and producer Scott Evans and Converge’s Kurt Ballou is a great read and offers some insight into the process of producing one’s own album.

    The songwriting is top-notch here. Heavy is not always about the riffage.

  • Alabama Shakes — Sound & Color

    Better production, fame, and an Apple ad feature don’t diminish the soul and feel on display on this record. Every song is a microcosm of Alabama Shakes: big vocals, Southern instrumentals, a strong rhythm section, and lots and lots of emotion.

  • Deafheaven — New Bermuda

    So much has been written about Deafheaven this year, and it’s all better and more eloquent than anything I could write here. Seriously, read that Stereogum review.

    The bottom line is that this record is the exact opposite of what everyone, including me, expected a followup to Sunbather to sound like. It’s heavier, dirtier, harsher, more emotional, and – yes – better than Deafheaven’s previous effort. The heavier parts are fuller, and the shoegaze-y post-rock-y parts are more soaring and beautiful than ever. I can’t wait to see where Deafheaven goes from here, but for now I’m content to listen to New Bermuda over and over and over.

  • Intronaut — The Direction of Last Things

    “The Pleasant Surprise” is an apt title for this record: I knew that Intronaut had a new album due out soon, but it was indeed a pleasant surprise in November when this album dropped. These songs have that signature Intronaut sound: the impeccable drumming, the funky basslines, and of course the great guitar work are all on display. But these songs are tighter than ever. The album starts with a ripper, then mellows out slightly, but the chops and intensity never let up.

  • Mutoid Man — Bleeder

    I had the pleasure of seeing Mutoid Man live twice this year: once in a brand new intimate venue on Red River, and once in a festival setting at Fun Fun Fun Fest. Mutoid Man are the rare band that excel in both environments. They just seem to have so much (dirty sweaty) fun playing, and it shows in this new set of songs.

  • Cloudkicker — Woum

    The influence of touring with Intronaut is apparent here: this sounds like the perfect mix of the mellower Intronaut sections and Ben Sharp’s impeccable guitar and production chops. I always look forward to new music from Cloudkicker, not least of all because I know it will be different and surprising and amazing.

  • Loma Prieta — Self Portrait

    This album is weirdly lo-fi in terms of production, but somehow it works. “Satellite” is a bit of a departure from their usual sound, but it still maintains the chaos without being quite as traditionally “heavy”. I haven’t seen a whole lot of press about Loma Prieta recently, but I know they’re touring with some success. This is a band that I have yet to see live, but I really want to hear these songs in person.

  • Foals — What Went Down

    I love everything about Foals. Everyone says this is the album that cemented them as an “arena rock” band, but their hooks have been huge for a while now. Fame and bigger venues haven’t diminished the catchiness and unbelievable tightness of this band.

  • Rivers of Nihil — Monarchy

    Vocals are not what I usually focus on when listening to music, and that’s especially true for heavy music, but the vocal style and production on display on Monarchy are just perfect. It helps that the music behind those vocals is tight and nasty, too. This is a band that I discovered this year. I’m ashamed to say that I kind of dismissed them at first based on their name and album cover; it seemed just a little cheesy to me. I quickly got past that the moment the album started. The title track really showcases this band’s sound – that great vocal production, technical guitar solos, breakneck drums – so listen to it above and then listen to the whole album.

  • Vattnet Viskar — Settler

    Vattnet Viskar is a strange name for a band from New Hampshire – it means “the water whispers” in Swedish – and it certainly paints a picture before the first note is heard. Forget whatever you think from the band’s name and Settler’s album cover; this is American black metal at its finest. Comparisons to Sunbather are inevitable, but this is grittier and muddier and dirtier.

    A note unrelated to the content: this album was very hard to find; it seemed to go in and out of availability on streaming services. It’s available on iTunes now though, and hopefully it’s here to stay.

  • Between the Buried and Me — Coma Ecliptic

    More prog, less growls, and the amazing chops that we all know and love. This band keeps getting better and better. Colors is a nearly perfect album, and here we are four releases later to see that they have evolved their sound and found their pocket of prog metal that no one else can match. If you have not yet had the chance to see these guys live, make it your top priority when they next tour near you. They somehow manage to sound better live, which is a rare feat amongst bands in this genre.

  • The Armed — Untitled

    Another album from a new-to-me-this-year band. I listen to this when I’m running and when I need to tune absolutely everything else out while working. It’s cathartically brutal and mindless in the best way possible. A side note: how does Nick Yacyshyn find the time to play drums for all of these bands?

Honorable Mentions

  • Beach House — Thank Your Lucky Stars

    A “surprise” album that was even better than the one it definitely wasn’t a companion to.

  • Napalm Death — Apex Predator - Easy Meat

    Napalm Death, or this incarnation of it, is on point. Seriously, how does a thirty-year-old band put out their best material at this point in their career?

  • From First To Last — Dead Trees

    Maybe I’m biased because I backed this album on Kickstarter. Spencer Sotelo’s vocals are a perfect match for FFTL’s heaviest and catchiest effort to date.

  • Battles — La Di Da Di

    Weirder than ever, which is definitely a good thing. And I still haven’t seen these guys live…

  • Courtney Barnett — Sometimes I Sit and Think, and Sometimes I Just Sit

    Great songwriting on display here. I love the refreshing take on “indie rock”; it’s kind of dirty and poppy at the same time.

  • Theories — Regression

    I found this band via the seemingly-endless stream of emails that Metal Blade sends. For some reason, this one stuck. “Burnt Concrete” is such a bangin’ opener.

    And for the record, I read and enjoy most of those Metal Blade emails; that wasn’t a complaint.

  • CHON — Grow

    CHON is definitely best experienced live, but this album is as good a facsimile as you can get of that experience. I do wish they’d try to play some of the songs with vocals live, but I’m content with their ridiculous instrumental chops. These guys are so young and fresh, too, so I can’t wait to see what comes next.

  • All Get Out — Movement

    I listened to this EP countless times when it came out. I have a feeling that their upcoming LP will make next year’s list.

  • Blanck Mass — Dumb Flesh

    Such a weird album, but I expect nothing less from one half of Fuck Buttons. I love the musical experimentation on display here.

  • Sumac — The Deal

    How can I not mention this supergroup made up of members of Baptists, Old Man Gloom, and Russian Circles? Weird and heavy.

Installing the Latest ZSH on Travis CI

· ·

I recently thought it might be a good idea to start using Travis CI to run builds of my personal repositories on a regular basis. A lot of my repositories are pet projects, but that doesn’t mean that I don’t depend on them on a daily basis.

That couldn’t be more true of my dotfiles. My zshrc and my vimrc get exercised tens if not hundreds of times per day. Sometimes I’ll make a change to test out something new, verify that it doesn’t blow up, commit it, and move on. That’s probably not the best way to do things, but I figure that I’ll never start using the new hotness if I don’t jump in and start using it right away. Usually this works out well and I’ve simply added a new tool to my repertoire, but it can potentially break my environment in subtle ways. Continuous integration can help with that: if I commit a breaking change, I can get an email when the “build” breaks. I’ll immediately know which commit broke something without having to resort to git blame or something similar.

Travis CI offers a fantastic free service, but I haven’t really had a chance to use it yet.1 I figured that setting up CI for my small dotfiles repository would be a great way to learn a tool that many open source projects use today.

There was one huge hurdle: I use zsh, and most of my dotfile setup scripts are written in zsh, but the Travis environment only comes with the bash shell installed.2

Some existing open source projects use Travis with zsh, but they all use the legacy environment that still allows sudo, not the newer container-based environment. The apt addon can help install packages in containers, but the latest version of zsh on Ubuntu 12.04 is 4.3.17. zsh 5 is a requirement for most modern usages, so that’s a non-starter. I thought that someone would have come across this already and solved it, and maybe they have, but I couldn’t easily find a solution.

We need to build and install zsh, and we need to do it without sudo. build-essential is already available on the Travis CI virtual machines, and we could use the aforementioned apt addon if it wasn’t.

After much trial and error, I finally got a Travis config that makes sure a recent version of zsh is set up before running the build. I chose to do this in the before_install step because that seems to be where additional dependencies should be installed, but I suppose it could be done anywhere in the build lifecycle before script runs the actual tests.

The full Travis config follows, but the before_install step is what really matters:

language: sh
addons:
  apt:
    packages:
    - build-essential
before_install:
- export LOCAL="$(mktemp --directory --tmpdir=${TMPDIR:-/tmp} local.bin.XXXXXX)"
- curl -L http://downloads.sourceforge.net/zsh/zsh-5.0.7.tar.gz | tar zx
- cd zsh-5.0.7
- ./configure --prefix=$LOCAL
- make
- make install
- cd -
- export PATH="$LOCAL/bin:$PATH"
script: make test

First, we make a temporary directory to install zsh to. Remember, no sudo means no write access to /usr/local, so we need to choose a safe location to install to.

Next, we download the latest version of zsh. If you want to use a different mirror or download an archive with a different compression, you just need to change this line to handle that and the rest should work. If you change the version, make sure to update the following line that changes directories to the archive you just decompressed.

Then we build zsh: configure with the prefix set to the directory we created earlier, then make it and make install it to the prefixed directory.

Finally, we cd back to the directory we were in – these build lifecycle steps are run in series so we need to hop back to the directory we were in before building zsh – and, more importantly, we add the temporary directory to the beginning of our path so that zsh can be found.

These before_install steps could probably be extracted into a script, but I wanted to go with the simplest Travis config with the least overhead to get up and running with zsh. Now, when the test script runs, zsh is available and we can check our scripts for errors!
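
If you did want to extract those steps, a minimal sketch might look like the following. The script name scripts/install-zsh.sh is hypothetical, and it uses $HOME/local as the prefix instead of a mktemp directory so that the resulting path is predictable:

#!/usr/bin/env bash
# scripts/install-zsh.sh (hypothetical): build a recent zsh into $HOME/local
# so the Travis build can use it without sudo.
set -ex

ZSH_VER=5.0.7
PREFIX="$HOME/local"

curl -L "http://downloads.sourceforge.net/zsh/zsh-${ZSH_VER}.tar.gz" | tar zx
cd "zsh-${ZSH_VER}"
./configure --prefix="$PREFIX"
make
make install

The before_install step would then shrink to two lines: run the script, then export PATH="$HOME/local/bin:$PATH".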

I look forward to exploring more of what is possible with Travis. On my horizon: using vint to lint my vimrc and writing tests for the majority of my private repositories that don’t currently have any verification.

  1. We use Jenkins at work, which is a blessing and a curse. Mostly the latter, but the sheer number of plugins available coupled with the fact that I work with some awesome people who know Jenkins better than I ever will makes it ok.

  2. Really, it comes with whatever shell is default for the operating system that Travis VMs use. Since they run Ubuntu 12.04, that means that bash is available but zsh is not. It seems like Travis didn’t set out to explicitly support shell-based projects, but because their machines are (mostly) Linux, that comes for “free” if you know how to configure things.

Migrating From Pathogen to Vundle

· ·

For a while now, I’ve been using Pathogen to manage my vim plugins as bundles. I thought I was making things easier for myself by using git submodules to help organize those plugins, but submodules aren’t the best method for deploying a vim environment in multiple places.

Fortunately for me, other, smarter people did this same thing and decided to fix it. This blog post by James Lai details moving from almost the exact same system as mine to Vundle, a newer and very vim-centric way of managing plugins. You list your plugins in your vimrc, you update them with vim (even though they are managed with git), and your vim environment is consistent everywhere.

I just migrated from Pathogen to Vundle, and I want to document the process.

First things first: clone Vundle into your bundle/ directory.

❯ git clone https://github.com/gmarik/Vundle.vim.git bundle/vundle

The Vundle quick start guide does a great job of getting you started, so if you’re just looking to use Vundle without any prior bundle management, I would start there. If you were managing your vim plugins with Pathogen and git submodules, the switch to Vundle is straightforward but requires a few more steps.

At the top of my vimrc I added a new Vundle-specific section and added the code from the quick start guide.

" Of course
set nocompatible

" Required Vundle setup
filetype off
set runtimepath+=~/.vim/bundle/vundle
call vundle#rc()

Bundle 'gmarik/vundle'

Next, I wanted to add all the bundles I already use. git submodule foreach can actually help here.

❯ git submodule foreach git remote -v
Entering 'bundle/airline'
origin  https://github.com/bling/vim-airline.git (fetch)
origin  https://github.com/bling/vim-airline.git (push)
Entering 'bundle/characterize'
origin  https://github.com/tpope/vim-characterize.git (fetch)
origin  https://github.com/tpope/vim-characterize.git (push)

[...]

I then just copied the GitHub user and repository name (the path portion of the remote URL minus “.git”) and passed that to Vundle’s Bundle command.

" Better status line
Bundle 'bling/vim-airline'

" ga for character descriptions
Bundle 'tpope/vim-characterize'

[...]

Now we need to remove the existing submodules. One of the big pains of using git submodules is removing them when you no longer need them. This made trying out plugins harder than it needed to be, and it makes the transition to Vundle a bit more complicated.

Without fail, I reference this Stack Overflow post about removing git submodules every time I need to remove a plugin I was just trying out. The command to remember is git submodule deinit. You can deinit all submodules in a given directory all at once.

❯ git submodule deinit bundle/
Cleared directory 'bundle/airline'
Submodule 'bundle/airline' (https://github.com/bling/vim-airline.git) unregistered for path 'bundle/airline'
Cleared directory 'bundle/characterize'
Submodule 'bundle/characterize' (https://github.com/tpope/vim-characterize.git) unregistered for path 'bundle/characterize'

[...]

Then you need to explicitly remove all the bundles. Since Vundle is already in bundle/vundle, we need to remove each separate plugin bundle directory instead of blowing away the entire bundle/ directory.

❯ git rm bundle/airline bundle/characterize [...]
rm 'bundle/airline'
rm 'bundle/characterize'

[...]

It would be a good idea to commit your staged changes here, which at this point should just be submodule removal.
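
That’s just a normal commit; the message here is only an example:

❯ git commit -m 'Remove vim plugin submodules'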

If you were using Pathogen, don’t forget to remove any setup from your vimrc.

" Store pathogen itself in bundle/
runtime! bundle/pathogen/autoload/pathogen.vim

" Start it up
silent! call pathogen#infect()
silent! call pathogen#helptags()

It also helps to add bundle/ to your .gitignore. Vundle now puts all plugins there, but you don’t have to manually manage them anymore. Just add bundle/** to your vim environment’s .gitignore file.
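
From the root of the vim repository, that’s a one-liner:

❯ echo 'bundle/**' >> .gitignore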

Now, open up vim and run :BundleInstall. All your vim bundles will be cloned and managed by Vundle, and you don’t have to worry about updating submodules and running git submodule foreach everywhere you have a vim environment.

To update and migrate any existing vim environments on other machines, git pull in the migration changes, which shouldn’t conflict as long as you were up to date to the commit before the migration. Then clone Vundle with git clone https://github.com/gmarik/Vundle.vim.git bundle/vundle and install your plugins with vim +BundleInstall +qall.
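
Put together, bringing another machine up to date looks roughly like this, assuming the vim configuration lives in ~/.vim:

❯ cd ~/.vim
❯ git pull
❯ git clone https://github.com/gmarik/Vundle.vim.git bundle/vundle
❯ vim +BundleInstall +qall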

If you were not up to date to right before your migration changes, you’ll probably have to manually remove all submodules by following the instructions above and resolve some conflicts when pulling changes in. Since you’re trying to delete all of bundle/, this should be relatively painless: just delete bundle/ and git rebase --skip any submodule update commits.

And now you’re using Vundle! Add some new bundles into your vimrc, run :BundleInstall, and you’re up and running. I found some that I’m going to try out in joegoggins’s vimrc and in the Vundle setup documentation.

Quickly Convert Unix Timestamps

· ·

Inevitably, when dealing with time-related data, one will come across Unix timestamps. They’re great; there’s no guessing the timezone or trying to parse difficult formats.

Except they’re not very readable to humans. I use Epoch Converter a lot when I’m dealing with time-series data, which seems to be fairly often recently. Anything involving a calendar or picking a time range also usually involves timestamps. I thought there must be an easier way to convert these into something that I can read without going to an external site and that doesn’t break down when I’m trying to do work with a finicky or non-existent network connection.

There is:

date -j -f "%s" "1381528800" +"%a %b %d %T %Z %Y"

An explanation: -j tells date not to try to set the system date. -f describes the format of the input - a timestamp, of course - and the input itself follows. The string after + is the output format. The output of the above is as follows.

Fri Oct 11 17:00:00 CDT 2013

You can use xargs to pipe input in. Note that the -J option might be different on systems that are not OS X.

echo -n "1381528800" | xargs -J {} date -j -f "%s" {} +"%a %b %d %T %Z %Y"
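
On Linux, GNU date does not support -j or -f in the same way; a rough equivalent, assuming GNU coreutils, is:

date -d @1381528800 +"%a %b %d %T %Z %Y"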

Like I seem to do with a lot of my utility scripts, I added a workflow to Alfred. You can get it here. Simply invoke Alfred, type “ts” followed by a space and the timestamp you want to convert. The readable date will be posted as a notification (Growl or Notification Center, configurable in the Advanced section of the Alfred settings) and copied to the clipboard.
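
The script action inside that workflow presumably boils down to the same date invocation, with Alfred’s {query} placeholder standing in for the typed timestamp; a minimal sketch:

date -j -f "%s" "{query}" +"%a %b %d %T %Z %Y"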

Remove Leading Whitespace

· ·

Sometimes it’s useful to copy some code from a file and paste it somewhere else as an example. For instance, I like to write code in a jsfiddle and then paste a relevant subset of that code to a Stack Overflow answer. I have also copied code from a project file to paste into HipChat to quickly explain something to a coworker, or to paste into a JIRA ticket as a comment.

If you copy code from the middle of a file, there’s usually some leading whitespace on all lines that you do not want to preserve in the context into which you are pasting the code. The problem is that you don’t want to get rid of all leading whitespace on all lines; you want to keep the indentation intact. I usually get rid of the unwanted whitespace by manually deleting it if it’s only one or two lines or by pasting into a new vim buffer and using << to shift the text over as much as desired.

To illustrate, I have this text:

    var flatten = function(result, next_array) {
        console.log('current result', result);
        return result.concat(next_array);
    };

    [1, [2], [3, 4]]
        .reduce(flatten, []);

and I want this text:

var flatten = function(result, next_array) {
    console.log('current result', result);
    return result.concat(next_array);
};

[1, [2], [3, 4]]
    .reduce(flatten, []);

Remove leading whitespace

I knew there must be a way to remove the shortest leading whitespace from all lines programmatically, but I’m not familiar enough with awk, sed, or shell scripting in general to tackle the problem. I asked the question on Stack Overflow and got a few great answers. I ended up accepting the single process awk version.

If you use OS X, the built-in awk will not work with the given solution. If you use Homebrew, fixing that is just a simple matter of brew install gawk and using gawk instead of awk.

The given solution has a great explanation and works fine, but I made one addition. If the input is a single line with no leading whitespace, the script fails. I fixed this with if (!s) s=0; at the beginning of the END block.

The final version of my command looks like this. I’ve added some comments to explain what’s going on.

# The awk field separator is everything after the first non-whitespace character, inclusive
gawk -F '\\S.*' \
'{                              # The first block of the awk program
    l=length($1);               # The length of the first field, the leading whitespace
    if(l>0)                     # If the length of whitespace is non-zero,
        if(NR==1)               # and this is the first record,
            s=l;                # make 's', the number of whitespace characters, equal to its length
    else s=s>l?l:s;             # otherwise, make s the shorter of itself and the current whitespace
    a[NR]=$0                    # Index the entire line in an array by line number
}                               # End of the first block of the awk program

END{                            # Start the END, printing block of the awk program
    if(!s)s=0;                  # Make sure we always have a value for s
    for(i=1;i<=NR;i++){         # Loop over all records
        sub("^ {"s"}","",a[i]); # Substitute 's' whitespace characters with nothing
        print a[i];             # Print the line after substitution
    }                           # End of the loop over records
}'                              # End of the END block of the awk program

It is worth noting that this probably only works on text with spaces for whitespace, not tabs or mixed whitespace.

Integrate with an Alfred workflow

The ability to remove the shortest leading whitespace with a shell command is great, but I really wanted a way to do this quickly with text on the clipboard. Alfred workflows make that possible.

I created a simple three-step workflow with a hotkey trigger, a script action, and a clipboard output. You can download it here and use it with Alfred 2.
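
Outside of Alfred, the same idea works from a terminal. A rough sketch, assuming the gawk command above is saved as an executable script at the hypothetical path ~/bin/unindent:

pbpaste | ~/bin/unindent | pbcopy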

And now you can copy code in context and paste it with no leading whitespace wherever you want!

Batch Deleting Last.fm Scrobbles

· ·

I recently came home from a trip and synced my iPhone in iTunes in an attempt to scrobble the music that I listened to on that trip. I use Melo to scrobble tracks played in iTunes, and it usually works quite well because I never know it’s there.

I’m not entirely sure if Melo, iTunes, or Last.fm is to blame, but I ended up with a large number of scrobbles from a repeated handful of songs. I have over 75,000 tracks scrobbled in Last.fm, but I use their recommendations and like to look at my stats, so artificially inflating my scrobble count with three artists was extremely undesirable.

I had 59 pages of unwanted scrobbles; I needed to quickly delete nearly 3000 scrobbles. Last.fm doesn’t have a way to batch delete (or otherwise manage) your scrobble tracks, so I manually clicked all the delete links on the first page.

That got old before I had even deleted ten scrobbles. I figured out a way to programmatically and quickly delete a page of scrobbles. I still have to manually get to each page, but this makes it much easier.


UPDATE 2017-05-09:

Thanks to a reader, I was informed that the site design has changed yet again, requiring a new selector for the delete button. The new code should now be as follows (thanks AN and Slype!):

jQuery('.chartlist button.dropdown-menu-clickable-item[type="submit"]').each(function(_, c) {
    c.click();
});

Again, I’ll leave the old code up for posterity.


UPDATE 2015-10-08:

Since the new Last.fm redesign recently went live after being in beta for a while, it looks like the selector in the above code needs to change. Thanks to @jayholler, the code should now be:

jQuery('.chartlist button.chartlist-delete-button').each(function(_, b) {
    b.click();
});

I am leaving the old code up for posterity and just in case the old site is still accessible somewhere.


jQuery('#deletablert a.delete').each(function(_, a) {
    a.click();
});

I just open up the console with ⌘-⌥-I, paste in that snippet of code, and hit enter. Here is what it looks like in action.

Batch deleting Last.fm scrobbles.

Since jQuery is already embedded in Last.fm’s pages, I just select all the delete links and emit a click on each one. The entire page is deleted in a few seconds. When it’s done, I can click the link to the previous page and repeat.

I don’t expect this will be very useful to anyone but myself for the next ten minutes, but it could come in handy in case you’ve been listening to too much Carly Rae Jepsen.

Duplicate Posts in the Feed

· ·

When I first switched over to Octopress, the feed for this site was completely new. The last twenty or so posts showed up as new again, and it was kind of annoying.

I realized yesterday that my site was still configured to be hosted on tupton.github.com, so I switched it to blog.thomasupton.com. This must have reset the site’s feed again, causing recent posts to show up as unread.

This should hopefully be the last time that happens, and hopefully it wasn’t too hard to mark all posts from this site as read. I apologize for any inconvenience.

Octopress

· ·

This blog is now being served by GitHub Pages (source) by way of Octopress. I’ve tried to set up some redirects from the old site that should match up pretty well, but please let me know if something looks awry.

There aren’t any site comments here anymore, so feel free to let me know on Twitter @thomasupton or by email.

Reading My News

· ·

I’ve used Google Reader for a long time, but I’ve never been completely satisfied with using it to read and keep up with RSS feeds. Apps like Flipboard on the iPad provide a better reading experience for traditionally “newsy” outlets – I enjoy flipping through the feeds from The New Yorker, The Economist, and The Atlantic – but following feeds where every item is of personal interest to me doesn’t make much sense in that context, and it’s iOS-only. I had used NetNewsWire a few times in the past, but I never stuck with it for some reason. It’s a great app, but I would constantly get distracted by that big red badge with a huge number of unread items in it.

I realized that my problem wasn’t really the apps I was using. What really made reading my feeds imperfect – or even tedious – was the number of feeds I had and how I was reading them. I decided to revisit NetNewsWire and check out an app on iOS that I had heard great things about.

Prune

The first key to revamping my news reading was pruning my subscriptions. I had way too many high-traffic feeds, and I wasn’t even reading 1% of some of them. I would go through all my feeds and folders and mark everything as read every few days1 just to keep up and feel like I wasn’t overwhelmed.

I got rid of feeds from sites like Lamebook, which is hilarious, but I don’t need to read every single item, and Absolute Punk, which is a great source for music news, but I only want to read a very small portion of those stories. I get my fix from these sites by visiting them every few days, not by trying to forge through a murky river of hundreds of RSS items that I don’t want to read.

I now have 26 feeds in 8 categories,2 down from twice that before pruning. This is manageable. 50+ high-traffic feeds are not. I can read the items from these feeds quickly, and I can finally read “all items” without becoming overwhelmed.

NetNewsWire, revisited (again)

I mentioned that I had used NetNewsWire before, but it never stuck. A couple of weeks ago, I saw this NetNewsWire setup and knew I had to try it. ⌘-⇧-R and ⌘-/ are now my favorite keyboard shortcuts. Refresh, and then scroll through my unread river of items. I sort chronologically so I read posts “in order,” but I don’t think it actually matters much at all. Since I have a low number of feeds, it usually only takes a few minutes to travel through these unread items. Any links to things I want to check out in more depth later get a quick ^-P to send to Instapaper.

Reeder on mobile

I was recently on vacation without my MacBook, but I had my iPad and still wanted to keep up with my news and feeds. There are plenty of other times that I’m without my computer but have a chance to catch up on my reading. I figured (or hoped, really) that I could make the experience of reading news on my iPhone and iPad just as great as it is with NetNewsWire at my desk.

I had heard great things about Reeder on iOS, but I was skeptical that it could work for me. It turns out I can get nearly the same river of news setup as NetNewsWire in Reeder. Unread items can be sorted in chronological order, and moving to the next item is just a matter of swiping up.

The end result is that I have a streamlined way to catch up on news that’s important to me as quickly as possible, no matter where I am. I spend less time fiddling with news items I don’t really care about, less time out of my day being distracted by the latest hot article,3 and more time doing what I want and need to do.

  1. I did this ritual for far too long before I realized it needed to end. Old habits die hard.

  2. If only I could figure out how to stop checking Stellar so much.

Aeropress

· ·

Aeropress

I don’t think I’ll be using my Keurig much anymore.

I’m extremely late to this party, but I only recently discovered Aerobie’s Aeropress coffee maker. Yes, the company that created those amazing ring-shaped flying discs also sells a weird-looking coffee maker. This plunger-like contraption helps to brew what is probably the best cup of coffee I have ever tasted.

At first glance, it’s similar to a French press, but because the coffee is forced through a paper filter instead of a coarser wire mesh, there is little to no sediment in the resulting liquid. The coffee is described as “espresso-strength,” and I drink it as an Americano by adding a volume of water equal to two times the amount of espresso. I’ve made Aeropress coffee with dark and lighter roasts, and every cup of coffee has been exquisite.

If you enjoy coffee at all, I highly suggest you acquire an Aeropress as soon as possible. Once you taste its delicious brew, you’ll understand why. It’s inexpensive, quick, easy to clean, and delicious. There are no downsides.