The Next Chapter

September 2, 2015

I’ve just begun work at Ritual Development Group, Inc. Ritual is a product-oriented software company. I’m serving as the CTO and will share more about what we’re building as time goes on. For now, suffice it to say that this is a huge step for me; it is the realization of a dream I’ve had for quite some time.

As wonderful as this is, it has meant making the very difficult decision to leave Big Nerd Ranch, where I have proudly worked for the past five years. I joined BNR in 2010, just as the original iPad was being released. Since that time, BNR has experienced tremendous growth.

At BNR I worked as an independent engineer, a team lead, an author, a teacher, and as the VP of Engineering. I was able to work with startup companies, government contractors, and billion-dollar multinational conglomerates. I feel fortunate to have learned a tremendous amount in that time, and quickly.

Teaching was one of the best parts of my time at BNR. As much as I love writing software, it’s really hard to beat a week spent teaching a group of nerds about platforms you love, especially when every student is excited to be there. The days for an instructor are long, but it’s a truly rewarding experience, even (or especially) when you’re sitting in the lab at 10:00 PM sketching out an app architecture with a student. BNR’s bootcamps are an exceptional experience. If you are serious about advancing as a software developer, I wholeheartedly recommend taking a class.

Co-authoring the fourth and fifth editions of Cocoa Programming for OS X was a remarkable experience. I couldn’t be more thankful to Aaron for the trust he placed in me. I am especially proud of the fifth edition, which my friend and colleague Nate co-authored with me. While the fourth edition was an important update, the fifth edition was truly an overhaul: a major version release with substantial refactoring. We focused on demonstrating modern app architecture and style while also transitioning from Objective-C to Swift.

All of my prior work has stretched me, but the opportunity to work as the VP of Engineering stretched me in a whole new dimension (or two). Looking back on it, I believe we accomplished a lot. But it’s also challenging to fully take stock of it. I owe many thanks to my team of engineering directors: Brandy, Brian, the Chrises, and Steven. Working with them and all of the Nerds was a truly rewarding experience. It’s hard to part ways with such a fine group of people.

Looking back over my time at BNR, it’s almost unreal how much I grew there, even in some of the simplest ways. I’m a fairly quiet person and the fact that I can happily deliver a talk on a chapter from the Cocoa book feels very incongruous. I also drink coffee now. I owe a great deal to all of the people at BNR who gave me so many chances to grow over the years.

So here we are. This chapter has come to a close. I’m grateful for all that’s past, and I look with great anticipation to what’s next.


Handling Keyboard Input in a View Controller – Alongside SpriteKit

April 15, 2015

I’m using SpriteKit in the train game I’ve been tinkering with on and off, and recently went to add keyboard event handling to my view controller:

override func keyDown(event: NSEvent) {
    // ...
}

override func deleteBackward(sender: AnyObject?) {
    // ...
}

To my surprise, though, whatever was first responder wasn’t passing keyDown: up the responder chain. (This is 10.10 so view controllers are in the responder chain automatically.)

First, determining what the first responder was proved easy enough. After showWindow: had been called:

(lldb) ex windowController.window!.firstResponder
(SKView) $R2 = 0x0000000100e1b100 {

The SKView is the first responder. I had forgotten that SKNode is a subclass of NSResponder and that nodes can participate in the responder chain.

So why wasn’t the keyDown: propagating up to the view controller? I checked the nextResponder of the SKView: it was the same as its superview, which is to be expected. So the question remained: why wasn’t keyDown: propagating?

For the answer to that I turned to Hopper Disassembler, which revealed that -[SKView keyDown:] essentially sends keyDown: to its scene:

void -[SKView keyDown:](void * self, void * _cmd, void * arg2) {
    rbx = self;
    r14 = [arg2 retain];
    if (rbx->_disableInput == 0x0) {
        rdi = rbx->_scene;
        if (rdi != 0x0) {
            [rdi keyDown:r14];
        }
    }
    rdi = r14;
    rax = [rdi release];
}

So how can I deal with this? I could subclass SKView and override keyDown: to give it a more traditional implementation (pass it on to nextResponder, for example). However, while I was in Hopper I noticed that SKScene doesn’t implement keyDown:, which means that it is relying on the default implementation of keyDown: in NSResponder. Which means that the simplest solution is to set the scene’s next responder to my view controller:

let gameScene = GameScene(size: ...)

assert(gameScene.nextResponder == nil)
gameScene.nextResponder = self

Now my view controller receives keyboard events: problem solved! My guess is that SpriteKit’s designers intended for the keyboard event handling to be done in an SKScene subclass – that’s how it’s done in the Adventure sample code. Unfortunately that doesn’t make sense for my app architecture, where there’s more to the UI than an SKView.
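For completeness, the subclassing alternative mentioned above might look something like this. This is only a sketch: the class name is mine, and it matches the Swift/AppKit signatures of the era used elsewhere in this post.

```swift
import SpriteKit

// Sketch of the alternative: an SKView subclass that restores the
// traditional responder chain behavior for key events, instead of
// forwarding them to the scene. (Class name is hypothetical.)
class ForwardingSKView: SKView {
    override func keyDown(event: NSEvent) {
        // Pass the event up the responder chain rather than into the scene.
        nextResponder?.keyDown(event)
    }
}
```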


A Quick Fix for Fuzzy SKShapeNode Lines

February 16, 2015

I’ve been using Apple’s SpriteKit for the train game. It’s gone fairly well, except there have been a few unpleasant surprises with SKShapeNode, which strokes/fills a bezier path. The internet is filled with folks complaining about buggy behavior from it and I’ve definitely had more than a few moments where I daydreamed about dusting off my OpenGL chops.

This is an A/B screenshot of an issue I was running into. At left is before, right is after:

These are SKShapeNodes with paths for the track segments, which are 2pt wide, stroked, and, most importantly, scaled up several times (zoomed in on the map). The issue, of course, is the blurry/fuzzy line rendering.

The ‘fix’ was startlingly simple: setting antialiased = false makes your blurry lines crystal clear when scaled up. This isn’t at all what I was expecting (antialiasing off == stair-stepping, pixelated lines in my mind).
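In code, the entire fix amounts to one property. A minimal sketch, where `trackPath` stands in for one of my track segment paths and the styling is illustrative:

```swift
import SpriteKit

// Sketch: a stroked track segment node. `trackPath` is a hypothetical
// CGPath for one segment of track.
let segment = SKShapeNode()
segment.path = trackPath
segment.lineWidth = 2
segment.strokeColor = SKColor.whiteColor()
segment.antialiased = false  // lines stay crisp when the node is scaled up
```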

My guess is that this is an optimization: with antialiased on, the SKShapeNode appears to be drawn into a texture at unscaled resolution, and that texture is then scaled as needed; with antialiased off, the curve drawing occurs at the current resolution. This is only reasoning from the visual results, though. I don’t notice any significant difference in CPU overhead when antialiased is off.

It’s also curious because SpriteKit seems to perpetually re-render SKShapeNodes whether they have changed or not. At some point I will likely end up writing code to render them to textures, or just moving over to OpenGL and rendering the tracks myself.

At this point in development the visuals are secondary, however. I ran an experiment with SKShapeNode’s strokeTexture and aside from some strange rendering artifacts (texture scaled unevenly), the result was much too busy. For now I rather like this abstract appearance for the tracks.


A Train Game: Prototypical vs Playable

February 15, 2015

Late last year I was playing Artemis Spaceship Bridge Simulator and, being the sort that likes trains, I thought, wouldn’t it be cool if there were a collaborative game that simulated running a railroad? So late in December I started writing some Swift code, building the game as a Cocoa app. Here’s what it looks like today:

Here’s some video of it from around the first of this year:

The vision is for several players to operate a railroad together. One or two dispatchers, a few engineers cycling between trains. Switching cars in yards, moving trains up and down the mainline, fulfilling orders from industries. A game could run an hour or two, or perhaps a number of sessions over the course of a few weeks.

It’s come a long way over the past two months. There’s a server that clients can connect to and operate the trains, couple and decouple cars. A basic map editor. It’s a lot, or it feels like a lot to me, but at the same time it still feels very primitive. It’s a prototype, or possibly a toy, but not yet a game.

Motivation can be a curious beast, especially with competing priorities. With the basics in place, feature work becomes less satisfying. It’s time to start playing with making it a game.

At its core this game is a simulation of the operating sessions that some model railroaders conduct in their basement, but without the limitation of space. This makes it possible for maps to be very realistic, modeled on real places.

That realism presents some interesting challenges in terms of presenting the information. This is a fairly simple classification yard and already it’s pretty challenging to take in:

Picture those tracks full of cars and weep. I weep for joy and terror: there’s something very pleasing about vast arrays of parallel tracks, and something extremely daunting about being expected to make sense of them! At the moment my favored approach is to add a toggleable HUD that color codes cars by their destination.

Here’s another area that needs work. Perfectly clear, right?

From here there will of course be more feature work – such as on those switch track signals – but more importantly I plan to work on the model for the industries and yards so that there is something to accomplish in the game, aside from randomly coupling and decoupling cars.


Iterating Over a Range of Dates in Swift

September 7, 2014

Updated 2016/8/25: Made a few style tweaks and fixes for Swift 2.2; fixed start date issue (thanks, gist commenters!).

One thing I’ve been wanting to do with Swift is iterate over a range of NSDate objects in a for loop. Something like this:

let startDate = ...
let endDate = ...
for date in startDate...endDate {
    // ...
}

While I think it might be possible to do this by making NSDate conform to ForwardIndexType, it would be fairly inflexible. As I understand date arithmetic, to do it right you need a reference to the NSCalendar being used, and of course you need to know how much to ‘step’ the date each time. You could just make it step by days but what if you later want to step by hours?

So I decided on a different approach: create a struct, DateRange, that conforms to SequenceType. It’s not nearly as succinct, but it is much more flexible. Create an instance of the struct using an extension on NSCalendar, as this seems to be in keeping with calendar-dependent date APIs. It looks like this:

let calendar = NSCalendar.currentCalendar()
let startDate = ...
let endDate = ...
let dateRange = calendar.dateRange(startDate: startDate,
                                     endDate: endDate,
                                   stepUnits: .Day,
                                   stepValue: 1)

for date in dateRange {
    print("It's \(date)!")
}
The complete code is below (also in a gist), but first a bit of a disclaimer: this code works, but I half expect to look back on it in a year, cringe, and contemplate deleting this post. My crystal ball of Swift faux pas is cloudy.

Note also that at the time this was written, NSDate did not have any Swift comparison operators built in, so I implemented >. Presumably that will change.

import Foundation

func > (left: NSDate, right: NSDate) -> Bool {
    return left.compare(right) == .OrderedDescending
}

extension NSCalendar {
    func dateRange(startDate startDate: NSDate, endDate: NSDate, stepUnits: NSCalendarUnit, stepValue: Int) -> DateRange {
        let dateRange = DateRange(calendar: self, startDate: startDate, endDate: endDate,
                                  stepUnits: stepUnits, stepValue: stepValue, multiplier: 0)
        return dateRange
    }
}

struct DateRange: SequenceType {
    var calendar: NSCalendar
    var startDate: NSDate
    var endDate: NSDate
    var stepUnits: NSCalendarUnit
    var stepValue: Int
    private var multiplier: Int

    func generate() -> Generator {
        return Generator(range: self)
    }

    struct Generator: GeneratorType {
        var range: DateRange

        mutating func next() -> NSDate? {
            guard let nextDate = range.calendar.dateByAddingUnit(range.stepUnits,
                                                          value: range.stepValue * range.multiplier,
                                                         toDate: range.startDate,
                                                        options: []) else {
                return nil
            }
            if nextDate > range.endDate {
                return nil
            } else {
                range.multiplier += 1
                return nextDate
            }
        }
    }
}

// Usage:
func testDateRange() {
    let calendar = NSCalendar(calendarIdentifier: NSCalendarIdentifierGregorian)!
    let startDate = NSDate(timeIntervalSinceNow: 0)
    let endDate = NSDate(timeIntervalSinceNow: 24*60*60*7-1)
    let dateRange = calendar.dateRange(startDate: startDate,
                                         endDate: endDate,
                                       stepUnits: .Day,
                                       stepValue: 1)
    let datesInRange = Array(dateRange)
    XCTAssertEqual(datesInRange.count, 7, "Expected 7 days")
    XCTAssertEqual(datesInRange.first, startDate, "First date should have been the start date.")
}

Southern 4501 at TVRM

September 6, 2014

Norfolk Southern’s Tumblr and Flickr tend to have some pretty great train photography on them. Recently they’ve had some shots of Southern 4501, which has been restored by the Tennessee Valley Railroad Museum up in Chattanooga.

In particular, this shot of 4501 and 630 at TVRM is pretty sharp: Southern 2-8-0 630 and 2-8-2 4501 Simmer Side By Side. I’d embed them but they aren’t on Flickr.

Update 9/6/2014: They shot some video, too: Southern 4501 and 630 Working Together.

Update 9/7/2014: This overhead shot is pretty outstanding.

We went up to TVRM a couple of months ago and had a great visit. The Missionary Ridge Local that they run uses 630 as its power. It’s a 45-minute out-and-back excursion with a 15-minute break in the middle where you watch them turn the locomotive on a turntable and run it back to the (new) front of the train to take it back to the station.

Below is a photo I took during that visit of 630 arriving as NS 9686 kindly passes in the background.

NS 9686 & Southern 630 at Grand Junction


Jean Teasdale on Twitter

September 3, 2014

Veteran The Onion writer Jean Teasdale describes Twitter in her latest opinion piece (emphasis mine):

It’s a free internet site where you can share a personal moment, crack wise, show a photo of a kitty, and beat people to caring about things that literally happened seconds ago, all in 140 characters or less.

This may be Jean Teasdale’s masterpiece.

While I’m praising Jean, I would be remiss if I did not link to her home page. The URL alone is genius.


Telefontornet and Software

September 3, 2014


Today on the always wonderful Colossal I learned of Telefontornet, 1890s Sweden’s central telephone exchange and the enormous structure atop it at which all lines converged.

It reminds me of my early career as a software developer: ISO/IEC 13818 in hand, writing an MPEG-2 demultiplexer and stream parser from the ground up. Doing things I didn’t know couldn’t be done, to paraphrase Sirkis.

That work wasn’t stringing up telephone wires in all directions across the sky, but in software terms it wasn’t that far off. 13 years of professional experience down the line, I wonder how I would approach that same project today.

For one thing there would be some unit tests written.


iBeacon At The Arcade

February 18, 2014

A few years ago I got involved in pinball software development in a pretty big way. It’s faded over the past year, but it’s still one of my favorite cranial playgrounds for software ideas. I continue to be passively obsessed with contemplating new ways of writing pinball software (“Could you write good pinball software in Lua? What about Haskell? How would that look?”). Mercifully for my family they remain mental exercises.

There are two topics that tend to come up quite a bit in discussions about pinball technology, at least when I am involved. The first is that I should have a player profile on the pinball machine, like my iOS Game Center or Xbox Live account. This would allow for internet high score boards that go beyond initials, but also make it possible to change the open/closed nature of a pinball game. Suddenly my game could have persistent state (pinball RPG?). Stored power-ups for use in later games. Lots of really interesting possibilities here, both for home and location players.

Before I get to the second topic, there’s a problem to discuss. The problem with this player profile idea is that it’s hard for a player to sign into the game. Pinball machines do not have rich input devices. Usually it’s three buttons: flipper left, flipper right, and start. (“Was my password LLLRL-Start or LLLLRL-Start?”) For various reasons a touch screen is technologically hard or very awkward (it would get in the way of the flippers). RFID or mag stripe cards are an option but then you have to have some way of distributing the cards. I bet the Taco Mac bartender really wants to hand those out. (Jersey Jack tried something like this with ePlate. Something tells me it hasn’t taken off. Maybe it’s the part where I have to sign up for a credit card to participate.)

There’s a better option: quite a lot of us carry around very rich input devices in our pockets, of course, which brings us to the second topic: how can I connect my phone with a pinball machine? If your phone can communicate with the pinball machine then you suddenly have a very natural means of identifying yourself to the pinball machine. The trouble is that, until recently, doing this was also hard and awkward. The machine itself would need to be on wifi, maybe you’d have to join the same wifi network on your phone, or shoot a QR code to tell the app which game you’re standing next to… Like I said, awkward.

So what happened recently that changes this? Last summer Apple announced iBeacon as part of iOS 7. What’s fairly cool about it is that it isn’t just an iOS thing – Android phones support it too (it’s just Bluetooth Low Energy, BLE). iBeacon solves the problem of connecting your phone with the pinball machine by allowing the connection to take place over Bluetooth, without pairing. There’s proximity data and the devices can even exchange information.

The possibilities here are pretty interesting, even aside from the idea of signing in to your player profile. A pinball manufacturer could offer iOS and/or Android configuration utilities, making for a much richer management interface for owners and operators of these games. Technically you could even sidestep connecting the pinball machine to wifi at all, if you relied upon the player’s phone for relaying information up to a server. It would even allow co-located games to form a mesh network of sorts, for example for head-to-head play, although there are perhaps simpler ways of solving that problem.

iBeacon seems like a pretty natural fit for retail and vending machines. A lot of folks are excited about using it for payments, as an alternative to NFC. Imagine coining up a game using IAP on your phone. You could even take your credit balance with you when you get a great score. Free loyalty credits, social marketing possibilities (“Tweet this for five free credits” (yes, I hate myself a little for suggesting this)). Of course all of this can apply to video arcade games, jukeboxes and so forth.

As an aside, one of the great things about iBeacon is that it’s very accessible for experimentation. You can pick up beacons fairly cheaply (Estimote, Bleu), and you can also make your own device act as a beacon. The iOS APIs are pretty straightforward (Region Monitoring, CBPeripheralManager).
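To give a sense of how approachable the advertising side is, here’s a sketch of making an iOS device act as a beacon. The UUID and identifier are placeholders, `self.peripheralManager` is an assumed property, and state/error handling is elided; you’d wait for the manager to report powered-on before advertising.

```objc
// Sketch: advertising as an iBeacon from an iOS device.
// The UUID and identifier below are placeholders.
NSUUID *uuid = [[NSUUID alloc] initWithUUIDString:@"E2C56DB5-DFFB-48D2-B060-D0F5A71096E0"];
CLBeaconRegion *region = [[CLBeaconRegion alloc] initWithProximityUUID:uuid
                                                                 major:1
                                                                 minor:1
                                                            identifier:@"com.example.pinball"];
// peripheralDataWithMeasuredPower: builds the advertisement payload;
// nil means "use the device's default calibrated power".
NSDictionary *advertisement = [region peripheralDataWithMeasuredPower:nil];
// Only call this once the manager reports CBPeripheralManagerStatePoweredOn.
[self.peripheralManager startAdvertising:advertisement];
```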

This is an interesting time for pinball. A number of new, smaller game makers have popped up, like Multimorphic and Dutch Pinball, just the kind of manufacturers that could be ready to try something like this. Here’s hoping.


Musings On Creating Immutable Collections Mutably

April 3, 2013

Sometimes we need to create an array whose length and members are not known at compile time. Oftentimes that looks something like this:

NSMutableArray *things = [NSMutableArray arrayWithCapacity:numThings];
for (int i = 0; i < numThings; i++)
    [things addObject: ... ];
// use 'things'...

If our collection is not to be changed later, and we’re feeling particularly pedantic, we might do something like this:

NSMutableArray *mutableThings = [NSMutableArray arrayWithCapacity:numThings];
for (int i = 0; i < numThings; i++)
    [mutableThings addObject: ... ];
NSArray *things = [mutableThings copy];
// use 'things'...

But mutableThings is still accessible. Best to tuck it away:

NSArray *things;
{
    NSMutableArray *mutableThings = [NSMutableArray arrayWithCapacity:numThings];
    for (int i = 0; i < numThings; i++)
        [mutableThings addObject: ... ];
    things = [mutableThings copy];
}
// use 'things'...

But that looks pretty unpleasant. How can we improve upon this?

One approach would be to use blocks in a simple fashion. The compiler’s inference of the return type can make the equivalent of the above a bit more graceful:

NSArray *things = ^{
    NSMutableArray *mutableThings = [NSMutableArray arrayWithCapacity:numThings];
    for (int i = 0; i < numThings; i++)
        [mutableThings addObject: ... ];
    return [mutableThings copy];
}();
// use 'things'...

I like this, but there’s a downside: we can’t use Step Over in the debugger to step through this code (we can still Step Into). You could also consider that an upside, depending on what you’re debugging.

We can take this a step further by creating a category on our favorite collections objects, and use it like this:

NSArray *things = [NSArray ap_arrayViaMutableArray:^(NSMutableArray *things){
    for (int i = 0; i < numThings; i++)
        [things addObject: ... ];
}];
// use 'things'...

A strong advantage here is that we’ve abstracted away all of the alloc, init and copy noise. Debugging may be a bit more cumbersome, since Step Into is going to step into the implementation of our category method before it gets to calling the block.
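The category method itself can be tiny. Here’s one possible implementation; the method name matches the usage above, but this is my sketch rather than canonical code, and the category name is made up:

```objc
// Sketch of a category backing the usage above: run the caller's block
// against a private mutable array, then hand back an immutable copy.
@implementation NSArray (APMutableCreation)

+ (NSArray *)ap_arrayViaMutableArray:(void (^)(NSMutableArray *mutableArray))block
{
    NSMutableArray *mutableArray = [NSMutableArray array];
    if (block)
        block(mutableArray);
    return [mutableArray copy];
}

@end
```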


SwiftText 1.1.0

October 30, 2012

I’m pleased to announce that SwiftText 1.1.0 is now available! Here’s what’s new:

  • Retina support.
  • Font is customizable.
  • SwiftText may optionally be kept on top of all windows until it is dismissed.
  • Added “Append to SwiftText” Mac OS X service.
  • Sandboxed.
  • Improved status item behavior.

Another Octopress Blog

September 9, 2012

I made the transition from WordPress to Octopress. I’d been a WordPress user since sometime in 2003, when I started my 1128 blog. WordPress was incredible: the installation was remarkably painless and the features (admin interface, layouts, etc.) were top notch as well. Over the years WordPress has improved quite a bit, too. But WordPress has also grown somewhat notorious for performance issues (under heavy load, anyway), and there were occasional security problems as well.

Perhaps due to these challenges, static site generators like Octopress are becoming more popular. The advantage is that the code runs on your local machine and generates a bunch of HTML files. Then you upload (rsync, in my case) them. This makes your site faster – it doesn’t have to converse with a database server to get the posts – and a lot more secure. There’s no PHP script running on the server with bugs waiting to be compromised. The hosting account itself is still hackable, of course, but there’s only so much we can do about that.

So when my friend Robert emailed a couple months ago to let me know that my blog had some link spam in the footer (coded so that it didn’t when I visited the site), I decided it was time to say goodbye to WordPress. I blog rarely enough that it needs to be frictionless, and though it was interesting to track down where the spam was being injected, I doubted it would be as interesting in future occurrences. It was time to get serious about Octopress.

As an aside, I did spend some time trying out Hyde, because Python is higher on my list of preferred languages than Ruby (important if I ever wanted to tinker with the code), but Hyde is not (yet?) as much of an off-the-shelf blogging platform as Octopress is. Octopress won.

For the most part the switch was easy; I used Matt Gemmell’s post as a reference. Initially there were no comments, because A) that’s fashionable these days and B) it’s a static site. However, this weekend I opted to set up Disqus, which is akin to off-shoring your comments.

I do miss WordPress to a certain degree. It was much easier to sign into the admin interface and compose a new post, write a little, and decide to save it as a draft (and forget about it). With Octopress it’s a more manual process.

  • rake new_post['Some Title']
  • Fire up TextMate 2 and open the .markdown file.
  • Write.
  • Don’t finish post for various reasons. Go to the Octopress docs to remind myself how to set a post as a draft (note to my future self: published: false).
  • rake preview / rake generate / rake deploy

The downside of this is that if you never finish a post (or decide not to post it) and never set it to be a draft, when you come around later and actually post something you could easily inadvertently post your not-intended-for-publication post. So some care is required. For now I’ve modified the Rakefile to include the published: false line in the new_post template.

You served well, WordPress, and you will be missed. My thanks to the many developers who made WordPress great.


Launch at Login Implementors Support Group

September 6, 2012

I’ve had a number of improvements in the pipeline for SwiftText 1.1.0 for quite some time, and finally submitted it to the Mac App Store tonight. One of the most difficult features to get right was Launch at Login. This feature was in 1.0 and was fairly easy to implement at the time. However, in this brave new world of sandboxed apps, launch at login is a bit of a beast. If you’re thinking about adding it to your application, or struggling to make it work, perhaps this post will help.

Sandboxed apps need to use a helper tool to do the actual launching for them. In short, the helper app is bundled under Contents/Library/LoginItems and its bundle identifier is passed to SMLoginItemSetEnabled(). delite studio has a very helpful post on this, but you’ll want to make sure to read Mike Cohen’s own post containing some key corrections. Your helper app will need to do some path acrobatics to concoct the path to your main application, which it launches with -[NSWorkspace launchApplication:], and then calls [NSApp terminate:nil].
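The path acrobatics are mundane but easy to get wrong. A sketch of the helper app’s launch logic, assuming the standard Contents/Library/LoginItems layout described above (bundle names here are illustrative):

```objc
// Sketch: from inside Main.app/Contents/Library/LoginItems/Helper.app,
// walk up four path components to reach Main.app, launch it, and quit.
// Helper.app -> LoginItems -> Library -> Contents -> Main.app
NSString *path = [[NSBundle mainBundle] bundlePath];
for (int i = 0; i < 4; i++) {
    path = [path stringByDeletingLastPathComponent];
}
[[NSWorkspace sharedWorkspace] launchApplication:path];
[NSApp terminate:nil];
```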

If you follow the above, you should have a working launch at login feature. There was one more hurdle for me to overcome, however. My application passed the Validate step within Xcode, but when I uploaded the binary I was told my application had an “Invalid binary”:

Invalid Provisioning Profile Location - The provisioning profile for your Mac OS X app must be located in the Contents directory of the main app bundle. A provisioning profile is optional, but you cannot submit more than one.

What the iTunes Connect robot was trying to tell me was that my helper tool was, evidently, incorrectly signed. Using this Stack Overflow answer I was able to re-sign it in a Run Script build phase, which passed the robot’s muster. (One note about that answer: I found that copy-pasting the lines from the answer yielded a non-working script. By retyping the paths I was able to get it working. Bad characters in the mix, I presume.)

Why so much work to get such a simple feature working? The launch-at-login APIs were never developer-friendly: the old Launch Services method was a couple dozen lines, all told. Here we have less code, but many more obstacles to maneuver. Why the helper application, though? As far as I can tell, the SMLoginItemSetEnabled() call (part of the Service Management framework) is not really intended for matters so trivial as launch at login. I suspect that it was simply the last man standing after the Launch Services method was knocked out by sandboxing. From a developer’s perspective this is a pretty unpleasant pattern. I see three possible explanations:

  • Apple didn’t have time to concoct a new sandbox-friendly API for this in time for 10.7/10.8.
  • Apple doesn’t see it as unpleasant, or not unpleasant enough.
  • My expectations are too high.

My money’s on the first. With the Service Management method, the user must use the application itself to toggle launch at login. Applications with this enabled don’t show up in the Users & Groups Login Items pane, nor do they show a checkbox in the ctrl-click menu on the dock. Not the best user experience. Perhaps there will be a better solution in 10.9. Now to write a bug in hopes that it will help that happen. Here’s what I’m thinking:

@interface NSApplication (LaunchAtLogin)
@property BOOL launchAtLogin;
@end

MTRandom: An Objective-C Random Number Generator

September 3, 2012

Most of the time when we need a random number, we use srandom() & random(), or perhaps arc4random(). In the old days we might have used srand() and rand(), until somebody told us that random() was better.

Sometimes we want a sequence of random numbers to be reproducible. Say we’re writing our own Minecraft and we want users to be able to share seeds for the landscapes they find. The above functions work fine for that, as long as we’re the only one using them. Unfortunately they’re not thread safe (see my test code to confirm). They manage a single state behind the scenes, so if we wanted to run, say, two landscape generation routines in parallel, we’d be out of luck.

That’s where MTRandom comes in. MTRandom is an Objective-C wrapper for Mersenne Twister, a pseudo-random number generator.

MTRandom *random = [[MTRandom alloc] initWithSeed:5];
uint32_t r = [random randomUInt32From:5 to:10];   // [5, 10]
double   s = [random randomDouble];               // [0.0, 1.0]
double   u = [random randomDoubleFrom:0 to:M_PI]; // [0, 3.14159...]

There are a few more examples in the README.markdown.

It also conforms to NSCoding and NSCopying, so you can archive it with game state, or make copies when searching a game tree. The repository includes a set of basic unit tests, and best of all it’s BSD licensed (even Mersenne Twister). It’s available on GitHub:


Xcode Snippets

July 25, 2012

Xcode’s snippets feature is rather handy. It’s what drives many of the autocomplete templates. For instance, if you type init on a new line and hit return, Xcode will create a template for an -init method.

Until recently I haven’t bothered to create my own snippets, which can be done in three easy steps:

  • Select a block of text you wish to make a snippet.
  • Drag it onto the Code Snippet Library, which is part of the Utility Area (lower right-hand side).
  • Customize it by entering a Completion Shortcut, or by wrapping tokens in <#token#> so that you can tab between them after the snippet has been inserted.

I do a fair amount of Quartz development, so today I created a snippet which has already proven useful. I’ve set its completion shortcut to cgbitmapcontextcreate:

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(<#data#>, <#width#>, <#height#>,
                                         8, <#width#> * 4, colorSpace,
                                         <#bitmapInfo#>);

I’ve made a separate one just for color spaces, cgcolorspace, as for some reason the repetitiveness of creating and releasing RGB CGColorSpaceRefs has lost its luster.

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

Thankfully snippets can stomp all over the symbol names that would otherwise be provided by autocomplete, so when I’m about to create yet another CGColorSpaceRef, perhaps forgetting my snippet, Xcode will show me the snippet as the first autocomplete option in addition to the usual bunch.

A pleasant side effect of this is that you can fix some of Xcode’s annoying autocompletes, due to how snippets take priority over other autocomplete suggestions. For instance, when I type NSStri in Xcode, the default autocompletion is NSStrikethroughStyleAttributeName. Probably not what I wanted. I can use snippets to fix this by creating an NSString snippet with the very same completion shortcut, and voila, autocomplete gives me just what I want when I next start typing NSString.


Briefly on Sparrow et al.

July 21, 2012

Beautifully designed 3rd party apps are part of what makes the Mac such a great platform. We can be fairly emotional about our beloved Mac ecosystem. So it’s natural to feel injured when we learn that one such app has suddenly become encased in ice. Matt Gemmell is right, however, when he says:

People try to dress their reaction up as a principled stance or a community cause, but that’s at best wrong-headed thinking, and at worst wilfully egocentric bullshit.

Don’t worry; he’s not going to sugar-coat it any more than that.


APLevelDB: An Objective-C Wrapper for LevelDB

January 30, 2012

TL;DR: I made an Objective-C wrapper for LevelDB, called APLevelDB.

I’ve been enjoying tinkering with Redis lately, which has left me wanting something similar for Mac and iOS projects. Tokyo Cabinet seems to be the big name in speedy key-value stores, but its LGPL license leaves much to be desired. I finally settled on LevelDB, a similar offering from Google with a much friendlier license (New BSD).

LevelDB’s C++ API is pretty reasonable, but since I don’t like naming all of my files .mm, and stylish Objective-C APIs are part of what makes developing in Cocoa/iOS so much fun, I decided to write my own. I hope you’ll agree that it’s quite pleasant to use; it’s called APLevelDB. It has a few opinions, but leaves the big decisions to you. You can find usage examples in the readme.


A Raindrop for Acorn

January 17, 2012

CloudApp is one of my favorite tools for sharing images quickly, and I also use Acorn quite a bit for quick image editing tasks. I’ve been wanting a quick way to upload a snapshot of what I’ve got in Acorn using CloudApp, so tonight I wrote AcornRaindrop.

AcornRaindrop exports the active document in Acorn as a PNG and uploads it using CloudApp. It’s a Raindrop, which is a plugin for CloudApp. You can download it, or peruse the source on GitHub. It is made available under the MIT License.

Note that testing at this point has been ridiculously limited. It works for me; hopefully it will work for you too!

The implementation is rather simplistic: it uses osascript to instruct Acorn to export the active document, then passes the result along to CloudApp. My first attempt used NSAppleScript, but for reasons I don’t fully understand it blocked for about 30 seconds while running the script – and only under CloudApp. Something to do with loading frameworks, perhaps? At any rate it runs quite quickly with osascript.

Happy CloudApp+Acorning!

v1.0.1, 1/18/2012: Gus Mueller was kind enough to contribute some changes demonstrating how to connect directly to Acorn using NSConnection, so the osascript call is now history. This allows AcornRaindrop to base the filename uploaded to CloudApp on the filename of the exported document.


All Too Familiar

September 19, 2011

There was a post on The Guardian’s Technology Blog by Matthew Baxter-Reynolds last week. The piece couches itself as an overview of mobile platforms for the developer looking to get his or her feet wet.

Turning its attention to iOS, The Guardian cuts with broad strokes on Objective-C and Xcode (“horrendously, absolutely awful”), and then opines:

The fact is that if your day job involves sitting in Visual Studio writing C# applications, or building Java applications in Eclipse (which will be most of you — albeit not necessarily in Eclipse), when you fire up Apple’s Xcode and start building CocoaTouch applications in Objective-C you’re going to come face-to-face with a toolset that has not had the sort of love put into it that the open source community has put into the Java toolset and associate platforms, or that Microsoft has put into VS and .NET over the past 10 years.

Today, John Gruber replied:

Objective-C is different than C++ or Java. Xcode is different than Visual Studio or Eclipse, and Xcode 4 is very different from previous versions of Xcode. Baxter-Reynolds certainly wouldn’t be alone in saying that he doesn’t like these differences. But it’s curious to argue Apple developer tools and frameworks are deficient due to a lack of time put into them. In numerous ways, both linguistically and tools-wise, Xcode, Objective-C, and Cocoa/Cocoa Touch are the evolutionary descendants of the NeXT developer platform from 1989.

Both of them are right, and both of them are wrong. Baxter-Reynolds’ “horrendously, absolutely awful” comment is ordinary internet grandstanding (aka trolling). The iOS platform would not be where it is today if Objective-C and Xcode were horrendously awful. Part of the success of iOS is that it is the first great mobile development platform. I’d wager that Gruber is partially correct in thinking that Baxter-Reynolds’ viewpoint here is rooted in fear/dismissal of the unfamiliar.

Where Gruber is wrong, however, is in his own dismissal of Baxter-Reynolds’ argument as simple, again, fear of the unfamiliar. Before I landed my dream job writing Mac and iOS apps (at last!), I spent many years with Visual Studio writing C++ and later C#. And while Xcode has been improving by leaps and bounds, let me tell you: Microsoft has set the bar very high with Visual Studio.

Visual Studio excels in two areas: debugging and autocomplete. The Visual Studio debugger is blazing fast. Stepping through statements is super speedy, and the variables view keeps up without breaking a sweat. Xcode’s debugger has a tendency to take its time when stepping, and the variables view can be ridiculously finicky.

My perception is that part of this sluggishness has to do with gdb, and maybe lldb will help to improve this over time. If you’re thinking that debugger stepping speed is a petty thing, think again: just like responsiveness in iOS is clearly a big deal – something Apple has put a lot of energy into – so is responsiveness in a developer’s tools.

Xcode’s autocomplete has come a long, long way in the last couple of years. However, it’s just not a fair fight against autocomplete in a statically typed language like C# or Java. Static typing allows the IDE to provide fast, accurate autocomplete. With Objective-C’s dynamic typing, autocomplete becomes more of an art than a science. Additionally, the dot notation syntax of C# and Java gives the IDE another leg up: the dot is the perfect cue to bring up autocomplete. Objective-C does not lend itself to this quite so easily.

As to the question of the amount of time put into Xcode versus Visual Studio, it’s worth pointing out that Apple’s Developer Tools team is almost certainly significantly smaller than the Visual Studio team.

Xcode has come a long way, and it has one really amazing component – Instruments – that, as far as I know, Visual Studio doesn’t have anything close to (that isn’t a commercial add-on product). But Xcode is lacking in some key areas that can make a big difference in a developer’s perception of tool quality. I think it’s important that we, as proud Apple platform developers, are honest about that.


SwiftText 1.0

May 3, 2011

I’m pleased to announce the release of SwiftText, a text scratchpad app for Mac that makes itself available with a keyboard shortcut – and disappears just as quickly. It’s available now on the Mac App Store; you can see more screenshots on the SwiftText page.

Several months ago I needed an app just like this, but all of the possible solutions I found did too much. I simply wanted a lightweight app to pop up with a keystroke, let me type immediately, and get out of my way again with an escape when I was done. So I wrote SwiftText and found it useful enough that I figured there are probably other people out there with similar needs.

SwiftText is priced at US$1.99, but it’s available through April 8th at a special price of US$0.99. If you decide to give it a go, I hope you find it to be as useful as I have, and don’t be shy about sending your feedback (use the feedback tool in the app, or the address here). I have several ideas cooking for 1.1; I’d love to hear yours.


Acorn + AppleScript: Adding a White Background

March 5, 2011

Say you have a lot of transparent images that you need to add a white background to. Flying Meat’s Acorn has some nice scripting bindings that are up to the task.

Not being a huge fan of AppleScript, I started out writing JSTalk inside an Automator task, but frankly I couldn’t figure out how to fill the new layer. Filling is, however, documented for AppleScript. I got an AppleScript version working, but quickly found that (at least on my system) running AppleScript inside Automator is very slow.

So I abandoned Automator and adapted that code to an AppleScript droplet. After saving it as an application from AppleScript Editor, I had a speedy app bundle that I could drop images on. Here’s the code:

on open input
	repeat with anImage in the input
		tell application "Acorn"
			open anImage as alias
			tell document 1
				duplicate layer 1
				fill with color (65535 & 65535 & 65535)
				merge visible layers
			end tell
		end tell
	end repeat
end open

From Blender to iPhone

February 10, 2011

My first iPhone game, Shufflepuck, was written in the dark ages – before the iPhone SDK was even available – and I modeled the 3D world using the tools I had: basic geometry equations applied to generate all of the vertices programmatically. And it worked! For a good while it was the sexiest table shuffleboard game on the App Store.

I was able to get by on programmatic models for that game, but if I wanted to be able to do something more interesting I was going to need to step things up. I write this now not as anything near an expert, but as a developer who has forced himself to wade into unfamiliar territory and figure out how to put the pieces together. My goal was to create a simple, textured model in a 3D modeling package and display that model in all of its textured glory on the iPhone.

Before I describe the steps I took, there is of course an easier path: using a game engine. There are several for iOS, most of them with their own model loaders and probably tutorials on how to do things end-to-end. For a few reasons, some more valid than others, I decided not to use any of these. They might make loading my textured model ridiculously easy, and there’s certainly value to that, but my research told me I could probably do it without them. Perhaps I just wanted a challenge.

Step 1: Learn Blender

Blender is a popular open source, cross-platform 3D modeling app. If your budget resembles mine, rockstar 3D apps like Autodesk’s Maya simply aren’t an option. So you will suffer with Blender, app of bewildering UI conventions. (I looked into Cheetah 3D briefly but at that stage I had already spent enough time with Blender that I thought there might be hope for it.) I used Blender 2.49b; it is my understanding that quite a bit has changed in 2.5x.

You don’t need to learn all of Blender before you export a model into your app. You just need to have a feeling for how to create objects, select vertices, move vertices, and so forth. Here are my recommendations on how to get there:

  1. Dust off your Mighty Mouse. The developers of Blender did not design it with your Magic Trackpad in mind.
  2. If at all possible, use a full size keyboard with a numeric keypad. The keypad is an important part of 3D view navigation.
  3. Force yourself to complete Unit 1 of Blender 3D: Noob to Pro, as well as Unit 2 sections 2A and 2B.

It will not be easy. Many times you will suspect that you will never feel comfortable in this unsettlingly open sourcy soup of window types and button panels. Particularly if you try to soldier on with your trackpad, as I did, unwarned.

Step 2: Learn Blender UV Map Basics

Now that you know your way around Blender (you do know how to create an icosphere and select a few of its vertices, yes?), skip to UV Map Basics, a later section of Noob to Pro. Frankly this section is not on the same quality level as prior sections, but I managed to cobble together some sense out of it.

Create a little Earth icosphere and set up the UV mapping as described. You should end up with a crude-looking Earth in Blender’s 3D view.

UV Mapping Earth in Blender 2.49b

Step 3: Get Jeff LaMarche’s Blender Objective-C Export Plugin

Next we need to get that Earth’s vertices, normals and UV texture coordinates out of Blender. There are two ways we could do this:

  1. Export the file as a common 3D object format, such as .obj. If we choose this route, however, we need some code to read that format in and turn it into vertices, normals, etc. Jeff LaMarche (iphonewavefrontloader) and Bill Dudney have written code to do this, but it sounds like this is not the favored approach given differences in how normals are expressed (vertex vs. surface).
  2. Use Blender’s Python API to export your models in a custom format.

Jeff LaMarche has created a script for Blender that does just this: it exports your object as an Objective-C header file. It is perhaps most conveniently used with his OpenGL ES project template, which provides the TexturedVertexData3D structure.

Copy the script into Blender’s scripts folder, which on the Mac is tucked away in a dot-folder inside the Blender app bundle. Unless you have Finder configured to show dot-files, you can get there by opening the folder containing the Blender app bundle in Finder, then pressing Shift-Command-G and entering the path to the scripts folder.

Now if you switch one of your Blender windows to “Scripts,” you will be able to click Scripts, Export, Objective-C Header and enter a .h file to save the currently selected object to.

Step 4: Incorporate The Model Into Your Project

You should now have a .h file, which contains the vertices, normals and texture coordinates for your icosphere, as well as a texture image, most likely of Earth.

If you followed the UV Map tutorial precisely, your texture image size will be 4096 x 2048, which is both quite large and not square. So you will need to make it smaller and squarer. I made a 1024x1024 texture. Keep in mind that texture coordinates are in the range 0.0-1.0, so when you make the texture square you will want to stretch it vertically so that the V texture coordinates are correct.

If you are using LaMarche’s template, edit GLViewController to:

  1. Instantiate the texture. You will only need to do this once:

texture = [[OpenGLTexture3D alloc] initWithFilename:@"Earth.png"];

  2. Bind the texture and draw the vertices. You will find, at the bottom of the generated .h file, the lines necessary to draw the object. You may need to fiddle with the translate and scale values used here, depending on your object.
- (void)drawView:(UIView *)theView
{
    glColor4f(1, 1, 1, 1);
    [self.texture bind];
    glTranslatef(0, 0, -2);
    glScalef(0.5, 0.5, 1);

    // Quick and dirty hack to make the model rotate:
    static float t = 0.0;
    glRotatef(t, 1, 1, 0);
    t += 1;

    // Drawing code from the header:
    glVertexPointer(3, GL_FLOAT, sizeof(TexturedVertexData3D), &SphereVertexData[0].vertex);
    glNormalPointer(GL_FLOAT, sizeof(TexturedVertexData3D), &SphereVertexData[0].normal);
    glTexCoordPointer(2, GL_FLOAT, sizeof(TexturedVertexData3D), &SphereVertexData[0].texCoord);
    glDrawArrays(GL_TRIANGLES, 0, kSphereNumberOfVertices);
}

Step 5: Enjoy

You should now have an Earth happily spinning away in your iOS Simulator; at right is my Earth.

1024x1024 is a pretty large texture; you should be able to get away with something smaller, depending on your application and how the model is used. (I chose that size because that’s what it took to look reasonable as a still.)

Hopefully you now have a pretty clear idea of how to use 3D models in your application. You can probably imagine some enhancements. For example, it would be nice to store models as app bundle resources rather than compiling them in as a header file. This is one of the nice things about Blender’s API: if you know a little Python you can write your own exporter.

Perhaps that will be the subject of my next blog post.


Inheritance and Xcode Data Formatters

January 1, 2011

If you’re using data structures in your Xcode project and you’ve had to do any level of debugging, perhaps you’ve experimented with setting up Xcode Data Formatters. Data formatters determine what appears in the summary column of the debugger’s data outline view.

Data formatters are easy enough to set up for simple cases, but what if your classes are in a non-trivial arrangement – subclasses upon subclasses and so forth? Say all of the model objects in your application have the same base class, APModelObject, and on that object you have defined a name value, whose ivar is mName. You configure a data formatter directly on APModelObject:


Now you can easily see the name of any of your model objects by drilling down to the APModelObject superclass in the data outline. Which gets old. Very quickly. How can we see the name in our APModelObject subclasses? And their subclasses?

You might expect, as I did, that you can simply reference mName again. Unfortunately this gets us a grayed-out summary field, indicating that Xcode doesn’t like our data formatter. The answer is in the above document, disguised as guidance for C++ developers:

In C++, to access a member defined in the superclass of an object, the path to the member must include the name of the superclass. For example, %Superclass.x%.

Even though we’re writing Objective-C, let’s try it. In our APModelObject subclass (call it APDinner), we use this data formatter:


And now we can see the name in that subclass. If we subclass APDinner with APTurkeyDinner, how do we see the name when debugging APTurkeyDinner? You might expect, as I did, to be able to reference the base class directly, but instead we must be very precise:


Setting all of this up gets rather tedious, but at least we only have to suffer through it once for each class. There is a slight shortcut, however. Let’s say that in APTurkeyDinner we only really care about the summary information for APDinner. Perhaps you’ve guessed that we can display its summary using this formatter in APTurkeyDinner:


Now you know all you need to know to have a more pleasant debugging experience with your non-trivial models, be they dinner-themed or otherwise.


A Manufacturer-Supported Pinball Mod Community

November 27, 2010

Pinball News has an interview with pinball player and Valve game designer Cayle George, in which he states:

One of the biggest things I think is missing from pinball, that is often encouraged in video games, is a mod community that is supported by game developers. Pinball manufacturers should make tools available to the public for modifying the rule set and code of their games. I truly understand the issues, problems, and liability questions this raises, but a successful community of end user game developers would help pinball grow beyond its current marketplace.

Cayle makes an interesting point. Independent mod communities such as P-ROC (with which I am involved) and FreeWPC exist and are showing signs of promise, but it sounds to me like Cayle is more interested in Stern Pinball supporting mods, which is an entirely different animal.

From time to time there is a call on the newsgroups for Stern to open source its game software, usually with the suggestion that this would be a no-brainer. As far as I can tell this demand is mostly driven by game owners feeling that their game software as provided by Stern is incomplete (programmers pulled off the project before it is polished, or worse), and that if the source code were available somebody in the pinball community could enhance it.

Without any good information (I don’t know what Stern’s source code/OS looks like), I have a feeling that the chances that somebody in the community would be able to “finish” a game in a reasonable amount of time would be fairly slim, but it’s an interesting notion nonetheless. Non-developers are prone to vastly oversimplifying the amount of work required to make some modification, or add a mode. It’s one thing for a developer to write something new; it’s quite another to get inside another developer’s head and make modifications to their work, particularly in something like a pinball machine, and particularly when it probably wasn’t written with future developers in mind. Pinball software is done once, relatively speaking; there generally isn’t a “2.0” a year or two later with new features that we see in the desktop app world.

Despite these challenges, I think there are three reasons Stern has yet to release their source code:


  1. New versions of game ROMs floating around have high potential to create a nightmare for Stern support. As we have experienced in the P-ROC project, pinball enthusiasts can have a difficult time grasping certain concepts, and as such I can easily imagine game owners running custom software and either not knowing that, or not understanding that this puts them in a position where their game is not supportable by Stern. Let alone fending off support calls from people trying to get the software tool chain set up, or having trouble getting their DMD images in the right format.

  2. It simply doesn’t make good business sense for Stern to do this. Even if they sold SDK licenses, my guess is that the positive impact on their balance sheet would be negligible. It just wouldn’t sell enough additional games. For a company with a product like Stern, open sourcing is a cultural decision, usually made with the understanding that there will be some cost associated, but that it’s worth it in order to suit the ideals of the owners. If we know anything about Stern it’s that they are about making and selling amusement devices, primarily to operators. Not creating warm fuzzies in the community. And they believe this is why they have survived. Why would they change? Why would they put money into creating the documentation to enable people in the community to meaningfully modify the software in their games?

  3. Stern does licensed games. My understanding is that various aspects of such games must be approved by the license-holder. I can’t imagine license-holders being too thrilled about the licensee supporting reworking of licensed assets (sound, music and video) outside of their control. Perhaps this could be overcome if Stern were to make an unlicensed title, but that would be at odds with past behavior.

I think it’s pretty clear that Cayle understands these issues too, and I should make it clear that while I think there are a lot of very good reasons for Stern not to do this (in addition to the question of whether developers in the community could do what other members of the community seem to think they can), I certainly wouldn’t complain if they or some other manufacturer did take a chance on this route. I could even see it as likely for a very small manufacturer, such as Snow Mountain Pinball, to try out. I do agree with Cayle that it would add some much-needed vitality to pinball.


Ripe for Domination: The Mac Twitter Client Market

August 18, 2010

I remember my excitement when Tweetie for Mac was released. On the Mac my Twitter client of choice was the website itself. I used Tweetie on the iPhone, but none of the available Mac clients matched my mental model of what a Twitter client should be. Tweetie for iPhone however was a perfect match. So when Tweetie for Mac became available and I saw that it had the same organization and similarly excellent visual treatments as Tweetie for iPhone, I wasted no time in purchasing my license.

Since its release Tweetie for Mac has become the dominant native Mac client, but the product has not aged well. Updates have been rare and largely focused on bug fixes (thankfully there have at least been those). But indications of future growth for Tweetie for Mac have been rarer still.

For me personally what has been most aggravating about Tweetie for Mac over time is its unflinching un-Mac-like-ness in the timeline. Have you ever tried to drag a URL from a tweet into Safari? Or use Snow Leopard’s new Services features to process text? Or get the definition of the word the mouse pointer is hovering over using Ctrl-Cmd-D? Tweetie for Mac draws its tweets as custom views, and because of this implementation decision the user can’t take advantage of all of these great built-in features of Cocoa’s text system that are found in almost every application in the system – features that are a big part of what make Macs great. Instead, users rely on the developer emulating these features, as he has done (so far) with only text selection and URL-clicking. I appreciate the effort, but emulating the correct behavior will forever be playing catch-up with the system controls.

Common keyboard shortcuts such as Fn-Up/Down Arrow (page up, page down) for navigation are missing as well – another conspicuous inconsistency. Perhaps you can see why the fact that Tweetie for Mac won the 2010 Ars Design Award for “Best Mac OS X User Experience” has mystified me.

Hibari 1.0, a new Mac Twitter client, was released today with various high profile links to it. It has a gorgeous website and attractive screenshots. As you can imagine I was excited to take it for a spin. After all, it has some great features: mute (a sort of silent unfollow), keyword block/filtering, individual tweet hiding and in-timeline conversation expansion.

While I am happy to find that Hibari 1.0’s timeline works just as a Mac user would expect with URL dragging, text services, and so forth, what’s puzzling about it is what it doesn’t offer:

  • You can’t use Hibari to follow new people or unfollow the tedious (@HibariApp was kind enough to respond to my question on this, explaining that they didn’t want to make the context menu any larger).
  • Clicking on any @name opens the user’s Twitter page in your web browser, which is very jarring if you’re accustomed to Tweetie for Mac’s ability to view any user’s tweets without switching apps. An unfortunate side effect of this is that you can only use the in-timeline conversation expansion on your own conversations; if you want to jump into another conversation you’re on your own.
  • Most frustrating, Hibari doesn’t hold your position in the scroll view as new tweets appear. Instead it always shows the most recent tweets in your timeline. (Update: @HibariApp informs us that this is on the way.)

Initially I was very put off that these features were missing. After all, it’s Hibari 1.0. Not 0.9, not Beta 3, but one-point-oh. Calling a product “1.0” and accepting money for licenses signals to the customer that the product is Capital R Ready. You probably have a number of features sketched out for subsequent revisions, but you have presumably included all of the features you view as necessary to use the product. We can only assume that Hibari’s developer, Victoria Wang, did not view these features as necessary.

I qualified the sentence above with “Initially” because as I re-surveyed the top Twitter clients for Mac in writing this article, I realized how unique portions of Tweetie for Mac are. Neither Hibari, Twitterrific, nor Bluebird (1.0 Beta 3) supports viewing an arbitrary user’s tweets. None of them supports managing your follow/unfollow status, either. They are focused only on viewing the contents of your timeline, your mentions, and your direct messages.

What I find most disappointing about Hibari is that it has made its 1.0 splash demonstrating some great new features, but it lacks features that Tweetie (and other iPhone clients) have made users like me consider standard. Standard to the point that we will continue using Tweetie (despite its significant flaws) and hopefully check in again on Hibari once it reaches 2.0. As much as I would like to see these features in Hibari sooner than later, I would rather they be designed with the same care that seems to have been taken with the rest of the application.

Today the Mac Twitter client market is Tweetie’s to lose. Hibari shows promise with its filtering preferences but sets its sights below providing what I would describe as a complete Twitter client. The draw of Tweetie is not just that it provides a pleasant way to keep up-to-date on my Twitter timeline, but that it provides me with the way I like to consume Twitter best, end-to-end. This is the potential power of any native Mac client, for any service. Tweetie has shown us what’s possible; its languishment has been and continues to be a talented developer’s golden opportunity.

Update 9/21/2010: Evan Williams has stated: “We have no plans for an [sic] “official” desktop clients,” which doesn’t seem particularly encouraging for Tweetie’s future.


Tip of the Hat

June 28, 2010

I saw (via Kottke) that Steve Carell says he will be leaving The Office at the conclusion of the seventh season. As someone who was a fan of the original The Office (the British one, you know) and who was dismayed to learn that The Office was coming to NBC (Are there no good, original ideas for shows waiting to be made? Are we so devoid of creativity that we have to adapt every hit BBC show?), and who, after several years, came to have a certain level of respect for the new adaptation The Office, I am happy to read this news. Not that I had reason to suspect otherwise, but I’m glad somebody over here knows how to quit when you’re ahead.

Part of the charm of the original The Office, part of what made it special, was that it comprised two seasons, a wonderful “Christmas special” tying up some loose ends, and poof. The End. Very neat, very very good, and complete. When every indication is that content creators cannot let go of anything, that every property must be propped up again and again for milking (cf. endless classic movie remakes, even prequels to long-thought-complete trilogies, etc.), it is refreshing to see a message that’s all too rare: My work here is done; it’s time for something new.


Fun with Blocks (in Objective-C)

June 28, 2010

I’ve been working on a Mac application recently that’s 10.6-targeted, which has afforded me a multitude of opportunities to use blocks (Apple’s C language extension introduced in Mac OS X 10.6) to get some tricky features written with a level of grace that previously wouldn’t have been possible.

If you’re a Mac or iOS developer and you’re wondering what good blocks are outside of NSOperationQueue/Grand Central Dispatch, or if you’d like to see some fun ways to take advantage of blocks, check out the most recent installment of Ask Big Nerd Ranch on using blocks in Objective-C, authored by yours truly.

BNRBlockView (gist) is my favorite of the bunch, and the NSThread additions are invaluable building blocks (pun not intended, void where prohibited).


NSBackgroundStyleRaised Rx

May 12, 2010

I was having quite a bit of trouble getting NSBackgroundStyleRaised to work on my NSView subclass’s label, an NSTextField I was creating programmatically. I tried fiddling with the background color, I tried disabling my drawRect:. Really hard.

The strange thing was that it was working on labels I had created in Interface Builder but was setting the background style on programmatically. I didn’t find the answer until I looked closely at the settings that Interface Builder makes on labels. Here’s the solution:

[textField setDrawsBackground:NO];

And now I have a label that looks great on its superview’s gradient background.


Introduction to pyprocgame

December 7, 2009

Gerry Stellenberg’s P-ROC became available today and I thought this would be a good time to write about the open source software that’s written to work with it. First, though, I want to talk about what P-ROC is and what makes it so cool.

What’s P-ROC (and Why)?

P-ROC is a circuit board that allows you to control a pinball machine using software on your computer. It’s basically a big digital I/O board with an FPGA at the center that connects to your computer (Mac, Windows, Linux) over USB. P-ROC replaces the CPU board in your pinball machine (or, if you’re making one from scratch, is the CPU board, in a sense) and connects to the switch matrix, the driver board, and the DMD itself to provide complete control.

Before P-ROC, if you wanted to create your own pinball game with your own rules, you had to either:

  • Learn to write assembly or rig up a C compiler to generate the requisite machine code and burn your own ROMs (essentially what the FreeWPC project is doing).
  • Write code in a custom high-level language designed for use on pre-DMD-era ’80s solid state pinball games (such as Ni-Wumph).
  • Design your own architecture from scratch.

Frankly, none of those approaches sounded good to me. I don’t know enough about digital design (much less analog) to create my own hardware, I want to program DMD games, and I’m too big of a fan of higher level languages like Python and Ruby to want to spend all of my time worrying about managing code pages – plus it would be fun to have the option to fiddle with a functional language like Scheme or Erlang for pinball development.

Luckily for me, P-ROC fits the bill. You can imagine that I was pretty excited to find out about it several months ago. Having a few strong ideas about APIs and development tools, and coming from the perspective of someone who knew how he wanted to interact with this board, I made it my business to share my opinions on how the software should be organized. Several months later we have libpinproc, the C interface to P-ROC, and pyprocgame, a set of Python classes built on top of libpinproc that provides a very nice – if I do say so myself – framework for programming a pinball machine.

What pyprocgame Code Looks Like

Let’s just get right to some overly simplistic demo code:

class AttractMode(Mode):

  def sw_startButton_active(self, sw):
    self.game.coils.trough.pulse(20)

  def mode_started(self):
    self.game.lamps.startButton.schedule(schedule=0xff00ff00,
                                         cycle_seconds=0, now=False)

  def mode_stopped(self):
    self.game.lamps.startButton.disable()

You wouldn’t actually write code like this in a real game, but it demonstrates a few of the unique properties of pyprocgame:

  1. Mode class-based architecture. Essentially stackable state machines running in your game. A mode can be as simple as managing state for one shot, or as complex as an entire multiball mode. We leave it up to you.
  2. Easy handling of switch events. Define your switches in a common configuration file, then just add appropriately named methods to your Mode subclass. No need to define a constant, register a handler, etc. pyprocgame even makes more sophisticated switch events, like responding to a switch only after it has been in a certain state for a period of time, really simple: just define sw_startButton_inactive_for_500ms(sw). pyprocgame handles the busywork of this kind of stuff for you.
  3. Easy access to game hardware elements like coils, lamps, and switches. The game-specific configuration file you load sets everything up.
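
The naming-convention dispatch in point 2 is easy to picture. Here is a minimal Python sketch of the idea; it is purely an illustration of how switch names can map to handler methods, not pyprocgame’s actual implementation (the class and switch names here are hypothetical):

```python
class Mode:
    """Toy base class: dispatches switch events to methods by name."""

    def handle_switch(self, name, state):
        # A switch event like (startButton, active) looks for a method
        # named sw_startButton_active on the mode instance.
        handler = getattr(self, "sw_%s_%s" % (name, state), None)
        if handler is not None:
            return handler(name)
        return None


class AttractExample(Mode):
    def sw_startButton_active(self, sw):
        return "start pressed"


mode = AttractExample()
print(mode.handle_switch("startButton", "active"))  # start pressed
```

The real framework layers timing (the _for_500ms variants) and configuration-file lookup on top of the same basic convention.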

The reason you wouldn’t write code like the above in a real game is that in reality modern pinball machines are pretty sophisticated systems. The trough eject call, for example, would likely be much more complicated in a real game (Is there a ball in the trough to eject? Is there already a ball in the shooter lane?). That’s why pyprocgame includes several generic mode classes to help out. Ball search and ball trough helpers, drop targets, service mode, DMD score displays, high score entry, and so on. We’ve also coded up a nice set of support classes for working with the DMD: fonts, animated layers, etc. There’s even great support for setting up your own lamp shows.

You can learn about the details of programming pinball machines with pyprocgame here: pyprocgame Overview. It’s a work in progress, but a lot is there already. And of course, also on GitHub, is the entire source of libpinproc and pyprocgame.

As for example game code, start with Creature to get a taste for how a game is built. There’s also the sprawling Judge Dredd-based demonstration game which is frankly not a great reference. There’s a lot going on and it wasn’t written with the best practices in mind.

What’s Next

It’s been a very rewarding project to work on with Gerry; now that the P-ROC board is available for purchase (and some are already shipping) I’m excited to see what people come up with. It’s been a thrill for me to see what he’s been able to build off of (and add onto) pyprocgame. If you’ll pardon my back-patting one last time, it’s really remarkable how quickly the framework enables you to add useful components to your game. I don’t want to imply that pyprocgame makes pinball development easy, but I think it’s probably the easiest solution out there that gives you the flexibility to do whatever kind of game you want to do.

A number of homebrew pinball hobbyists out there seem to be looking for something fully self-contained, not requiring a full computer. They may be turned off by P-ROC initially, but I think that this is still a great platform if that’s your goal. There are a number of reasonably priced single-board computers available now (such as the BeagleBoard) that would probably make an excellent platform for running pyprocgame. Developing the software for a pinball machine is a big job (to say nothing of building the machine itself). I want to make it easy on myself during development and work on a desktop/laptop, then move to a more embedded platform when the project is ready for that.

Finally, I’m very interested to see if there are developers out there who have their own ideas about what a pinball development framework should look like. That’s why we provided the C libpinproc API: a basic toolkit that’s very close to the hardware, giving the framework developer a clean interface (without any assumptions) on which to build something bigger (and cooler).

I hope to post some video in a future installment (update: here); until then there’s this old drop target demo video.

Go check out the P-ROC site for more about the board, as well as forums.

Updated 2014/12/7: Fixed dead links.


Driving a Pinball DMD with an Arduino Microcontroller

December 6, 2009

For quite some time now I’ve been interested in developing my own pinball machine. Obviously this is a massive endeavor. Before I happened across the P-ROC project, over the course of several days in April of 2007 I did some work on driving a pinball dot matrix display (DMD) with an Arduino microcontroller. Here’s what I made; read on for the details of how.

How It Works

Before we get too technical, a brief disclaimer: I am a computer scientist, not a digital design engineer. I’m self-taught when it comes to practical digital I/O knowledge, so some of my terminology may be a bit fuzzy. Also, and very importantly, I cannot take any responsibility for the crazy things you do to your pinball (or computer) hardware. You’re working with a high voltage peripheral here and while I seem to have escaped any ill results, you may not be so lucky. Be sure to ground the Arduino to the pinball machine ground, and keep your finger near the on/off switch. You may need to cut the power a few times before you get it right!

Let’s summarize what’s in the video:

  • A dot matrix display in a Shadow pinball machine. The game is powered up and running, but the DMD’s ribbon cable has been detached from the display controller board and is attached to…
  • An Arduino microcontroller, strapped to a solderless breadboard and wired to the ribbon cable via the breadboard. I don’t recall exactly how it was wired up, but you can get an idea from the sketch source below. I do recall that there was at least a pull-down resistor on the Display Enable line. This is important because if the DMD is left drawing the same row for a long time I suspect it will burn the image in. (It shines extremely brightly in a rather bad sort of way.)
  • The Arduino is completely controlling the contents of the display (rudimentary pong in this case).
  • The Arduino is plugged into my PowerBook via USB for 5V power.
  • The DMD is powered by the pinball machine’s high voltage board. It would be possible to run this independent of the pinball machine if you had an appropriate DMD power supply.

The DMD is controlled by six digital I/O lines:

  • Display Enable
  • Row Data
  • Row Clock
  • Column Latch
  • Column Clock
  • Column Data

It’s essentially a serial interface (serial data via Row Data, clocked in with Row Clock), using the other I/Os to control the column position. The DMD works like a CRT. It has no onboard frame buffer; instead it has latches for one row’s worth of data. The Arduino must write these rows of data out constantly in order to maintain a picture. Here’s what the Arduino sketch I wrote does, in pseudocode:

for each row:
  for each 8 columns in this row:
    clock out dot data from bitmap at (column, row)
  latch in the row we just wrote
update bitmap

My initial implementation clocked out the dot data manually, but this ended up being much too slow. Thankfully the Arduino has a hardware SPI interface on pins 10-13 (although only the clock and data pins are needed). This allows you to rapidly clock out 8 bits of data with one call in Arduino C code (done by assigning to the SPDR variable/register, in UpdateDMD() in the sample code). So the actual code is more like:

for each row:
  for each 8 columns in this row:
    write out those 8 columns all at once via SPI
  latch in the row we just wrote
update bitmap

You can see the source code for my Arduino sketch here: DMDTest.c. It’s very much a sketch; hopefully you can glean the important parts. Again, UpdateDMD() has the important stuff. I derived the background image with the UNIX banner program and a Perl script to generate the bitmap array.
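
That banner-plus-script step is simple to reproduce. Here is a rough Python sketch of the idea (the original used a Perl script; this equivalent, and its function name, are mine): it turns ASCII art into rows of packed bytes ready to paste into a C bitmap array.

```python
def ascii_to_bitmap(lines, width):
    """Turn ASCII art (e.g. the output of `banner`) into rows of bytes
    packed MSB-first, ready to paste into a C bitmap array."""
    rows = []
    for line in lines:
        line = line.ljust(width)      # pad short lines to full width
        row = []
        for byte_index in range(width // 8):
            b = 0
            for bit in range(8):
                # any non-space character lights the dot
                b = (b << 1) | (0 if line[byte_index * 8 + bit] == " " else 1)
            row.append(b)
        rows.append(row)
    return rows

# A 16-dot-wide, 1-row example: dots at columns 0 and 8.
print(ascii_to_bitmap(["#       #"], 16))  # [[128, 128]]
```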


This project was a lot of fun to do, and very rewarding when I finally got it right. It took a lot of trial and error to get it working correctly. The question is, though, is this a practical means for driving your homebrew pinball machine’s display?

Frankly I suspect not. My experimentation showed that the Arduino I was using (ATMega 128, I believe) was just barely fast enough to do a reasonable job of keeping the picture bright enough and flicker-less. The DMD is very sensitive to the timing of its driver (the Arduino in this case), so it would prove very tricky (in my estimation) to a) drive the image consistently and b) do something else useful, like read any considerable amount of data off the serial lines (such as new frames), much less do something more involved like drive a lamp matrix, too. You may be much more of an embedded wizard than I am, however, and be able to figure it out.

There’s a very good reason that Williams et al had dedicated hardware to perform this function. Once I realized these limitations I began to imagine the ideal solution to be something like an FPGA driving the DMD off of some sort of shared memory setup. FPGA work is a bit beyond my skills at this point, so that was the end of the trail for me.

I did end up finding a solution that’s essentially done that, and more. It’s more expensive than an Arduino, but is a great fit for my interests (writing high level code on a PC to control a pin): P-ROC. Full disclosure: I have contributed a lot of open source software to this project in the form of pyprocgame and libpinproc. I hope to write a bit more about my work with P-ROC here in the future.

If you do end up implementing your own DMD controller, even on an Arduino, drop me a line about it!


Editing a UITableView: The Animated Insert Row

September 2, 2009

This post originally appeared on my Incomplete Labs blog on September 2nd, 2009.

An iPhone app I’m working on has your traditional table view with an edit button in the upper right. I wanted an “Add New Item” row to appear at the bottom of the table once the edit button was tapped (think iPhone contact editor), but I couldn’t find a succinct description of how that’s done. Here’s how I did it.

If you start out with a new UITableViewController subclass, Apple’s template code sets us up with what we need to enable the edit button (it’s commented out in -viewDidLoad):

self.navigationItem.rightBarButtonItem = self.editButtonItem;

If we uncomment this line the newfound edit button will animate our table into a listing of (apparently) deletable rows. What we need is a way to add a row to the end of the section that we want the user to be able to add to. Initially I tried doing this by checking for -isEditing inside -tableView:numberOfRowsInSection:, hoping that the table data would be automagically reloaded when Edit was tapped (it wasn’t):

- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section
{
    if ([tableView isEditing])
        return [items count] + 1;
    else
        return [items count];
}
I also added code to -tableView:cellForRowAtIndexPath: to provide a cell label for the extra row (“Add an item”), and implemented -tableView:editingStyleForRowAtIndexPath: to return UITableViewCellEditingStyleInsert as appropriate. Those steps are important, but they weren’t enough: my data source and delegate weren’t being queried for the updated information, and the table view wasn’t showing the new row when the edit button was pressed.

What we need is a way to tell the UITableView that there is a new row, and we need to tell it at the right time. I was tempted to look at setting up my own target/action on the Edit button, but since we’re subclassing UITableViewController the -setEditing:animated: method is a great place to do this. We can just override it in our own subclass, being sure to call the superclass’s implementation:

- (void)setEditing:(BOOL)editing animated:(BOOL)animated
{
    [super setEditing:editing animated:animated];

Then we call the table view’s -insertRowsAtIndexPaths:withRowAnimation: to tell the table view where the row is being added when we’re editing. There’s a complementary call, -deleteRowsAtIndexPaths:withRowAnimation:, which will work for when we aren’t editing any longer:

    NSArray *paths = @[ [NSIndexPath indexPathForRow:[items count] inSection:0] ];
    if (editing) {
        [[self tableView] insertRowsAtIndexPaths:paths
                                withRowAnimation:UITableViewRowAnimationFade];
    } else {
        [[self tableView] deleteRowsAtIndexPaths:paths
                                withRowAnimation:UITableViewRowAnimationFade];
    }
}

That’s about it. -insertRowsAtIndexPaths:withRowAnimation: and its complement are the key to getting the behavior we want, and we can use them in other scenarios to get smooth table view updates.


Making NSPoint/NSRect/NSSize Objects

January 9, 2008

Updated 2010/12/9 to improve the solution (see below). Updated 2016/8/25 to add Swift.

In Cocoa the common NSPoint/NSRect/NSSize/etc. data types are C structures. But what if you need to pass one as an id (an object), as I needed to do while implementing undo for a Cocoa application? The solution is NSValue, which serves as a basic wrapping/container object for these structures:

NSPoint point = NSMakePoint(x, y);
NSValue *pointValue = [NSValue valueWithPoint:point];
[undoManager registerUndoWithTarget:obj 
    selector:@selector(setPositionObject:) object:pointValue];

In Swift:

let point = NSPoint(x: ..., y: ...)
let pointValue = NSValue(point: point)
undoManager.registerUndo(withTarget: obj,
    selector: #selector(setPositionObject(_:)), object: pointValue)

On the receiving end you would extract the value like this:

- (void)setPositionObject:(NSValue *)positionValue {
    NSPoint position = positionValue.pointValue;
    // ...
}

And in Swift:

@objc func setPositionObject(_ positionValue: NSValue) {
    let position = positionValue.pointValue
    // ...
}

If you’re working on iOS, you’ll be using CGPoint instead: NSValue(CGPoint:) and CGPointValue().

Prior to December 9th, 2010 this post suggested that the following was your best approach (“a few extra steps, but it gets the job done,” I said). Thanks to Fred G for the (embarrassing) correction. I reproduce it here because it does demonstrate a useful technique but is hardly the most sane way to go about it.

NSPoint point = NSMakePoint(x, y);
NSValue *pointValue = [[[NSValue alloc] initWithBytes:&point
    objCType:@encode(NSPoint)] autorelease];
[undoManager registerUndoWithTarget:obj 
    selector:@selector(setPositionObject:) object:pointValue];

I suspect that this escaped me at the time because of Apple’s habit of separating documentation for category methods (which valueWithPoint: is) from the normal class documentation. Thanks for improving upon that, Apple.