After inventing calculus, actuarial tables, and the mechanical calculator and coining the phrase “best of all possible worlds,” Gottfried Leibniz still felt his life’s work was incomplete. Since boyhood, the 17th-century polymath had dreamed of creating what he called a characteristica universalis—a language that perfectly represented all scientific truths and would render making new discoveries as easy as writing grammatically correct sentences. This “alphabet of human thought” would leave no room for falsehoods or ambiguity, and Leibniz would work on it until the end of his life.
A version of Leibniz’s dream lives on today in programming languages. They don’t represent the totality of the physical and philosophical universe, but instead, the next best thing—the ever-flipping ones and zeroes that make up a computer’s internal state (binary, another Leibniz invention). Computer scientists brave or crazy enough to build new languages chase their own characteristica universalis, a system that could allow developers to write code so expressive that it leaves no dark corners for bugs to hide and so self-evident that comments, documentation, and unit tests become unnecessary.
But expressiveness, of course, is as much about personal taste as it is information theory. For me, just as listening to Countdown to Ecstasy as a teenager cemented a lifelong affinity for Steely Dan, my taste in programming languages was shaped the most by the first one I learned on my own—Objective-C.
To argue that Objective-C resembles a metaphysically divine language, or even a good language, is like saying Shakespeare is best appreciated in Pig Latin. Objective-C is, at best, polarizing. Ridiculed for its unrelenting verbosity and peculiar square brackets, it is used only for building Mac and iPhone apps and would have faded into obscurity in the early 1990s had it not been for an unlikely quirk of history. Nevertheless, in my time working as a software engineer in San Francisco in the early 2010s, I repeatedly found myself at dive bars in SoMa or in the comments of Hacker News defending its most cumbersome design choices.
Objective-C came to me when I needed it most. I was a rising college senior and had discovered an interest in computer science too late to major in it. As an adult old enough to drink, I watched teenagers run circles around me in entry-level software engineering classes. Smartphones were just starting to proliferate, but I realized my school didn’t offer any mobile development classes—I had found a niche. I learned Objective-C that summer from a cowboy-themed book series titled The Big Nerd Ranch. The first time I wrote code on a big screen and saw it light up pixels on the small screen in my hand, I fell hard for Objective-C. It made me feel the intoxicating power of unlimited self-expression and let me believe I could create whatever I might imagine. I had stumbled across a truly universal language and loved everything about it—until I didn’t.
Twist of Fate
Objective-C came up in the frenzied early days of the object-oriented programming era, and by all accounts, it should have never survived past it. By the 1980s, software projects had grown too large for one person, or even one team, to develop alone. To make collaboration easier, Xerox PARC computer scientist Alan Kay had pioneered object-oriented programming—a paradigm that organized code into reusable “objects” that interact by sending each other “messages.” For instance, a programmer could build a Timer object that could receive messages like start, stop, and readTime. These objects could then be reused across different software programs. In the 1980s, excitement about object-oriented programming was so high that a new language was coming out every few months, and computer scientists argued that we were on the precipice of a “software industrial revolution.”
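In Objective-C terms, such an object might look something like the sketch below. The Timer class, its instance variables, and the way it keeps time are hypothetical illustrations rather than code from any real library; only NSObject, NSDate, NSTimeInterval, and NSLog come from Apple’s Foundation framework.

    // A hypothetical Timer object. Other code interacts with it only by sending
    // messages; how it keeps time internally stays hidden.
    #import <Foundation/Foundation.h>

    @interface Timer : NSObject
    - (void)start;
    - (void)stop;
    - (NSTimeInterval)readTime;   // seconds accumulated while running
    @end

    @implementation Timer {
        NSDate *_startDate;           // private state, invisible to other objects
        NSTimeInterval _elapsed;
    }
    - (void)start { _startDate = [NSDate date]; }
    - (void)stop {
        _elapsed += [[NSDate date] timeIntervalSinceDate:_startDate];
        _startDate = nil;
    }
    - (NSTimeInterval)readTime { return _elapsed; }
    @end

    int main(void) {
        @autoreleasepool {
            Timer *timer = [[Timer alloc] init];   // any program can reuse the object
            [timer start];
            [timer stop];
            NSLog(@"%f seconds elapsed", [timer readTime]);
        }
        return 0;
    }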
In 1983, Tom Love and Brad Cox, software engineers at International Telephone & Telegraph, combined object-oriented programming with the popular, readable syntax of the C programming language to create Objective-C. The pair started a short-lived company to license the language and sell libraries of objects, and before it went belly up they landed the client that would save their creation from falling into obscurity: NeXT, the computer firm Steve Jobs founded after his ouster from Apple. When Jobs triumphantly returned to Apple in 1997, he brought NeXT’s operating system—and Objective-C—with him. For the next 17 years, Cox and Love’s creation would power the products of the most influential technology company in the world.
I became acquainted with Objective-C a decade and a half later. I saw how objects and messages take on a sentence-like structure, punctuated by square brackets, like [self.timer increaseByNumberOfSeconds:60]. These were not curt, Hemingwayesque sentences, but long, floral, Proustian ones, syntactically complex and evoking vivid imagery with function names like scrollViewDidEndDragging:willDecelerate:.
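Written out in full, that delegate method looks something like this; the method name and its parameters are Apple’s own UIKit API, while the view controller around it and what it does inside are my own illustration.

    #import <UIKit/UIKit.h>

    // A hypothetical view controller adopting a real UIKit protocol.
    @interface StoryListViewController : UIViewController <UIScrollViewDelegate>
    @end

    @implementation StoryListViewController
    // The selector reads like a sentence: the scroll view did end dragging,
    // and here is whether it will keep decelerating.
    - (void)scrollViewDidEndDragging:(UIScrollView *)scrollView
                      willDecelerate:(BOOL)decelerate {
        if (!decelerate) {
            NSLog(@"The scroll view has come to rest.");
        }
    }
    @end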
Objective-C’s objects, meanwhile, were adorned with all-caps prefixes that proudly identified their creator. Some bore household names, like the button to log in to another service with your Twitter account (TWTRLogInButton) or the add-friends-from-Facebook pop-up (FBFriendPickerViewController). By the time I learned Objective-C, NeXT hadn’t existed for over 15 years, but code from its NeXTSTEP operating system was so ingrained in Apple’s products that its prefix appeared in dozens of objects and functions I used every day—NSDictionary, NSArray, NSString, NSLog.
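A few unremarkable lines of everyday code, of the sort that would sit inside any function I wrote, show how thoroughly the prefix saturated things; the values here are invented, but the NS-prefixed classes are real Foundation types inherited from NeXTSTEP.

    // Ordinary Foundation code, stamped everywhere with the NS prefix.
    NSString *greeting = @"Hello, world";
    NSArray *numbers = @[@1, @2, @3];
    NSDictionary *lookup = @{ @"greeting": greeting, @"numbers": numbers };
    NSLog(@"%@", lookup);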
Objective-C is wordy—arguably excessively so—and this proclivity soon crept into my own outlook. How could an engineer tell a computer exactly what to do without using lots of words? How could a language be universally expressive without being maximally specific? Objective-C’s loquaciousness was not outdated—it was an ethos worth striving for, no matter how much it hurt my wrists.
The Aging Giant
The first and only software engineering job I had (before eventually leaving for the squishier world of technology policy) was developing iPhone apps for an Aging Giant of Silicon Valley. The company had been white-hot shortly after the dial-up internet era but had missed several tech booms since then, and in 2013 it was determined not to miss the latest craze: mobile apps.
The app I worked on was only a few years old, but already its codebase told the company’s whole history with unflinching honesty in rambling lines of Objective-C prose. Distinct prefixes gave away which code had been inherited from acquired startups and revealed a bitter conflict over switching analytics platforms. Ornate function names told of product pivots and the defunct pop-up screens they left behind.
But the longer I spent writing Objective-C, the more I felt it hid rather than revealed. Long, sentence-like function names buried the most pertinent information under a fog of dependent clauses. Small features required long-winded pull requests, making it easy for engineers to get distracted during reviews and to miss bugs. Objective-C’s excess words, multiplied across thousands of files and millions of lines of code, made for an exhausting codebase.
Soon enough, my affection for Objective-C’s “more is more” theory of self-expression disappeared completely. As the codebase expanded, its web of objects grew into a tangled thicket of convoluted relationships that bred mysterious, untraceable superbugs. The buzz of messages between objects rose to a cacophony, and the bulk of my job became figuring out what object sending what message to whom made the app crash or the goddamn settings screen look so ugly.
Barely a year and a half into writing Objective-C professionally, I was already having a crisis of faith. I became a software engineer to chase the exhilarating power of turning words into images on a screen, but those words had gone from empowering to burdensome. Even Objective-C’s prefixes, which I once felt told an enchanting story, felt decadent—why did I have to type “NS” hundreds of times a day to pay homage to Steve Jobs’ long-defunct startup? I was not alone: Mac and iPhone developers everywhere were frustrated with being forced to use this ancient, prattling language. Apple, as it turns out, was ready for change, too. I, however, was not.
Death and Rebirth
Leibniz first wrote about the characteristica universalis in his doctoral thesis when he was 19 and worked on it for some 50 years, until shortly before his death at 70. He reimagined the idea of an “alphabet of human thought” countless times, taking inspiration from mathematics, symbolic logic, hieroglyphics, musical notes, astronomical signs, and the four elements (earth, air, fire, and water). As his knowledge of the physical and metaphysical worlds grew, Leibniz had to continually reconceptualize what it meant to build a system that perfectly reflected the universe.
Programmers, in their pursuit of ever more expressive and efficient code, undergo similar rebirths. When the shortcomings of a particular coding language become clear, a new reactionary language hyper-fixated on solving those problems rises, until it too becomes hegemonic, and the cycle continues. The accepted tenets of what makes for expressive code evolve and change alongside technological advancements, leading programmers to become linguistic nomads.
The end came for Objective-C in June of 2014, when Apple announced the debut of Swift, a new programming language for building native apps on all Apple products. Swift did away with what iPhone and Mac developers hated the most about Objective-C: No more square brackets! No more NS! Short, declarative code for all! Although still object-oriented (the cult of functional programming was yet to assert its dominance), Swift had a philosophy of self-expression that ran directly counter to Objective-C’s: verbosity hides meaning, concision reveals truth.
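To give a sense of the shift, here is the sort of line Objective-C developers had typed for years, with a rough Swift equivalent shown as a comment for contrast; both snippets are illustrative rather than drawn from any particular codebase.

    // Objective-C: square brackets, an NS-prefixed container, verbose method names.
    NSMutableArray *reminders = [NSMutableArray arrayWithObject:@"Buy milk"];
    [reminders addObject:@"Call mom"];

    // Roughly the same thing in Swift:
    //   var reminders = ["Buy milk"]
    //   reminders.append("Call mom")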
Despite my growing distaste for Objective-C, the idea of learning a new language failed to excite me, and I knew that my days as a software engineer were numbered. The job of the programmer, I learned, is to forever chase their characteristica universalis, despite knowing it will always elude them, just as it did Leibniz. I wanted to chase other things, and would write Objective-C until the end of my software engineering days.
Before I quit my job at the Aging Giant and returned to school, a recent computer science graduate joined my team. He had spent the summer learning Swift and was eager to rewrite our codebase from scratch. Wearing a hoodie that had not yet been stained, he saw Swift as a divine language, clean in its communication. He had just so happened to stumble upon a universal form of expression, and he could do anything.