Teaching coding is hot. Codecademy famously signed up New York’s mayor as a user (and recently roped in $10m of venture capital), and the popular online-teaching concern Khan Academy just launched a suite of programming lessons. Backlash followed: career programmers have scoffed at the idea of “coding as literacy,” while an academic study claimed that some people can code, and others simply can’t.
Bret Victor, a former interface designer for Apple, thinks they’re all wrong. In a devastating and persuasive interactive essay, he argues that DIY programming sites are mostly useless because of their opaque interfaces. (Using Khan’s online tutorials, he writes, is like trying to learn how to cook by stabbing at random buttons on an unlabeled microwave.)
But that’s only half the problem. Victor thinks that programming itself is broken. It’s often said that in order to code well, you have to be able to “think like a computer.” To Victor, this is absurdly backwards—and it’s the real reason why programming is seen as fundamentally “hard.” Computers are human tools: why can’t we control them on our terms, using techniques that come naturally to all of us?
The main problem with programming boils down to the fact that “the programmer has to imagine the execution of the program and never sees the data,” Victor told me. To illustrate, he offers another vivid cooking analogy: a chef forced to compose an entire dish in his head, never seeing or touching the food until it’s done.

Obviously, that would be a terrible way to teach a beginner how to cook. Victor’s bigger insight is that it’s a terrible way for an experienced chef to create new dishes, too. Chefs aren’t forced to perfectly simulate everything between “ingredients” and “new souffle” in their heads before touching any of the food; nor would they expect everything that happens between “assembling the ingredients” and “pulling the souffle out of the oven” to be concealed from them. Yet according to Victor, this is how much of the work of programming actually happens.
No wonder that programming seems like something that only “those with a freakish knack for manipulating abstract symbols” could ever do. We’ve set it up that way.
Victor’s essay is dense with interactive examples of how to fix this (they’re not embeddable, unfortunately, so definitely click through to check them out): from redesigning IDEs so that they visualize the flow of data from one instruction to the next, to rethinking the basic metaphors that undergird programming languages themselves. (Spoiler alert: the 45-year-old Logo gets a lot more love than Processing.)
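To get a feel for the “show the data” idea, here is a minimal sketch (my own toy, not a tool from Victor’s essay): a hypothetical `show_the_data` helper that runs a function and prints every local variable after each line executes, so the programmer watches the data change instead of simulating it in their head.

```python
import sys

def show_the_data(func, *args):
    """Run func, printing its local variables after every line.

    A toy nod to Victor's "show the data" principle -- real tools
    would visualize this, not dump it to the console.
    """
    def tracer(frame, event, arg):
        # Only report line events inside the function we care about.
        if event == "line" and frame.f_code is func.__code__:
            print(f"line {frame.f_lineno}: {dict(frame.f_locals)}")
        return tracer
    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)  # always restore normal execution
    return result

def mean(xs):
    total = 0
    for x in xs:
        total += x
    return total / len(xs)

show_the_data(mean, [2, 4, 6])
```

Running this prints a line-by-line trace of `total` and `x` as they evolve, which is exactly the information a programmer normally has to reconstruct mentally.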
But to what end? Is programming really all that screwed up? So what if “us normals” never really get the hang of it—aren’t the current approaches to learning and writing code adequate to produce enough programmers to keep the world turning?
Well, yes. But only if you subscribe to the “programmers as plumbers” analogy, which Victor emphatically does not. “Bill Atkinson (creator of HyperCard) wanted creating software to be like drawing or writing – something that everyone could do,” Victor says. “We expect everyone to be able to write a letter. The ability to create your own software for your own uses is very powerful.”
Victor isn’t just talking about rolling your own Twitter client. What’s truly powerful is the ability to build what he calls “explorable explanations”: interactive simulations of your own ideas and arguments—or those of others. What if the next op-ed you read about global warming or nuclear safety included a built-in model for examining its claims? This is literally programming as literacy—that is, an amplification of thinking—with the potential to be made available to anyone. But only if we step outside the corners we’ve painted ourselves into about what programming is, how to do it, and how to teach it.
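A minimal sketch of what an explorable claim might look like (an illustrative toy, not from Victor’s essay): instead of an op-ed asserting “3% annual growth doubles a quantity in about 23 years,” the author hands the reader the model itself, and the reader re-runs it with whatever growth rate they want to test.

```python
import math

def doubling_time(growth_rate_pct):
    """Years for a quantity growing at growth_rate_pct percent per year
    to double (standard compound-growth arithmetic)."""
    return math.log(2) / math.log(1 + growth_rate_pct / 100)

# The "explorable" part: readers swap in their own rates instead of
# taking the author's single headline number on faith.
for rate in (1, 2, 3, 7):
    print(f"{rate}% per year -> doubles in {doubling_time(rate):.1f} years")
```

In Victor’s vision the reader would drag a slider rather than edit a tuple, but the underlying shift is the same: the argument ships with its model attached.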
Still, millions of developers aren’t suddenly going to rethink how they do their jobs; millions of tools, apps, programming languages and IDEs can’t be redesigned overnight. Victor says that’s all the more reason to start chipping away at the foundations now. “Thomas Kuhn, author of The Structure of Scientific Revolutions, suggested that when a paradigm shift occurs, the old scientists never actually adopt the new way of thinking,” Victor says. “They just retire, and are replaced by a new generation who thinks in the new way.”