We are awash in digital light

In a book by Pixar co-founder Alvy Ray Smith, the humble pixel gets the attention it deserves.

October 27, 2021
Illustration: Andrea Daquino

The computer scientist Alvy Ray Smith cofounded both Lucasfilm’s computer graphics division and Pixar Animation Studios. For those achievements alone, he is one of the most important technological innovators in cinema since at least the end of the Second World War. But Smith is not a Hollywood guy, and his intriguing, foundational new book A Biography of the Pixel is not a Tinseltown book. There are only the slightest morsels of gossip (Steve Jobs was a difficult man to work with—confirmed!), and the only marquee celebrity who appears in Smith’s story with any frequency is George Lucas. Smith isn’t interested in fame. He’s chasing more profound themes, arguing in effect that the great project he was part of—the invention and development of computer graphics—is far more important than anything that ever happened in Hollywood.

Smith is what used to be called a “graybeard” in computer programming circles. He’s from that generation of engineers and coders who watched the digital age rise from the swamps of secret military projects and the space program to conquer the world. He has spoken machine language. He marveled at the first crude graphics to exhibit motion on green-and-black screens. And he was among the first to demonstrate the newfound ability of a stylus to trace a smooth curve of digital “paint.”

In A Biography of the Pixel, Smith’s aim is to set down clearly the trajectory of two important, intertwined stories. The first story is the development of computer images, from origin to digital ubiquity. There are, in Smith’s telling, many names, places, and breakthroughs missing from the record, and he has taken on the job of adding them back in with an engineer’s eye for precision. The second story, unfolding in parallel, is about the impact of those images—a transformative force Smith calls “Digital Light.” It encompasses basically everything we experience through screens, and he argues convincingly that it is among the most important innovations in human communication since the first simple depictions of daily life were etched on the walls of caves.

The humble pixel

As Smith demonstrates repeatedly, far too much credit has been ceded to the supposed wizardry of individual geniuses. The reality is a muddy, overlapping history of groups of inventors, working by turns in competition and in collaboration, often ad hoc and under considerable commercial or political pressure.

Thomas Edison and France’s Lumière brothers, for example, were great promoters and exploiters of early film technology. Both exhibited full systems circa 1895 and were happy to claim full credit, but neither built the first complete system of camera, film, and projector all (or even mostly) on their own. The real answer to the question of who invented movies, Smith writes, is a “briar patch” of competing lineages, with parts of the system developed by erstwhile partners of Edison’s and similar parts by a handful of French inventors who worked with the Lumières. 

Among the crucial figures relegated to history’s dustbin were William Kennedy Laurie Dickson (an odd European aristocrat who designed and built the first movie camera for Edison) and Georges Demenÿ (whose design was copied without credit by the Lumières). Smith shows perhaps too much of his exhaustive work in rescuing these convoluted origin stories—there are similarly tangled muddles at every major stage in the development of computers and graphics—but his effort to set the historical record straight is admirable. 

The main drawback of all this wrangling with the egos and avarice of several generations of forceful men (they are, alas, virtually all men) is that it sometimes distracts Smith’s focus from his larger theme, which is that the dawn of Digital Light represents such a rare shift in how people live that it deserves to be described as epochal. 

Digital Light, in Smith’s simplest definition, is “any picture composed of pixels.” But that technical phrase understates the full import of the “vast new realm of imagination” that has been created by its rise. That realm encompasses Pixar movies, yes, but also video games, smartphone apps, laptop operating systems, goofy GIFs traded via social media, deadly serious MRI images reviewed by oncologists, the touch screens at the local grocery store, and the digital models used to plan Mars missions that then send back yet more Digital Light in the form of jaw-dropping images of the Red Planet’s surface. 

And that barely begins to cover it all. One striking aspect of Smith’s book is that it invites us to step back from the constant flow of pixels that many of us spend most of our waking hours gazing at, just far enough to see what a towering technological achievement and powerful cultural force all this Digital Light represents.

The technological breakthrough that made all this possible is, as Smith’s title suggests, the humble pixel. The word itself is a portmanteau of “picture element.” Simple enough. But the pixel has been mischaracterized in popular usage to refer to the blurry, blocky supposed inferiority of poorly rendered digital images. Smith wants us to understand that it is, rather, the building block of all Digital Light—a miraculous, impossibly varied, endlessly replicable piece of information technology that has literally changed how we see the world. 

The misunderstanding begins, Smith explains, with the fact that a pixel is not a square, and it is not arranged alongside other pixels on a neat grid. Pixels can be rendered on displays as such, but the pixel itself is “a sample of a visual field ... that has been digitized into bits.” The distinction might sound esoteric, but it’s crucial to Smith’s argument for the pixel’s revolutionary impact. The pixel is stored information that any device can display as Digital Light. And digital devices can do this because pixels are not approximations but carefully calibrated samples of a visual field, which has been translated for digital uses into a collection of overlapping waves. These pixels, Smith writes, are not reductions of the visual field so much as “an extremely clever repackaging of infinity.”

The new wave

The process by which a pixel generates Digital Light—whether in the form of words on a screen or an icon on a smartphone or a Pixar movie on the big screen—is built on three mathematical breakthroughs that predate the modern computer. The first of these was achieved by Jean-Baptiste Joseph Fourier, a French mathematician who served as a regional governor under Napoleon in the early 1800s. Fourier contributed the foundational insight that not just sound but heat, everything we see, and much else could be described as the sum of a series of waves of various frequencies and amplitudes. Or, as Smith more poetically phrases it, “The world is music. It’s all waves.”
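To make Fourier’s insight concrete, here is a minimal Python sketch; the signal, its two frequencies, and their amplitudes are all invented for illustration. It builds a signal out of two sine waves, then uses a discrete Fourier transform to recover exactly those waves:

```python
import numpy as np

# Build a "signal" out of two pure waves: 3 Hz and 7 Hz,
# with amplitudes 1.0 and 0.5 (values chosen for illustration).
rate = 100                      # samples per second
t = np.arange(0, 1, 1 / rate)   # one second of time
signal = 1.0 * np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

# The discrete Fourier transform re-describes the signal as a
# sum of waves, each with its own frequency and amplitude.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / rate)
amplitudes = 2 * np.abs(spectrum) / len(signal)

# Report the waves the signal is made of.
for f, a in zip(freqs, amplitudes):
    if a > 0.01:
        print(f"{f:.0f} Hz wave, amplitude {a:.2f}")
# Prints: 3 Hz wave, amplitude 1.00
#         7 Hz wave, amplitude 0.50
```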

More than a century later, a Soviet engineer named Vladimir Kotelnikov built on Fourier’s wave principle with the second crucial element for creating Digital Light—the “Sampling Theorem.” Kotelnikov demonstrated that a signal—be it a piece of music or a visual scene—can be captured by taking snapshots (“samples”) at regular intervals, provided the samples come more than twice per cycle of the highest frequency the signal contains. Take enough samples of some aspect of a visual field—its gradation of color, for example, or its shifts from foreground to background—and it is possible to reconstitute the entirety of the information. Smith acknowledges that American computer scientists are taught that the sampling theorem originates with Harry Nyquist and Claude Shannon, but “the great idea ... was first clearly, cleanly and completely stated by Kotelnikov in 1933.”
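Here is a minimal Python sketch of the theorem at work, assuming a band-limited signal; every value in it is invented for illustration. Sample the signal often enough, and an instant that was never sampled can be reconstructed rather than approximated:

```python
import numpy as np

# A band-limited "signal": its highest frequency is 3 Hz.
def scene(t):
    return np.sin(2 * np.pi * 1 * t) + 0.4 * np.sin(2 * np.pi * 3 * t)

# Sample above the Nyquist rate: more than 2 * 3 = 6 samples/second.
rate = 8
n = np.arange(0, 8 * rate)        # eight seconds' worth of samples
samples = scene(n / rate)

# Whittaker-Shannon reconstruction: each sample contributes a
# sinc-shaped wave, and their sum rebuilds the original signal.
def reconstruct(t):
    return np.sum(samples * np.sinc(t * rate - n))

t = 3.37                          # an instant we never sampled
print(scene(t), reconstruct(t))   # nearly identical, up to edge
                                  # effects of the finite window
```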

The third element that made Digital Light possible is the best known and most recently developed: Alan Turing’s 1936 paper outlining the universal computing machine, whose great innovation was the ability to execute any systematic process as long as it has the right accompanying set of instructions (which we now call software). A Turing machine, the basis of the modern computer, can be programmed to understand the process by which Fourier’s waves had been sampled by Kotelnikov’s theorem, and to reproduce them on any other Turing machine. These three elements together begat Digital Light.
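Smith keeps the mechanics offstage, but a toy simulator gives the flavor of Turing’s idea: one general machine that carries out any systematic process, given a table of instructions. This Python sketch, including its instruction table for adding 1 to a binary number, is invented purely for illustration:

```python
# A toy Turing machine: a tape, a head, a state, and a table of
# instructions mapping (state, symbol) -> (write, move, next state).
def run(tape, state, rules, head=0):
    tape = dict(enumerate(tape))        # sparse, unbounded tape
    while state != "halt":
        symbol = tape.get(head, " ")    # blank cells read as " "
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += {"L": -1, "R": 1}[move]
    return "".join(tape[i] for i in sorted(tape)).strip()

# One possible instruction table: add 1 to a binary number, starting
# with the head on the rightmost digit. Flip trailing 1s to 0s, then
# turn the first 0 (or blank cell) into a 1.
increment = {
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "R", "halt"),
    ("carry", " "): ("1", "R", "halt"),
}

print(run("1011", "carry", increment, head=3))  # prints 1100
```

Swap in a different table of instructions and the same machine computes something else entirely; that interchangeability is the innovation Smith is pointing at.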

Digital Light on its own, though, was a limited force. Its earliest manifestations were simple pictographs on the digital cave wall of a TV screen. In December 1951, for example, MIT’s Whirlwind computer displayed an array of white dots on a black screen for the CBS program See It Now, hosted by Edward R. Murrow. The dots spelled out “Hello Mr. Murrow,” slowly fading and then brightening again, like a Lite-Brite on a dimmer switch. Clever, even wondrous for its time, but not the upheaval at the core of Smith’s book. For that, Digital Light needed one more element: unimaginable speed.

Computer graphics, Smith explains, are just crazily long lists of numbers that correspond to graphical coordinates—pixels, nowadays, but thousands and thousands of tiny interlocking triangles in the earliest manifestations—assembled in digital space into the three-dimensional form of a Pixar cartoon character or anything else. (The most famous of those early 3D models was a teapot: the Utah teapot, a standard test object in computer graphics ever since.)
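As a rough illustration of geometry-as-numbers, here is a crude Python rasterizer; the triangle’s coordinates and the character-grid “display” are invented for this sketch. For each pixel center, it simply tests whether that point falls inside the triangle:

```python
# A "model" is just numbers: one triangle's vertex coordinates.
# (A real scene would hold thousands of such triangles.)
tri = [(1.0, 1.0), (10.0, 2.0), (5.0, 9.0)]

def edge(a, b, p):
    # Signed-area test: which side of the edge a->b is p on?
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def inside(p):
    a, b, c = tri
    # Inside means "on the same side of all three edges."
    return edge(a, b, p) >= 0 and edge(b, c, p) >= 0 and edge(c, a, p) >= 0

# Rasterize: sample the triangle at every pixel center.
for y in range(10, -1, -1):
    print("".join("#" if inside((x + 0.5, y + 0.5)) else "." for x in range(12)))
```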

The great digital convergence

Such wonders as 3D animation, however, weren’t possible until computer processing power exploded. Smith recounts the ensuing transformation with an engaging mix of technical detail, deep research, and personal recollection. Several generations of mathematicians, coders, and lab rats contributed to the development of computer graphics, building new tools and machines as Moore’s Law rapidly made it easier to turn Fourier’s waves and Kotelnikov’s samples into geometric shapes, simple pictures, and basic motion on a screen. Disney and Lucasfilm and Stanford University loom large, of course, but so do NASA and General Motors and Boeing (which pioneered computer-aided industrial design), as well as lesser-known hives of computer graphics genius like the University of Utah and the New York Institute of Technology (NYIT).

Smith’s own transition from simple pixels to digital movies started at NYIT in the early 1970s. There, he helped establish one of the world’s first computer graphics labs, along with several of the other cofounders of Pixar, before moving on to introduce the technology to Lucasfilm. (He worked on the very first piece of computer animation Lucasfilm produced, a special-effects sequence for the movie Star Trek II: The Wrath of Khan.)

Throughout the journey, Smith remained focused on the ultimate prize of producing a full-length digital movie. He wanted these tools to be used to create great art, to give form to the creative genius of minds the world over. Pixar achieved that goal with the 1995 release of Toy Story, the first feature-length film to be completely computer animated. And not long after that, an even more momentous achievement was reached—the pivotal moment Smith calls “the Great Digital Convergence.”

This is the point, sometime around the year 2000, when all pictures (moving and otherwise) could be universally represented by pixels. “Quietly and unremarked,” he writes, “all media types converged into one—the universal digital medium, bits.”

Reading Smith’s account of this convergence, I found myself thinking of a famous quote attributed to the French writer and filmmaker Jean Cocteau. “Film will only become an art,” Cocteau said, “when its materials are as inexpensive as a pencil and paper.” This, in part, is what Smith is driving at when he asks us to look in awe upon the power of the pixel. And that recollection led me—inexorably, really—to thinking about the “Steamed Hams” meme. 

For the uninitiated, Steamed Hams was born as a short vignette in an episode from the seventh season of The Simpsons, “22 Short Films About Springfield,” which first aired in 1996: Springfield Elementary School’s dorky Principal Skinner hosts his boss, Superintendent Chalmers, for a luncheon at his home. The two minutes and 42 seconds of the vignette unfold as an escalating series of minor disasters, leading Skinner to sneak off to Krusty Burger and then claim the fast-food meal as his own. Having promised the superintendent steamed clams, Skinner covers for his ruse by claiming that he had actually said he was making “steamed hams,” which he suggests is regional slang for hamburgers in upstate New York.

It’s a silly little snippet from an offbeat Simpsons episode, and it earned no particular attention until the Great Digital Convergence placed the tools of digital filmmaking in the hands of virtually anyone with a computer and an internet connection. And then what happened is both easy to explain and hard to fully comprehend. What happened was people started messing around with it. 

The creative force unleashed

The birth of the Steamed Hams meme appears to have been a short clip from the vignette, reproduced using a text-to-movie app and posted to YouTube in March 2010. In the years since then, as the digital tools for producing and disseminating short videos improved at the breakneck speed of Moore’s Law, the meme metastasized wildly. The clip has been piled upon itself, the YouTube screen divided into 10 boxes, each playing Steamed Hams on a short delay, as if being sung in a round, until it dissolves into roaring cacophony. It has been set to a wide range of pop songs—my favorite is one in which Auto-Tune software (itself a product of the convergence) has been used to bend and morph the dialogue so that it somehow sticks to the melody of Green Day’s hit song “Basket Case.” It has been layered over various video games. One enterprising Steamed Hams fan persuaded actor Jeff Goldblum to read the entire vignette’s script in his distinctive diction; the resulting YouTube clip cuts expertly from Goldblum’s live reading to the original animation, sometimes in split-screen. There is a sort of remake of Steamed Hams in which a different animator renders every 13 seconds of the vignette in an entirely different style. This is only a small sample of the highlights. The meme is massive. 

If I were ever asked to teach a class in postmodern art, I would hold up the entire meme as a signature example of the staggering creative force unleashed by the Great Digital Convergence. Thanks to tools not much harder to obtain than a pencil and paper, the internet now hosts an impossible abundance of inventive riffs: GIFs and clips, supercuts and mashups, reboots and remixes. A whole world of casual creators is making digital movies with brand-new tools that have already become so commonplace we barely notice them. We are at home in Digital Light.

Cocteau’s world of ubiquitous cinematic creation, that is, may very well be here. This is what Alvy Ray Smith was building toward for half a century in pursuit of that first digital movie. We’ve arrived. We are all auteurs. Go play.

Chris Turner is an author and essayist based in Calgary, Alberta. His most recent book is The Patch: The People, Pipelines, and Politics of the Oilsands.
