Mark Cerny’s soft voice and youthful looks belie the position of power he holds in the video-game industry. The 49-year-old Californian is the lead architect of Sony’s PlayStation 4, the company’s forthcoming video-game console-cum-entertainment hub, which is destined to arrive in millions of living rooms around the globe this winter. As such, he is partly responsible for defining the next generation of video-game consoles and shaping the broader influence of these increasingly pervasive devices. It is a unique challenge in technological design. Unlike PCs, smartphones, or televisions, new video-game consoles launch only intermittently, every seven years or so. The design must be robust enough to remain relevant in a rapidly shifting technological landscape over an extended period.
Finding the right balance is a high-risk game: at launch, the PlayStation 4 will go up against Microsoft’s Xbox One, its principal rival, which is also slated for release in December. The stakes for both companies extend beyond video games. Sony and Microsoft each harbor an ambition to “control” the living room via their machines, which will act not only as game consoles but also as central hubs through which households access television shows, movies, sports, and music. Microsoft is eager to stress the Xbox One’s multimedia capabilities, dubbing the system an “all-in-one entertainment system” to rival the Apple TV and Google TV platforms. But “play” remains at the heart of the PlayStation brand, and Sony believes that the quality and quantity of the system’s games will ultimately win this war.
How does one create the blueprints for a system that can last the distance without becoming outdated? How do you build an architecture that is straightforward enough for third parties to create games with, but also innovative enough to facilitate bold, eye-catching invention? For Sony, whose three previous PlayStation systems have sold an estimated 335 million units, these are multimillion-dollar questions—and the Japanese company has tasked Cerny with answering them. His approach is shaped by a deep passion for innovative games, and by his experience making simple but addictive arcade games.
Cerny’s talent for programming surfaced early. At age five he taught himself to code on a CDC 6400 mainframe computer at the University of California at Berkeley, where his father worked as a lecturer in nuclear chemistry. At 13 he began to audit math and physics classes at the university, and at 16 he enrolled full-time. “I was quite a good student, but I was bored,” he says.
As well as a talent for programming, Cerny had a talent for arcade games, the new and vibrant industry launched when Atari founder Nolan Bushnell installed his first arcade cabinet, Computer Space, in the Dutch Goose bar near Stanford University in 1971. When Cerny saw Space Invaders in a local arcade in 1978, he was immediately entranced and worked to become “one of the best players in the United States at that time.” This skill brought Cerny to the attention of the author Craig Kubey, who in 1982 was researching a book of arcade game tips and interviews. “He was touring the arcades looking for hotshot players, visiting game companies and interviewing game creators,” explains Cerny. “I was looking for a way to turn my hobbies into a job, and Kubey agreed to mention me to Atari during one of his interviews.” Kubey was true to his word, and within weeks, Cerny was invited for an interview. At just 17 he joined Atari as one of the company’s 15 star programmers—the only employees responsible for both code and game design.
Cerny’s was a family of high-achieving academics. Both of his parents and his brother have PhDs, as do four of his stepsiblings. In that environment, quitting education to make video games at 17 was akin to running away with the circus. “Certainly everyone would have liked to see me complete my higher education,” he says. “But I only thought I’d be at Atari for a year, gaining some experience. It was seven years before I realized I wasn’t going back to college. My family eventually came to terms with it when it became clear I could make enough money in games to support myself.”
Cerny cut his teeth on the game Major Havoc, and at 18 he was given carte blanche to create his own game. “They sat me down and told me to figure out what game I wanted to make and what hardware it would need to run on,” he says. “I was told that if it needed some artwork, they could probably spare somebody for a couple of days. But it was pretty much one person per project.” Cerny’s interest in emerging technology—the same interest that marked him out to Sony as the ideal candidate to design PlayStation 4 three decades later—was evident in his first idea. “Marble Madness started life as miniature golf played via a touch screen,” he explains. “Then we added a trackball that people could roll with their hand to directly control the marble. Initially it was a motorized trackball, but the costs proved prohibitive.”
The latest PlayStation 4 controller shows efforts at interface innovation: it includes a small touchpad as well as more sensitive motion sensors, allowing new ways to play games. The Xbox One, of course, comes with Microsoft’s Kinect, a hands-free motion-sensing device.
When Atari games were 80 percent complete, one or two cabinets would be installed in local bars for live play-testing. “We’d watch people play the game in secret, see if it was too hard or too easy,” he recalls. If a game didn’t prove popular enough, it was canceled at this point; two out of every three games didn’t make it. Marble Madness, however, became one of the smash arcade hits of the mid-1980s.
Flush with success, Cerny quit Atari to start work on his own games as an independent developer. But working simultaneously on the hardware and software proved tremendously time-consuming for one man. After 18 months, he dropped the project and moved to Japan to become a contractor for Sega, creating games for its Master System console. “It was like night and day,” he says of the change in corporate culture. “At Atari it was all about creativity; if the concept wasn’t 100 percent original, you couldn’t make it. Sega was about shoveling the titles out the door. We made 40 games, but by my judgment, only two were really worth playing. We didn’t get out of that churn philosophy until Sonic the Hedgehog.”
The shift from arcades to home consoles was changing the way games were designed. Where arcade games had to “kill the player three times in three minutes” in order to earn money, home consumers wanted longer and more accessible games. Cerny left Sega and returned to California to join Universal in the mid-1990s as vice president of the studio’s interactive group. Even in this management position he was still programming games and designing levels. It was during this time that he met Shuhei Yoshida, a producer in Japan who is now head of Sony’s worldwide studios. Yoshida carried out consumer testing on Cerny’s first project, Crash Bandicoot. “He gave me the testers’ notes,” says Cerny. “It was a litany of criticisms of the game by people who were obviously frustrated by its difficulty. It hit me that arcade-style games were not the sort of products we should be making anymore.”
Having to relearn his design approach in a changing world has defined Cerny’s career. Today he is back to working simultaneously on hardware and software, as lead architect on the PlayStation 4 and designer of one of its launch titles, Knack—a bright and colorful platform game that harks back to Cerny’s work on Crash Bandicoot and has little of the grit and violence of most contemporary video-game blockbusters. “Today’s games are enormously complex,” he says. “The PlayStation 4 controller has 16 buttons and a blockbuster game uses almost all of them. I’ve had decades to get used to the increasing complexity of video games. But these days children learn how to play games on iPads and smartphones, which are buttonless. So we have a gulf between the beginner players and the blockbuster game players. I wanted to make a game and a system that acts as a bridge between the two.”
For Cerny, the key to PlayStation 4’s success when it launches this holiday season is in offering a breadth of experiences, both the sprawling blockbuster epics of the mega-studios and the smaller independently created titles from today’s clutch of bedroom programmers. Sony’s commitment to the so-called indie scene is full-throated and in apparent contrast to Microsoft, which has attracted criticism from some quarters for its seeming lack of interest and support. “We have an opportunity to fundamentally alter the landscape of gaming by bringing these diverse titles together,” says Cerny. “I believe there is a much richer set of game experiences on the horizon.”