Digital Archaeologists Excavate Chips, Not Dirt
Want to know how the 6502 CPU, the heart and soul of the beloved Commodore 64, Atari 2600, Apple II, and even the Nintendo Entertainment System, actually worked?
Too bad. After 30 years, even the guys who designed this chip don’t remember how it works. All that’s left are some sketchy paper schematics, and they’re not terribly helpful.
But a new field, “digital archaeology,” rides to the rescue. Rather than the pick and trowel, its tools are successive acid baths that strip away a chip’s layers one by one, revealing the guts of the microprocessors that launched the personal computing revolution.
As outlined by Nikhil Swaminathan, senior editor at Archaeology magazine, the process was pioneered by brothers Barry and Brian Silverman, as well as Greg James, a software engineer. They’ve chronicled the results of their work at Visual6502.org, where they reveal that their understanding of the 6502 has become so sophisticated that they have not merely mapped all of its transistors and connections; they’ve actually managed to simulate the workings of the entire chip.
One fan of their work even managed to reproduce their design in a field-programmable gate array, a physically configurable chip that subsequently ran Atari 2600 games without a hitch.
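The kind of simulation described above works at the switch level: the chip is treated as a netlist of transistors, and a node’s logic value is found by flood-filling through every transistor that is currently conducting. Here is a minimal sketch of that idea in Python; the one-transistor netlist (a single NMOS inverter) is purely illustrative and not taken from the real 6502.

```python
# Minimal switch-level (transistor netlist) simulator sketch,
# in the spirit of Visual6502's approach. The netlist below
# (one NMOS inverter) is hypothetical, not from the actual chip.

GND, PWR = "gnd", "pwr"

class Node:
    def __init__(self, pullup=False):
        self.pullup = pullup   # depletion pull-up toward VCC
        self.value = False

class Transistor:
    def __init__(self, gate, c1, c2):
        self.gate, self.c1, self.c2 = gate, c1, c2

def recalc(nodes, transistors, name):
    # Flood-fill the group of nodes joined through conducting transistors.
    group, todo = set(), [name]
    while todo:
        n = todo.pop()
        if n in group:
            continue
        group.add(n)
        if n in (GND, PWR):
            continue
        for t in transistors:
            if nodes[t.gate].value:           # transistor switched on?
                if t.c1 == n: todo.append(t.c2)
                if t.c2 == n: todo.append(t.c1)
    # Resolve the group's value: ground wins, then power, then pull-ups.
    if GND in group:
        val = False
    elif PWR in group:
        val = True
    else:
        val = any(nodes[n].pullup for n in group if n in nodes)
    for n in group:
        if n in nodes:
            nodes[n].value = val

# Hypothetical netlist: an NMOS inverter (pull-up output, one transistor to ground)
nodes = {"in": Node(), "out": Node(pullup=True)}
transistors = [Transistor(gate="in", c1="out", c2=GND)]

nodes["in"].value = True
recalc(nodes, transistors, "out")
print(nodes["out"].value)   # False: a high input pulls the output low

nodes["in"].value = False
recalc(nodes, transistors, "out")
print(nodes["out"].value)   # True: the pull-up restores the output
```

The real Visual6502 simulator applies the same recalculation to thousands of nodes and transistors traced from die photographs, which is what lets it run actual 6502 machine code transistor by transistor.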
To say that their results are as beautiful as they are historically significant is to underestimate the weird power of cruising across the surface of a 100 MB maximum-resolution blowup of one of the support chips that powered the Nintendo we all grew up on.
Without work like this, it’s entirely possible that understanding of by-now “ancient” technologies could disappear from our collective knowledge forever.
“Digital media will not survive by accident,” explains [archaeologist Christopher] Witmore. “If you leave a 3.5-inch floppy disk in a tomb next to a rolled-up papyrus, you can unroll that papyrus and engage with it in a way that you can’t with a floppy, which requires you to bring other materials to bear,” like a particular computer or knowledge of a chip capable of reading the data on the disk.