The rapid increase in the capacity of storage and memory technologies has had a remarkable impact on computing in recent years. Many of today’s most popular consumer electronics are only possible because of the availability of cheap, high-density memory.
Examples include iPods capable of holding up to 15,000 songs, flash memory cards in digital cameras that store hundreds of photos, and DVDs able to hold full-length movies with ease.
Storage technologies have for decades enjoyed their own version of Moore’s Law. Moreover, the growth in storage capacity was driven by the simultaneous advancement of several different technologies, including magnetic hard drives and optical storage media such as CDs and DVDs.
As TR senior writer Gregory T. Huang explains in this issue’s cover story, “Holographic Memory,” a new type of memory called holographic storage is on the verge of commercialization and is likely to continue, and perhaps even accelerate, these impressive advances. A holographic storage system writes data onto a polymer disc in three dimensions, dramatically boosting its ability to pack in the bytes.
The success of holographic storage is not guaranteed, of course. Like any new technology in the microelectronics marketplace, it will face plenty of competition, both from continual improvements in existing technologies and from other new forms of memory. Other optical discs in development store 100 gigabytes each, and IBM’s experimental nanotech product Millipede has the potential to far surpass that capacity.
But for most of us, the question of which technology will prevail is not nearly as interesting as the question of what changes are coming as a result of this enormous boost to computer memory. Writer Huang explores some of the possibilities.