Object storage for digital-age challenges

Once used primarily for archiving and backup, the technology is gaining traction for its mammoth capacity, processing power, and cybersecurity capabilities.

In association with Hitachi Vantara

When Mastercard wanted to improve the speed and security of credit card transactions, when Baylor College of Medicine was scaling up its human genomic sequencing program, and when toymaker Spin Master was expanding into online video games and television shows, they all turned to object storage technology to facilitate the processing of massive amounts of data.

Object storage, with its virtually infinite capacity and low cost, has a long history of being deployed for backup, archiving, disaster recovery, and regulatory compliance. But the demands of today’s data-centric organizations have brought the technology from the dusty storage closet to the center stage of digital transformation.

For any tech decision-maker thinking about an overall data strategy, having a large central repository, also known as a data lake, is the preferred approach—it helps break down silos and aggregate data from multiple sources for the type of data analysis that delivers value to the business. Object storage is the most effective underlying technology for applying data analytics, machine learning, and artificial intelligence to those vast data stores, says Scott Sinclair, storage analyst at market researcher Enterprise Strategy Group.

“The biggest advantage of object storage is to add more value to primary data. It doesn’t just store files; it adds context,” says Paul Schindeler, a former IDC analyst and currently CEO of the Dutch consultancy Data Matters. An object store includes metadata, or labels, which enables companies to easily search vast volumes of data, determine a file’s origin and whether it has been altered, and, more important, set policies and keep auditable records of who can see, open, and download each file.
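The metadata-driven model Schindeler describes can be illustrated with a minimal in-memory sketch. This is not any vendor's API; the names (`ObjectStore`, `put`, `search`) and the flat key namespace are illustrative assumptions, chosen to show how labels attached to an object enable search and auditable access without opening the data itself:

```python
# Minimal in-memory sketch of the object-storage model described above.
# All class and method names here are illustrative, not a real product API.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StoredObject:
    data: bytes
    metadata: dict                                  # key/value labels attached to the object
    audit_log: list = field(default_factory=list)   # auditable access records

class ObjectStore:
    """Flat namespace: every object is addressed by a unique key."""
    def __init__(self):
        self._objects = {}

    def put(self, key, data, metadata):
        self._objects[key] = StoredObject(data, dict(metadata))

    def get(self, key, user):
        obj = self._objects[key]
        # Record who accessed the object, and when (the auditable record).
        obj.audit_log.append((user, datetime.now(timezone.utc)))
        return obj.data

    def search(self, **criteria):
        """Return keys whose metadata matches every given label."""
        return [key for key, obj in self._objects.items()
                if all(obj.metadata.get(k) == v for k, v in criteria.items())]

store = ObjectStore()
store.put("scans/001", b"...", {"origin": "lab-A", "modality": "genome"})
store.put("scans/002", b"...", {"origin": "lab-B", "modality": "genome"})
hits = store.search(origin="lab-A")   # metadata search; no object is ever opened
```

In a real deployment the same idea appears as user-defined metadata on S3-style objects, and the audit trail as access logging on the bucket; the sketch only shows why labels, rather than file paths, make vast stores searchable.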

Most organizations today use a mix of storage types: file storage, block storage, and object storage. But the use of object storage is surging for a number of reasons: speed, scalability, searchability, security, data integrity, reliability, and protection against ransomware. And it’s the wave of the future when it comes to big data analytics.

Object storage, then and now

Object storage was developed in the 1990s to handle data stores that were simply too large to be backed up with file and block storage, says Sinclair. When introduced, the almost infinite scalability, low cost, and immutability of object storage made it ideal for backup and recovery and long-term archiving and compliance with regulations such as the Health Insurance Portability and Accountability Act, in health care, and Sarbanes-Oxley, in banking.  

The next watershed event in the evolution of object storage was the ascendance of cloud storage. Cloud services vendor Amazon Web Services chose object storage architecture as the foundation for its popular Simple Storage Service (S3), and object storage has become the standard platform for all cloud storage, whether from Google, Microsoft, or others. In addition, S3 protocols have become the industry standard for modern data-centric applications, whether they run in the cloud or in a corporate data center.

More recently, organizations have come to the realization that they need to do more than just park and protect their data; they need to extract value from vast troves of historical data, as well as from new data sources and data types, such as internet-of-things sensor data, video, and images. That’s where object storage really shines. It has become the platform on which organizations build their data analytics capabilities to modernize their computing environments, foster innovation, and drive digital transformation.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.
