
The Dizzying Data Rate Conundrum

The dizzying rate at which data is being produced in many experiments raises the question of how to store it for future generations.

When it is switched on later this year, the Large Hadron Collider (LHC) at CERN will smash particles together at the rate of 40 million collisions per second. It’s a process that will generate several petabytes of data per year, and one that the LHC has been set up specifically to handle. The data will be kicked, prodded, and crunched before being analyzed and eventually released into the community as a scientific paper for publication.
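The numbers in that paragraph are worth a quick sanity check. Here is a rough back-of-envelope sketch in Python; only the 40-million-collisions-per-second figure comes from the article, while the event size, post-filtering recording rate, and annual running time are illustrative assumptions. It shows why the raw collision stream cannot simply be written to disk, and how aggressive online filtering plausibly gets the recorded output down to "several petabytes per year."

```python
# Back-of-envelope check of the LHC data-rate figures quoted above.
# Only the 40 MHz collision rate is from the article; the event size,
# trigger (filtering) rate, and beam time are illustrative assumptions.

COLLISION_RATE_HZ = 40e6      # collisions per second (from the article)
EVENT_SIZE_BYTES = 1e6        # ~1 MB per event (assumption)
TRIGGER_RATE_HZ = 200         # events kept after online filtering (assumption)
BEAM_SECONDS_PER_YEAR = 1e7   # rough "live" running time per year (assumption)

raw_rate = COLLISION_RATE_HZ * EVENT_SIZE_BYTES      # bytes/s before filtering
recorded_rate = TRIGGER_RATE_HZ * EVENT_SIZE_BYTES   # bytes/s after filtering
recorded_per_year = recorded_rate * BEAM_SECONDS_PER_YEAR

print(f"Unfiltered stream: {raw_rate / 1e12:.0f} TB/s")         # ~40 TB/s
print(f"Recorded per year: {recorded_per_year / 1e15:.0f} PB")  # ~2 PB/yr
```

Under these assumptions the unfiltered stream would be tens of terabytes per second, far beyond anything storable, while the filtered output lands in the low petabytes per year, consistent with the figure quoted above.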

That’s its short-term fate. The question is what to do with the data in the long term. Should it be archived somewhere and kept for eternity, and if so, how (and why)?

The LHC is emblematic of a broader problem in science, say André Holzner and his buddies at CERN: a rapidly growing body of data from increasingly sophisticated experiments. Holzner and co argue that an increasingly pressing task is to understand how this data is being kept in disparate facilities around the world, so that future repositories can be designed to do the job properly.

With that in mind, and with generous funding from the European Union, they've surveyed more than 1,000 high-energy physicists linked to CERN and published the results on the arXiv.

There seems to be general agreement that data preservation is hugely important. But strangely, there is less agreement over what sort of data should be stored: for example, whether to preserve the raw data itself or some higher-level analysis of it. Stranger still is the broad range of opinion over why the data should be kept at all. Only 60 percent of respondents think the data should be kept so that conclusions can be checked in the future.

Clearly, the broad concern over the issue is matched only by the widespread befuddlement over what to do about it.

Which spells bad news for CERN and other data producers. CERN is about to switch on one of the greatest data fire hoses the world has ever seen. If there is to be any multilateral agreement over what to do with the data it and other projects produce in the long term, the question needs to be settled sooner rather than later.

Ref: arxiv.org/abs/0906.0485: First results from the PARSE.Insight project: HEP survey on data preservation, re-use and (open) access
