MIT Technology Review

How to Handle the World’s Largest Digital Images

A new telescope will use the world’s largest digital camera to capture 20 terabytes of image data every day.

Much has been made of the ‘unprecedented’ scale of the IT infrastructure required to store all the data coming from the world’s largest physics experiment – the Large Hadron Collider (LHC) near Geneva, Switzerland. But another gigantic science experiment, one you’ve probably never heard of, will someday pump out data on a similar scale. And unlike the LHC’s output, the data it produces will be comprehensible to non-scientists, and made freely available.

Rendering of the proposed Large Synoptic Survey Telescope, courtesy LSST Corporation

Once it’s complete, the Large Synoptic Survey Telescope atop the Cerro Pachón ridge in Chile will sport the world’s largest digital camera. It will peer deep into space yet have a field of view unprecedented in modern telescopes: ten square degrees of sky, captured through a collecting area of forty square meters. The telescope will take 800 panoramic pictures a night, covering the entire night sky twice each week.
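Those numbers can be sanity-checked with a little arithmetic. In the sketch below, the 800 exposures and ten-square-degree field come from the article; the figure of roughly 20,000 square degrees for the sky observable from a single site is an assumption of mine, not something the article states.

```python
# Back-of-envelope check of the twice-weekly sky-coverage claim.
# ASSUMPTION (not from the article): roughly 20,000 of the celestial
# sphere's ~41,253 square degrees are observable from one site.

FIELD_OF_VIEW_SQ_DEG = 10       # per exposure (from the article)
EXPOSURES_PER_NIGHT = 800       # from the article
OBSERVABLE_SKY_SQ_DEG = 20_000  # assumed

covered_per_night = FIELD_OF_VIEW_SQ_DEG * EXPOSURES_PER_NIGHT
nights_per_full_pass = OBSERVABLE_SKY_SQ_DEG / covered_per_night

print(f"Covered per night: {covered_per_night:,} sq deg")       # 8,000
print(f"Nights per full pass: {nights_per_full_pass:.1f}")      # 2.5
print(f"Full passes per week: {7 / nights_per_full_pass:.1f}")  # ~2.8
```

Allowing for overlapping fields and nights lost to weather, that is consistent with the article’s twice-a-week figure.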

Front view of the LSST, courtesy LSST Corporation

The result will be an unfathomably huge photo collection: 20 terabytes of data stored every 24 hours. Running all-out (and telescopes this expensive are usually booked year-round), that’s 7.3 petabytes of data a year – about half the 15 petabytes the LHC produces annually. (To put that in perspective, over its lifetime the LHC will produce as much data as all the words spoken by humankind since the species first appeared on Earth.)
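For the curious, the arithmetic behind those totals is easy to reproduce. In the sketch below, the 25-gigabyte per-image figure is my own extrapolation (nightly total divided by number of exposures), not a number the article gives.

```python
# Reproduce the article's data-volume arithmetic.
TB_PER_DAY = 20            # from the article
DAYS_PER_YEAR = 365
EXPOSURES_PER_NIGHT = 800  # from the article
LHC_PB_PER_YEAR = 15       # from the article

pb_per_year = TB_PER_DAY * DAYS_PER_YEAR / 1000         # 7.3 PB
gb_per_image = TB_PER_DAY * 1000 / EXPOSURES_PER_NIGHT  # 25 GB (extrapolated)

print(f"Annual archive growth: {pb_per_year:.1f} PB")
print(f"Implied size per image: {gb_per_image:.0f} GB")
print(f"Share of the LHC's output: {pb_per_year / LHC_PB_PER_YEAR:.0%}")  # ~49%
```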

To handle that much data, all of the raw images from the telescope will be transferred once a day to the Archive Center at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, where 100 teraflops of processing power will digest and archive them on what will initially be 15 petabytes of storage (to be expanded as the experiment continues).
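Shipping 20 terabytes every day implies a hefty sustained network link. As a rough sketch (the article doesn’t describe the transfer mechanism, so the single-sustained-link framing and decimal terabytes are assumptions):

```python
# Rough sustained bandwidth needed to move 20 TB of raw images
# to the Archive Center every 24 hours (decimal terabytes assumed).
BYTES_PER_TB = 10**12
SECONDS_PER_DAY = 24 * 60 * 60

daily_bits = 20 * BYTES_PER_TB * 8
sustained_gbps = daily_bits / SECONDS_PER_DAY / 10**9

print(f"Sustained throughput: {sustained_gbps:.2f} Gbit/s")  # ~1.85 Gbit/s
```

In other words, on the order of two gigabits per second around the clock, before any retransmissions or overhead.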

The results will be freely available to the public via existing open standards, and could help with everything from tracking killer asteroids to unraveling the mysteries of dark energy and dark matter. According to the LSST’s homepage, “Anyone with a computer will be able to fly through the Universe, zooming past objects a hundred million times fainter than can be observed with the unaided eye.”

Follow Mims on Twitter or contact him via email.
