A View from Jessica Leber
Will Big Data Get Too Big for the Metric System to Handle?
It’s dizzying to contemplate, but it might not be long before the volume of digital data surpasses the current limit of measures.
In 1991, the General Conference on Weights and Measures met to add a few prefixes to the metric system to handle the very large and very small scales of measurement that scientific advances required. The largest they came up with is the “yotta,” a prefix denoting a 1 followed by 24 zeroes. As in: the diameter of the observable universe is estimated to be 880 “yottameters.”
“Big data” sometimes feels like a buzzword, but it gets more concrete when you imagine that the volume of digital data processed could soon surpass this current upper bound, which only two decades ago sat at the limits of scientists’ imaginations.
That’s at least the prediction of Andrew McAfee, a principal research scientist at MIT’s Center for Digital Business and a prominent thinker on business information technology trends (see “When Machines Do Your Job”). At a conference I attended, and on his blog, McAfee chronicles the “arms race” of organizations declaring first the era of the “terabyte,” then the “petabyte,” and most recently, Cisco’s call for the “zettabyte” era, as measured by its forecast of annual global IP traffic in 2016.
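To put the arms race on a ruler: each step up the ladder multiplies the previous prefix by 1,000. A minimal sketch (the dictionary below is illustrative, not from any standard library):

```python
# The SI prefix ladder from tera up to yotta, the current largest.
# Each prefix is 1,000 times the one before it.
SI_PREFIXES = {
    "tera":  10**12,
    "peta":  10**15,
    "exa":   10**18,
    "zetta": 10**21,
    "yotta": 10**24,
}

for name, value in SI_PREFIXES.items():
    # len(str(value)) - 1 counts the zeroes after the leading 1
    print(f"1 {name}byte = 10^{len(str(value)) - 1} bytes")
```

So Cisco’s “zettabyte era” still leaves one rung of headroom before data volumes exhaust the metric system entirely.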
“Yotta” comes next, and McAfee predicts the global measurement body will be contemplating its successor by the time the decade is up. His favorite contender for a new prefix? The “hella.” As a San Francisco resident, I support the idea.