Talk about fast. Researchers have sent more than 100 terabits of information per second through a single optical fiber, New Scientist reports. That's a staggering amount of data: every second of transmission could carry roughly three months' worth of HD video footage.
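A quick back-of-the-envelope check of that comparison, assuming a typical HD streaming bitrate of about 13 Mbit/s (an assumed figure; the article doesn't specify one):

```python
# Sanity check: how much HD video fits in one second of a 100 Tbit/s link?
# The ~13 Mbit/s HD bitrate below is an assumption, not a figure from the article.
link_rate_bps = 100e12        # 100 terabits per second
hd_bitrate_bps = 13e6         # assumed bitrate of an HD video stream

seconds_of_video = link_rate_bps / hd_bitrate_bps
days_of_video = seconds_of_video / 86_400   # seconds per day
print(f"about {days_of_video:.0f} days of HD video per second of transmission")
```

At that assumed bitrate, each second of transmission works out to roughly 89 days of video, in line with the "three months" figure.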
The findings were revealed at the recent Optical Fiber Communications Conference in Los Angeles. First, Dayou Qian, a researcher at NEC Laboratories in Princeton, NJ, described how he pushed 101.7 terabits of data per second along 103 miles of fiber. The trick was to combine pulses from 370 different lasers, each carrying its own slice of the data, to multiply the amount of information encoded at once. The light pulses were varied further, in polarization, phase, and amplitude, to pack in still more information, according to reports.
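Assuming the total rate is split evenly across the 370 laser wavelengths (a simplification; the article gives no per-channel breakdown), each laser would carry a little under 300 gigabits per second:

```python
# Even-split estimate of per-laser throughput in NEC's demonstration.
# Assumes every wavelength carries an equal share, which is a simplification.
total_bps = 101.7e12   # 101.7 terabits per second, aggregate
num_lasers = 370       # number of laser wavelengths combined in the fiber

per_laser_gbps = total_bps / num_lasers / 1e9
print(f"~{per_laser_gbps:.0f} Gbit/s per laser")
```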
Breakthroughs often occur in pairs (otherwise we wouldn’t have so many patent disputes). Not to be outdone, Jan Sakaguchi, a researcher at Japan’s National Institute of Information and Communications Technology, had an even more impressive figure to share: 109 terabits per second through a single fiber. His technique was different, and a little more intuitive: he simply used seven light-guiding cores in his fiber rather than the traditional single core. “We introduced a new dimension, spatial multiplication, to increasing transmission capacity,” as he put it to New Scientist.
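By the same even-split logic (again a simplification, not a published per-core breakdown), each of the seven cores would carry on the order of 15 terabits per second:

```python
# Even-split estimate of per-core throughput in the seven-core fiber.
total_bps = 109e12   # 109 terabits per second, aggregate
num_cores = 7        # light-guiding cores in the fiber

per_core_tbps = total_bps / num_cores / 1e12
print(f"~{per_core_tbps:.1f} Tbit/s per core")
```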
Does this mean your page-loading woes are over for good? According to the report, the finding has little immediate bearing on day-to-day Internet use. The numbers involved are so large that they matter less to individual consumers than to the major data centers powering giants like Google and Facebook (though presumably, any time saved there might ultimately trickle down to you). Even at the infrastructure level, these figures simply dwarf current commercial need. 100 terabits per second? One of today’s most heavily trafficked broadband routes, between New York and Washington, DC, carries only a few terabits per second–nowhere near 100. Still, with the rise of video streaming and other data-intensive services, it can’t hurt to have this technology in our back pocket. “Traffic has been growing about 50% per year for the last few years,” Tim Strong of Telegeography Research told New Scientist.
As more and more cities come online in serious, data-guzzling ways–as we enter what’s been termed the Terabit Age–it certainly won’t hurt to have reached what one NEC researcher called a “critical milestone in fiber capacity.”