Talk about fast. Researchers have reported sending over 100 terabits of information per second through a single optical fiber, New Scientist reports. That's a staggering amount of data: enough to carry three months' worth of HD video footage every second.
The findings were revealed at the Optical Fiber Communications Conference, held recently in Los Angeles. First, Dayou Qian, a researcher at NEC Laboratories in Princeton, NJ, shared how he managed to push 101.7 terabits of data per second along 103 miles of fiber. The trick involved combining pulses from 370 different lasers to multiply the amount of information that could be encoded at once. The light pulses were further varied, using different polarizations, phases, and amplitudes of the light waves, to pack in still more information, according to reports.
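As a rough back-of-the-envelope check, dividing the reported aggregate rate across the 370 laser channels shows how much each wavelength must carry. Only the 101.7 Tb/s total and the 370-laser count come from the report; the arithmetic below is purely illustrative.

```python
# Back-of-the-envelope: aggregate rate split evenly across 370 laser channels.
# The 101.7 Tb/s and 370-laser figures are from the report; the even split
# is an illustrative assumption.
AGGREGATE_TBPS = 101.7   # total throughput reported
NUM_LASERS = 370         # distinct laser wavelengths used

per_channel_gbps = AGGREGATE_TBPS * 1000 / NUM_LASERS
print(f"Each laser carries roughly {per_channel_gbps:.0f} Gb/s")
```

That works out to roughly 275 Gb/s per laser, which is why the extra polarization, phase, and amplitude encoding matters: no single simple on-off channel could carry that much.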
Breakthroughs often occur in pairs (otherwise we wouldn't have so many patent disputes). Not to be outdone, Jun Sakaguchi, a researcher at Japan's National Institute of Information and Communications Technology, had an even more impressive figure to share: he managed to squeeze 109 terabits per second through a fiber. His technique was different, and a little more intuitive: rather than the traditional single core, his fiber contained seven light-guiding cores. "We introduced a new dimension, spatial multiplication, to increasing transmission capacity," as he put it to New Scientist.
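The appeal of the multi-core approach is its linearity: total capacity scales directly with the number of cores. A quick sketch with the reported figures (the even split across cores is an assumption for illustration):

```python
# Spatial multiplexing: capacity scales with the number of cores.
# 109 Tb/s and 7 cores are from the report; assuming an even split,
# each core carries about 15.6 Tb/s.
TOTAL_TBPS = 109
NUM_CORES = 7

per_core_tbps = TOTAL_TBPS / NUM_CORES
print(f"Roughly {per_core_tbps:.1f} Tb/s per core")
```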
Does this mean your page-loading woes are over? According to the report, the finding has little immediate bearing on day-to-day Internet usage. The numbers involved are so large that they matter less to the individual consumer than to major data centers, like those fueling giants such as Google and Facebook (though presumably, any time saved there might ultimately benefit you in one way or another). Even at the infrastructural level, these figures simply dwarf current commercial need. 100 terabits per second? One of today's most heavily trafficked broadband routes, between New York and Washington, DC, carries only a handful of terabits per second, nowhere near 100. Still, with the rise of video streaming and other data-intensive services, it can't hurt to have this technology in our back pocket. "Traffic has been growing about 50 percent per year for the last few years," Tim Strong of Telegeography Research told New Scientist.
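To put that 50 percent annual growth in context, a quick compound-growth sketch shows how soon demand could catch up with a 100 Tb/s link. The 5 Tb/s starting point is a hypothetical stand-in for the article's "handful of terabits per second," not a figure from the report:

```python
import math

# How long until 50%/year traffic growth fills a 100 Tb/s pipe?
# current_tbps = 5.0 is a hypothetical stand-in for "a handful";
# the 50%/year growth rate is Telegeography's figure quoted above.
current_tbps = 5.0
capacity_tbps = 100.0
growth_rate = 0.50

years = math.log(capacity_tbps / current_tbps) / math.log(1 + growth_rate)
print(f"About {years:.1f} years of 50% growth to reach 100 Tb/s")
```

Under those assumptions the headroom lasts less than a decade, which is why researchers treat today's "excessive" capacity as tomorrow's baseline.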
As more and more cities come online in serious, data-guzzling ways, as we enter what's been termed the Terabit Age, it certainly won't hurt to have hit what one NEC researcher called a "critical milestone in fiber capacity."