Textiles and the fibers that compose them are experiencing a sort of high-tech renaissance lately. Researchers are finding ways to turn silk into sensors by adding biological molecules to it, and to turn cotton sheets into electronic fabric by bathing them in a solution of nanotubes. The idea is to use these electronic textiles, which are flexible and can be worn comfortably, to sense such things as blood from a soldier's wound or pathogens circulating in the air.
Now researchers at MIT have integrated a collection of light sensors into polymer fibers, creating a new type of camera. Yoel Fink, a professor of materials sciences and engineering and the lead researcher on the project, notes that a standard camera requires lenses that are usually rigid and heavy. A camera made from fibers, however, could be lightweight, robust, and even foldable. Although Fink admits that the applications aren’t yet well defined, he suggests that such a fiber-based camera could be used in a large foldable telescope or integrated into soldiers’ uniforms.
Previously, Fink’s team showed that it’s possible to integrate semiconducting materials into fibers, creating long, flexible temperature and light sensors that can be woven into varying shapes and sizes. In the researchers’ most recent work, they integrate eight sensors into a single polymer fiber, more than ever before.
To make the camera, the researchers embedded the eight semiconducting light sensors in a polymer cylinder with a diameter of 25 millimeters, controlling the sensors’ spacing and angle within the fiber. Once the sensors, made of a type of semiconducting glass, were in position, the polymer cylinder was heated and then stretched so that its diameter shrank to hundreds of micrometers, a process identical to the way commercial fiber is made for telecommunications applications. The stretching preserves the orientation of the sensors.
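The geometry of this draw-down can be checked with simple arithmetic. Below is a minimal sketch assuming a final diameter of 500 micrometers (the article says only "hundreds of micrometers", so that figure is an illustrative choice): cross-sectional features shrink in proportion to the diameter, and, since the polymer's volume is conserved, the fiber's length grows by the square of that factor.

```python
# Illustrative draw-down arithmetic for the process described above.
# The 500-micrometer final diameter is an assumed example value.
preform_diameter_m = 25e-3   # 25 mm preform, from the article
fiber_diameter_m = 500e-6    # assumed final diameter

# Lateral features (sensor spacing, sensor size) shrink by this factor
lateral_shrink = preform_diameter_m / fiber_diameter_m

# Volume conservation: length grows by the square of the lateral shrink
length_stretch = lateral_shrink ** 2

print(lateral_shrink, length_stretch)  # 50x thinner, 2500x longer
```

This is why a centimeter-scale preform with carefully placed sensors yields many meters of fiber with the same internal layout at micrometer scale.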
Fabien Sorin, the postdoctoral researcher who developed the fiber camera, says that he made a 36-by-36 grid of fibers and connected the fibers’ semiconducting sensors to electrodes. When light hits the semiconductors, it displaces electrons within the material, creating an electrical current. The currents from the fibers are fed into algorithms, running on an attached computer, that reconstruct the image of an object placed near the sheet of fiber.
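The article does not detail the MIT team's reconstruction algorithm, but the basic idea of recovering an image from per-fiber currents can be sketched with a toy back-projection. Assume, for illustration only, that each horizontal fiber reports the summed light intensity along its row and each vertical fiber along its column; a coarse image estimate is then the normalized outer product of the two readout vectors:

```python
# Toy back-projection from per-fiber photocurrents (not the MIT team's
# actual algorithm, which the article does not describe). Each fiber is
# assumed to report the total light intensity along its length.
def reconstruct(row_currents, col_currents):
    """Estimate a coarse image from row- and column-fiber readouts."""
    total = sum(row_currents)
    if total == 0:
        return [[0.0] * len(col_currents) for _ in row_currents]
    # Each pixel estimate is proportional to the product of its
    # row total and column total.
    return [[r * c / total for c in col_currents] for r in row_currents]

# Example: a 4x4 scene with one bright spot at row 1, column 2
scene = [[0.0] * 4 for _ in range(4)]
scene[1][2] = 1.0
rows = [sum(line) for line in scene]                    # horizontal fibers
cols = [sum(line[j] for line in scene) for j in range(4)]  # vertical fibers
image = reconstruct(rows, cols)  # bright spot recovered at [1][2]
```

A single bright spot is recovered exactly; overlapping sources produce ghost artifacts, which is why a real system would use a more sophisticated inversion than this outer-product sketch.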
The eight sensors are grouped in pairs consisting of an inner and outer sensor, Sorin says. “If you know the thickness of the first layer, and you know the type of material, then you can reconstruct the energy of the photon because this energy is directly related to how deep a photon can penetrate into a material.” In other words, the inner sensor provides information that lets the researchers find the energy, which corresponds to the wavelength, or color, of light.
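Sorin's two-layer scheme can be illustrated with a toy attenuation model. Assuming Beer-Lambert absorption (intensity decays exponentially with depth, at a rate set by an energy-dependent absorption coefficient), the ratio of the inner sensor's signal to the outer sensor's reveals that coefficient; mapping the coefficient back to photon energy would then use the glass's known absorption curve. Everything below is a hedged sketch, not real material data:

```python
import math

# Toy model of the inner/outer sensor pair described above.
# Assumption: light decays as exp(-alpha * depth) (Beer-Lambert), and the
# inner/outer signal ratio is approximately exp(-alpha * layer_thickness).
def alpha_from_signals(outer_signal, inner_signal, layer_thickness):
    """Infer the absorption coefficient from the two sensor signals."""
    ratio = inner_signal / outer_signal
    return -math.log(ratio) / layer_thickness

# Example: with alpha = 2.0 per unit depth and a layer 0.5 units thick,
# the inner sensor sees exp(-1) of what the outer sensor sees.
alpha = alpha_from_signals(1.0, math.exp(-1.0), 0.5)
```

In a real fiber, converting the recovered absorption coefficient to a wavelength requires the measured absorption spectrum of the semiconducting glass, which this sketch simply assumes is known and monotonic.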