
Patchwork Computing

Researchers create intelligent–and machine washable!–fabrics.

Researchers at the Media Lab have created modular, computerized patches of fabric that can be pieced together to form items of clothing or accessories. These pieces of fabric can also provide different information and services depending on their configuration.

A purse assembled from the patches can tell when it’s dark outside and turn on an interior light, or it can inform its owner if her wallet is missing. The purse can also be torn apart and quickly reassembled into a scarf. In addition to keeping its wearer’s neck warm, the scarf can play music it has downloaded from the Internet via Bluetooth chips or report the amount of smog in the air. And as their creators, V. Michael Bove Jr. ‘83, SM ‘85, PhD ‘89, a principal research scientist in the Media Lab, and graduate student Gauri Nanda, are quick to point out, the patches are even machine washable. Each contains a variety of sensors and processors and communicates with its neighbors through metallic, Velcro-like edging.

Bove and Nanda expect that others will come up with more imaginative and useful shapes for their invention, which they hope will be ready for commercialization in as little as a year. In the meantime, they are working on enabling the patches to wirelessly download new functions from the Internet.
