
Better Brain Imaging Could Show Computers a Smarter Way to Learn

Using cutting-edge imaging to study the inner workings of our brains could lead to more powerful and useful machine-learning algorithms.
February 4, 2016

Machine learning is an extremely clever approach to computer programming. Instead of having to carefully write out instructions for a particular task, you just feed millions of examples into a very powerful computer and, essentially, let it write the program itself.
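To make that concrete, here is a minimal sketch of what "learning from examples" looks like in practice. It is not code from the CMU project, just an illustration using the scikit-learn library: rather than writing rules to tell two groups of points apart, we hand the computer labeled examples and let it fit the rule.

```python
# Illustration only: a classifier "writes its own rule" from labeled examples.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Labeled examples: class 0 clusters near (0, 0), class 1 clusters near (3, 3).
X = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(3, 1, (500, 2))])
y = np.array([0] * 500 + [1] * 500)

# No hand-written rules: the decision boundary is fitted from the data.
model = LogisticRegression().fit(X, y)

# The learned rule now labels points we never wrote explicit logic for.
print(model.predict([[0.2, -0.1], [2.8, 3.1]]))  # -> [0 1]
```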

Many of the gadgets and online services we take for granted today, like Web search, voice recognition, and image tagging, make use of some form of machine learning. And companies that have oodles of user data (Google, Facebook, Apple, Walmart, etc.) are nicely placed to ride this trend to riches.

A new $12 million project at Carnegie Mellon University could make machine learning even more powerful by uncovering ways to teach computers more efficiently while using much less data.

The five-year effort will use a relatively new technique, two-photon calcium imaging, to study the way visual information is processed in the brain. The funding comes through President Obama’s BRAIN Initiative, and it is a good example of the near-term benefits that powerful new brain imaging techniques could have.

Many of the best machine-learning algorithms are, in fact, already loosely based on the functioning of the brain. But these models are crude approximations, and they fail to capture some basic features of biological networks.
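In practice, that "loose" analogy mostly amounts to layers of artificial neurons that compute a weighted sum of their inputs and pass it through a simple nonlinearity. The following sketch is my own illustration, not the project's code:

```python
# Illustration only: a tiny brain-inspired network is just weighted sums
# followed by a nonlinearity, a crude stand-in for neurons responding to input.
import numpy as np

rng = np.random.default_rng(1)

def relu(z):
    # Crude abstraction of a neuron's response: silent below threshold,
    # proportional above it.
    return np.maximum(0.0, z)

# A small two-layer network: 4 input features -> 8 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x):
    hidden = relu(x @ W1 + b1)  # each hidden unit: weighted sum + nonlinearity
    return hidden @ W2 + b2     # output scores for two classes

print(forward(rng.normal(size=4)))
```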

Sandra Kuhlman, a professor at CMU, used fluorescent imaging to capture individual brain cells (identified with arrows).

“Powerful as they are, [these algorithms] aren’t nearly as efficient or powerful as those used by the human brain,” said Tai Sing Lee, a professor of computer science at CMU who is leading the effort. “For instance, to learn to recognize an object, a computer might need to be shown thousands of labeled examples and taught in a supervised manner, while a person would require only a handful and might not need supervision.”

Lee will collaborate with Sandra Kuhlman, a professor of biological sciences, also at CMU, and Alan Yuille, a professor of cognitive sciences at Johns Hopkins University.

It isn’t just neuroscience that could help us develop better machine-learning approaches. Some cognitive scientists are drawing on observations from psychology to build clever new learning systems.
