DeepMind Will Use AI to Streamline Targeted Cancer Treatment

Machine learning could cut the time it takes to plan a patient’s radiotherapy treatment by hours.
August 31, 2016

Working out how to zap a tumor with radiation is a laborious process for physicians. Google’s machine-learning division, DeepMind, thinks AI can help ease the burden.

When medics apply radiotherapy to a cancer patient, they must carefully determine which parts of the body to expose to radiation in order to kill the tumor while sparing as much of the healthy surrounding tissue as possible. The process, known as segmentation, requires the doctor to manually draw areas that can and can’t be treated onto a 3-D scan of the patient’s tumor site. It is particularly complex for head and neck cancers, in which the tumor often sits immediately next to many important anatomical features.

Now, though, DeepMind will work with University College Hospital in London to develop an artificial-intelligence system that can automate the process. DeepMind will analyze 700 anonymized scans from former patients who suffered from head and neck cancers, with the goal of creating an algorithm that learns how physicians make decisions about this part of the treatment process and ultimately segments the scans automatically.

“Clinicians will remain responsible for deciding radiotherapy treatment plans, but it is hoped that the segmentation process could be reduced from up to four hours to around an hour,” explains DeepMind.

In time, the DeepMind team hopes, the same algorithm might find application in treating cancers elsewhere in the body.

IBM’s Watson supercomputer has also been applying machine learning to personalized cancer treatment, though its approach is a little more bookish. It’s currently drawing on 600,000 medical evidence reports and 1.5 million patient records and clinical trials to help doctors develop better treatment plans for cancer patients.

This isn’t DeepMind’s first foray into medical research, either—in fact, it is the third project the company has announced in collaboration with the U.K.’s National Health Service. After coming under fire earlier in the year, when an app project appeared to give DeepMind free access to 1.6 million patients’ records, the research outfit recently announced that it was helping to spot the early signs of visual degeneration by sifting through a million eye scans.

Perhaps it’s working its way down the body.

(Read more: DeepMind, “DeepMind’s First Medical Research Gig Will Use AI to Diagnose Eye Disease”)
