A neural network can help spot Covid-19 in chest x-rays

The news: An open-access neural network called COVID-Net, released to the public this week, could help researchers around the world in a joint effort to develop an AI tool that can test people for Covid-19.
What is it? COVID-Net is a convolutional neural network, a type of AI that is particularly good at recognizing images. Developed by Linda Wang and Alexander Wong at the University of Waterloo and the AI firm DarwinAI in Canada, COVID-Net was trained to identify signs of Covid-19 in chest x-rays using 5,941 images taken from 2,839 patients with various lung conditions, including bacterial infections, non-Covid viral infections, and Covid-19. The data set is being provided alongside the tool so that researchers—or anyone who wants to tinker—can explore and tweak it.
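To make the approach concrete, here is a minimal sketch of a three-way chest x-ray classifier in the spirit of COVID-Net. The actual model uses a custom architecture designed by the Waterloo team; this example instead fine-tunes a stock ResNet-18 purely for illustration, and the directory layout ("xray_data/train/<class>/*.png"), label names, and hyperparameters are all assumptions rather than details from the COVID-Net release.

```python
# Illustrative sketch only: fine-tune a pretrained CNN to sort chest x-rays
# into three assumed classes. Not the COVID-Net architecture or data layout.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

CLASSES = ["normal", "non_covid_pneumonia", "covid19"]  # assumed label set

# Basic preprocessing: resize the x-rays and normalize with ImageNet statistics.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),  # x-rays are single-channel
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Expects one subfolder per class, e.g. xray_data/train/covid19/*.png
train_set = datasets.ImageFolder("xray_data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Fine-tune a pretrained ResNet-18 with a three-way output head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1} done, last batch loss {loss.item():.3f}")
```

The key idea this illustrates is the same one behind COVID-Net: a convolutional network learns to map labeled x-ray images to disease categories, and the quality of that mapping depends heavily on the size and diversity of the training set.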
Don’t believe the hype: In the last few weeks, several research teams have announced AI tools that can diagnose Covid-19 from x-rays. But none have been made fully available to the public, making it hard to assess their accuracy. DarwinAI is taking a different approach. It notes that COVID-Net is “by no means a production-ready solution” and encourages others to help turn it into one. DarwinAI—whose CEO Sheldon Fernandez is speaking at EmTech Digital tomorrow—also wants the tool to explain its reasoning, making it easier for health-care workers to use.
One to watch: COVID-Net has yet to prove itself, but it follows in the footsteps of a previous success story. Many of the big advances in computer vision in the last 10 years are thanks to the public release of ImageNet, a large data set of millions of everyday images, and AlexNet, a convolutional neural network that was trained on it. Researchers have been building on both ever since.