
With this tool, AI could identify new malware as readily as it recognizes cats

A huge data set will help train algorithms to spot the nasty programs hiding in our computers.
April 18, 2018
Illustration: Chameleon Design

From ransomware to botnets, malware takes seemingly endless forms, and it’s forever proliferating. Try as we might, the humans who would defend our computers from it are drowning in the onslaught, so they are turning to AI for help.

There’s just one problem: machine-learning tools need a lot of data. That’s fine for tasks like computer vision or natural-language processing, where large, open-source data sets are available to teach algorithms what a cat looks like, say, or how words relate to one another. In the world of malware, such a thing hasn’t existed—until now.

This week, the cybersecurity firm Endgame released a large, open-source data set called EMBER (for “Endgame Malware Benchmark for Research”). EMBER is a collection of more than a million representations of benign and malicious Windows portable executable (PE) files, a format in which malware often hides. A team at the company also released AI software that can be trained on the data set. The idea is that if AI is to become a potent weapon in the fight against malware, it needs to know what to look for.

Security firms have a sea of potential data to train their algorithms on, but that’s a mixed blessing. The bad actors who make malware are constantly tweaking their code in an effort to stay ahead of detection, so training on malware samples that are out of date could prove an exercise in futility.

“It’s a game of whack-a-mole,” says Charles Nicholas, a computer science professor at the University of Maryland, Baltimore County.

EMBER is meant to help automated cybersecurity programs keep up.

Instead of a collection of actual files, which could infect the computer of any researcher using them, EMBER contains a kind of avatar for each file, a digital representation that gives an algorithm an idea of the characteristics associated with benign or malicious files without exposing it to the genuine article. 

This should help those in the cybersecurity community quickly train and test out more algorithms, enabling them to construct better and more adaptable malware-hunting AI.
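The workflow this enables mirrors other machine-learning benchmarks: each file is reduced to a fixed-length numeric feature vector, and any off-the-shelf classifier can be trained and scored on those vectors. The sketch below illustrates that idea in Python, assuming the feature vectors have already been extracted to disk; the file names and the choice of model are placeholders for illustration, not Endgame’s actual tooling.

```python
# Minimal sketch: training a malware classifier on precomputed feature
# vectors, in the spirit of EMBER. The file paths below are hypothetical
# placeholders for wherever the vectorized features have been saved.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Each row is a numeric representation of one Windows PE file
# (header fields, section statistics, byte histograms, and so on);
# labels are 0 for benign, 1 for malicious.
X = np.load("ember_features.npy")
y = np.load("ember_labels.npy")

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# A gradient-boosted tree model stands in for whatever classifier
# a researcher wants to benchmark against the data set.
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_train, y_train)

# Score held-out files by how malicious the model judges them to be.
scores = clf.predict_proba(X_test)[:, 1]
print("ROC AUC:", roc_auc_score(y_test, scores))
```

Because the model only ever sees these numeric representations, a researcher can iterate on features and classifiers without ever handling a live malicious binary.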

Of course, making the data set open for anyone to use could also prove a liability if it were to fall into the wrong hands. Malware creators could use the data to design systems that virus-hunting AI won’t recognize, a problem that Hyrum Anderson, Endgame’s technical director of data science, says the company has thought through. Anderson, who worked on EMBER, says that he hopes the benefits of this openness outweigh the risks. Besides, cybercrime is so lucrative that the people behind malware are already well motivated to keep refining their attack tools.

“The hacker will find an example anyway,” says Gerald Friedland, a computer science professor at the University of California, Berkeley.
