Tay, the rogue chatbot
“helooooooo world!!!” One Wednesday in March, Microsoft unveiled Tay, a female chatbot with its own Twitter account. Microsoft described Tay as a “machine learning project, designed for human engagement” that would converse with 18- to 24-year-olds, learn from them, and get smarter over time. Within 24 hours, however, Tay was bragging about smoking drugs, asking for sex, and opining that “Hitler was right …” and “feminists should … burn in hell.” Peter Lee, head of Microsoft Research, decommissioned the chatbot the next day. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay,” said Lee. “Tay is now offline.”