Tay, the rogue chatbot
“helooooooo world!!!” One Wednesday in March, Microsoft unveiled Tay, a female chatbot with its own Twitter account. Microsoft described Tay as a “machine learning project, designed for human engagement” that would converse with 18- to 24-year-olds, learn from them, and get smarter over time. Within 24 hours, however, Tay was bragging about smoking drugs, asking for sex, and opining that “Hitler was right …” and “feminists should … burn in hell.” Peter Lee, head of Microsoft Research, decommissioned the chatbot the next day. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay,” said Lee. “Tay is now offline.”