
Intelligent Machines

Photo: Jeremy Portje

Microsoft’s neo-Nazi sexbot was a great lesson for makers of AI assistants

Yandex’s head of machine intelligence says Microsoft’s Tay showed how important it is to fix AI problems fast.

Remember Tay, the chatbot Microsoft unleashed on Twitter and other social platforms two years ago that quickly turned into a racist, sex-crazed neo-Nazi?

What started out as an entertaining social experiment—get regular people to talk to a chatbot so it could learn while they, hopefully, had fun—became a nightmare for Tay’s creators. Users soon figured out how to make Tay say awful things. Microsoft took the chatbot offline after less than a day.

Yet Misha Bilenko, head of machine intelligence and research at Russian tech giant Yandex, thinks it was a boon to the field of AI helpers.

Speaking at MIT Technology Review’s annual EmTech Digital conference in San Francisco on Tuesday, Bilenko said Tay’s bugs—like the bot’s vulnerability to being gamed into learning or repeating offensive phrases—taught great lessons about what can go wrong.

The way Tay rapidly morphed from a fun-loving bot (she was trained to have the personality of a facetious 19-year-old) into an AI monster, he said, showed how important it is to be able to fix problems quickly, which is not easy to do. It also illustrated how much people tend to anthropomorphize AI, believing that it has deep-seated beliefs rather than seeing it as a statistical machine.

“Microsoft took the flak for it, but looking back, it’s a really useful case study,” he said.

Chatbots and intelligent assistants have changed considerably since 2016; they’re a lot more popular now, they’re available everywhere from smartphone apps to smart speakers, and they’re getting increasingly capable. But they’re still not great at one of the things Tay was trying to do, which is show off a personality and generate chitchat.

Bilenko doesn’t expect this to change soon—at least, not in the next five years. The conversations humans have are “very difficult,” he said.

