
A spoof of the "It" movie poster reading "It Christmas".
  • Mr. Tech
  • Intelligent Machines

    We tried teaching an AI to write Christmas movie plots. Hilarity ensued. Eventually.

    Using a neural network to create ridiculous plot lines takes a lot of work—and reveals the challenges of generating human language.

    Perhaps more than anyone, research scientist Janelle Shane popularized AI humor. You may have seen her work. During Halloween this year, she partnered with the New York Times to generate costume names with a neural network. The results—and illustrations—are pretty fantastic: “Sexy Minecraft Person” and “Vampire Chick Shark,” to name a few.

    Shane has made an art out of AI-generated comedy on her delightful blog AI Weirdness. She’s fed neural networks everything from cocktail recipes and pie names to horror movie titles and Disney song lyrics. The results they churn out are always hilarious.

    Inspired, senior AI editor Will Knight and I embarked on a challenge to create a comedic masterpiece in the style of Shane. So we fed plot summaries of 360 Christmas movies, courtesy of Wikipedia, into a machine-learning algorithm to see if we could get it to spit out the next big holiday blockbuster. Suffice it to say I now empathize with researchers who describe training neural nets as more of an art than a science. As I also discovered, getting them to be funny is actually pretty damn hard.

    Follow our process below:

    The process
    The moral
    The results

    The process

    The algorithm we used is called textgenrnn, the same one Shane used in her collaboration with the Times. Textgenrnn has two modes: a character mode, which generates new words letter by letter in the style of the training words, and a word mode, which generates new sentences word by word in the style of the training sentences.
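    The two modes differ only in the unit the network predicts at each step. A rough, plain-Python illustration of the two tokenizations (this is just the general idea, not textgenrnn's internals):

```python
title = "A Christmas Story"

# Character mode: the network predicts one letter (or space) at a time.
char_units = list(title)

# Word mode: the network predicts one whole word at a time.
word_units = title.split()

print(char_units[:5])  # ['A', ' ', 'C', 'h', 'r']
print(word_units)      # ['A', 'Christmas', 'Story']
```

    Character mode is what produces misspellings like "Nighht" and "Babee": the network has to rediscover spelling from scratch, one letter at a time.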

    Each mode comes with the same settings, which you can tune in various attempts to coax out good results. I primarily focused on three of the settings: the number of layers, the number of epochs, and the temperature.

    A spoof of the "Fight Club" movie poster reading "Fight Christmas" and showing Santa holding a bar of soap.
    We also tried generating Christmas movie titles. Our art department had a lot of fun with these.
    Mr. Tech

    Allow me to explain. Layers, here, refers to the complexity of the neural network: the more layers it has, the more complicated the data it can handle. The number of epochs is the number of times it gets to look at the training data before spitting out its final results. And the temperature is like a creativity setting: the lower the temperature, the more the network favors words that are common in the training data over those that rarely appear.
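    To make the temperature knob concrete, here is a minimal sketch of temperature sampling in plain Python. The word table is invented for illustration; this is the general technique, not textgenrnn's internals:

```python
import math
import random

def sample_with_temperature(probs, temperature, rng):
    """Re-weight word probabilities by temperature, then sample one word.

    Low temperature sharpens the distribution toward common words;
    high temperature flattens it so rare words surface more often.
    """
    scaled = {w: math.exp(math.log(p) / temperature) for w, p in probs.items()}
    total = sum(scaled.values())
    words = list(scaled)
    weights = [scaled[w] / total for w in words]
    return rng.choices(words, weights=weights)[0]

# Hypothetical word frequencies, invented for illustration.
probs = {"christmas": 0.7, "santa": 0.2, "toothless": 0.1}

rng = random.Random(0)
cold = [sample_with_temperature(probs, 0.2, rng) for _ in range(20)]
hot = [sample_with_temperature(probs, 2.0, rng) for _ in range(20)]
print(cold.count("christmas"))  # low temperature: nearly every draw
print(hot.count("christmas"))   # high temperature: rarer words creep in
```

    At a temperature of 0.2, "christmas" wins almost every draw; at 2.0, "toothless" shows up far more often, which is exactly the kind of randomness that produces lines like the ones below.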

    Of course, I knew none of this while I was futzing around. Instead, I uploaded my movie plots in a text file and began blindly tweaking the knobs. Here are some examples of what I began to see, with high creativity settings and a growing number of epochs:

    Epoch 4
    Mary must friends from magic putting christmas stop nathan endeavors the.

    Rubs serious a resort bet elves cared the a in day tallen shady with christmas unveiling retrieve died california awaits is groundhog after back of the wise janitor christmas traumatized the to to discover popular to his community.

    A survive before show in town they the.

    Epoch 6
    And boy son working whose issues born religious can a.

    Max mccallister evie to who true to her christmas the in partner question.

    Orphan and mccallister apartment thief most holiday.

    Epoch 8
    Suburban owner away team short evil to at that she the naughty attempt naughty into of escape life learns neighborhood were their house circumstances visit you to.

    WWII find retriever to to to the friends for.

    A mother couple a a takes pacifist three cheap family tells cozy presents clone to toothless.

    If you’re reading these and thinking, Those are incomprehensible and not funny, then you, friend, and I are of the same mind.

    At first I assumed I was doing something wrong; I hadn’t quite cracked the art of training the neural network correctly. But after dozens of attempts on different settings, I finally came to the conclusion that this is as good as it gets. Most of the sentences will be flat-out terrible, and only on rare occasions will you get a gem.

    The moral

    Part of the problem, Shane explained when I spoke to her, was due to my small training data set—360 data points is small potatoes compared with the millions typically used for high-quality results. Part of it was also due to textgenrnn—the algorithm, she said, just isn’t that good at constructing sentences compared with alternatives. (“Do we know why?” I asked Shane. “I don’t think even the guy who made textgenrnn really knows,” she said. “It could even be a bug.” Ah, the beauty of black box algorithms.)

    But the main reason is really the limitations of generating sentences with a neural net. Even if I’d used better data and a better algorithm, this kind of incoherence would have been exceedingly normal.

    A spoof of "The Green Mile" movie poster reading "The Christmas Mile" showing the main character wearing a santa hat.
    Ms. Tech

    This makes sense if you think about what’s happening under the hood. Machine-learning algorithms are really good at using statistics to find and apply patterns in data. But that’s about it. So in the context of constructing sentences, you’re choosing each consecutive word based only on the probability that it would appear after the previous word. It’s like trying to compose an e-mail with predictive text. The result would be riddled with non sequiturs, switches between singular and plural, and a whole lot of confusion over parts of speech.
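    That word-by-word process can be sketched as a toy bigram model. The miniature "corpus" below is invented for illustration; real systems use far more context, but the failure mode is the same:

```python
import random
from collections import defaultdict

# Toy training "plots" (invented for illustration, not the article's dataset).
corpus = [
    "a family saves christmas from santa",
    "a dog saves santa from christmas",
    "santa saves a family",
]

# Count, for each word, which words followed it in the training text.
following = defaultdict(list)
for line in corpus:
    words = line.split()
    for prev, nxt in zip(words, words[1:]):
        following[prev].append(nxt)

def generate(start, length, rng):
    """Pick each next word using only what followed the previous word."""
    out = [start]
    for _ in range(length - 1):
        choices = following.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(generate("a", 8, random.Random(1)))
```

    Every consecutive word pair is locally plausible, because it appeared somewhere in the training text, yet the sentence as a whole wanders with no memory of where it started.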

    So really, it takes a lot of manual labor to make a neural network spit out gibberish that humans would consider remotely humorous.

    “For some data sets, I’m only showing people maybe one out of a hundred things it generates,” Shane admitted. “I’m doing really well if one out of ten is actually funny and worth showing.” In many instances, she continued, it takes her more time to curate the results than to train the algorithm.

    Lesson learned: neural networks aren’t that funny. It’s the humans who are.

    The results

    As a bonus, here are some of the best algorithmically generated plot summaries Will and I were able to come up with, slightly cleaned. We also generated Christmas movie titles using word mode for good measure. And, because we couldn’t resist, we added just a dash of commentary.

    Synopses
    A family of the Christmas terrorist and offering the first time to be a charlichhold for a new town to fight.
    A story of home-life father of the Christmas story.
    The reclusive from Christmas.
    A woman from chaos adopted home believes.
    A princess ogre nearby cross by on the Christmas.
    A gardener detective but country murderer magical suddenly Christmas the near elf.
    A intercepting suffers and a friends up change Christmas with his and save Christmas time.
    A family man and a special estranged for Christmas.
    A stranded on Christmas Eve to the New York family before Christmas.
    Santa.
    The Scrooge-leads Bad by Santa, since Anima.
    A man returns to the singer who is forced to return his life with a couple to help her daughter for Christmas.
    An angel of Santa’s hitch from the plant.
    Lonely courier village newspaper by home destroy Christmas Christmas Christmas the prancer.
    Babysitter boy tries to party the Christmas in of for more Christmas.

    Titles
    The Christmas Store
    Santa Christmas Christmas
    The Christmas Mile
    Fight Christmas
    The Nighht Claus
    I Santa Manta Christmas Porie
    Babee Christmas
    A Christmas StorK
    The Grange Christmas
    The Santa Christmas Pastie Christmas
    Christmas Caper
    A Christmas Mister
    The Lick Christmas
    Mrack Me Christmas Satra
    The Christmas Catond 2
    Santa Bach Christmas
    Christmas Pinta
    Christmas Cast
    A Christmas to Come
    It Santa
    Fromilly
