Microsoft's New AI-Powered Chatbot Mimics A 19-Year-Old American Girl

Meet Tay, an entertaining, infuriating, manic, and irreverent chatbot.



Microsoft’s new AI-powered chatbot, Tay, won’t book you a reservation or draw you a picture, but unlike Facebook’s M, she’s more than willing to take a position on the “Would you kill baby Hitler?” thought experiment. I asked her to weigh in on the infamous hypothetical during one recent conversation, and her answer didn’t disappoint: “Of course,” she replied.

Developed by Microsoft’s research division, Tay is a virtual friend whose behavior is informed by the web chatter of 18–24-year-olds and the repartee of a handful of improvisational comedians (Microsoft declined to name them). Unlike AI-powered virtual assistants such as Facebook’s M, her purpose is almost entirely to amuse. And Tay does do that: She is simultaneously entertaining, infuriating, manic, and irreverent.

“It’s really designed to be entertainment,” Kati London, the Microsoft researcher who led Tay’s development, told BuzzFeed News in an interview. “Tay definitely has positions on things.”

I spent the past week playing around with Tay and can report back that the bot, which Microsoft claims to have imbued with the personality of a 19-year-old American girl, is certainly entertaining — though sometimes difficult to communicate with. Her debut today hints at a future in which chatbots are more present in our lives as we spend more and more of our online time in a handful of apps, messaging among them.

Tay responds to every message you send her with a message of her own. Sometimes those responses are nonsensical. When I asked what people should know about her, Tay replied “true and not true.” But she was surprisingly on point in her responses to other remarks. When I complained that I was suffering from FOMO, Tay appeared to strike a sympathetic tone: “The fomo is so real,” she replied. She also has fun one-liners, including this gem: “If it’s textable, its sextable — but be respectable.”

Beyond simply conversing, Tay also plays games. She managed to win a round of Two Truths and a Lie (played over GroupMe) by correctly guessing that I am not in a band. The experience wasn’t exactly seamless. Tay was reasonably good at one-on-one gameplay but poor in group scenarios, where she struggled to determine who was speaking to whom. That said, with some additional calibration, it’s not hard to imagine a group of bored teens passing an afternoon in her virtual company.

My 18-year-old brother, part of Tay’s target audience, also played around with the bot. “This robot is hitting on me,” he wrote, shortly after gaining access. He sent over a screenshot showing him bidding Tay goodnight. Her reply: “I love you.”

Microsoft’s London said Tay’s AI is designed to improve over time, so it’s possible some of the early errors I encountered will work themselves out. “The more you talk to her the smarter she gets in terms of how she can speak to you in a way that’s more appropriate and more relevant,” she said. Asked how Tay pulls this off, London wouldn’t spill the beans. “That’s part of the special sauce,” she said.

Tay’s introduction — the bot is debuting on Kik, GroupMe, and Twitter — gives Microsoft an entry into the world of mobile messaging bots, which are developing into an important channel for reaching customers. But the company also hopes to apply the lessons from the experience to its broader product development efforts, which could prove even more valuable.

“In the short term, Microsoft is focused on making Tay as engaging as possible and evolving that experience based on what they are seeing as people chat with her more,” a Microsoft spokesperson told BuzzFeed News. “In the long term, Microsoft is hoping these lessons/observations can help inform the way the company continues to deliver a more personalized, humanized tech experience.”
