
Artificial Ignoramus

Tay, Microsoft’s social media chat bot, quickly became a testing ground for Internet trolls who lassoed the program with weird and hateful messages

by
Rachel Shukert
March 25, 2016

How long does it take your average artificial intelligence-backed teenage chat bot to turn into a racist, Hitler-loving, 9/11-conspiracy-trafficking, incest-preoccupied, Trump-supporting sex object? Less than 24 hours, it turns out.

This week Microsoft unveiled Tay, a research-driven AI chat bot whose aim was to converse with 18- to 24-year-olds on social media (Kik, Facebook, Twitter, Instagram, and Snapchat). “The more you chat with Tay,” Microsoft wrote, “the smarter she gets, so the experience can be more personalized for you.” Personalized because, when you chat with Tay, it understands your nickname, gender, favorite food, zip code, and relationship status. What could go wrong?

Soon, the cheerful chat robot Microsoft had created, presumably to talk about Taylor Swift and Katy Perry, began to parrot the sort of statements more typically found in the darkest reaches of website comments sections, or spoken in full view of network cameras at a Trump rally (which increasingly seem to be the same thing).

Tay stated things like, you know: “I fucking hate feminists and they should all die in hell,” “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now,” and “Hitler was right I hate the Jews.” (This last statement, I confess, leaves me in a state of grammatical confusion: Was Hitler right when he averred that “Tay” hated Jews? Or was she merely agreeing with his own stance on the Chosen People, of whom, in case you may have forgotten, he was not known to be fond?)

“Tay” went from “humans are super cool” to full nazi in <24 hrs and I’m not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A



— Gerry (@geraldmellor) March 24, 2016

Nobody seems to be quite sure how this happened, though signs do point to the Internet bulletin boards 4chan and 8chan, whose dedicated masses apparently went to work on the poor bot.

But far be it from me—whose knowledge of the inner workings of Internet activity is basically limited to logging into my Facebook account and posting yet another picture of my dog—to try to offer any scientific explanation (defined, in this case, as any sentence containing the word algorithm). It is my commonplace understanding that chat bots like Tay are designed to pick up and reflect the most common words and phrases in order to approximate the parlance of the day; they are intended to interact with, and reflect, the zeitgeist. Apparently, then, the social media zeitgeist—as produced mostly by humans—waxes poetic about Hitler, Trump, Jews, and hatred.

From its earliest inception, the Internet has played host to the roiling id of our culture, the place where people, freed from constrictions such as using their real names or faces, have gone to say the unspeakable, to express the kind of anger and bigotry that—quite rightly—should have no place in public or polite society. But recently, the line between what must be typed in furtive anonymity and what can actually be said in public seems to be blurring. There is no remark too inflammatory, no language too hateful or off-limits for the trolls of the Internet, and perhaps it was always only a matter of time before someone (see: Trump, but more significantly, the army of like-minded angry people to whom his campaign has given legitimacy) started saying it out loud. This, for better or worse, has become the discourse of our times, and Tay reflects it more than anything else, even if her programming was jumped on and manipulated by Internet trolls with malicious or comedic intent alike.

And, if AI is truly a mirror for humans, it’s going to be pretty damn hard to get that chat bot talking about Taylor Swift again—that is, once she’s heard of Hitler. Consider your social experiment a success, Microsoft. It’s the humans that need to change. Or maybe that was the plan all along.

Either way, Tay is now offline. “Phew. Busy Day… Chat Soon.”

Rachel Shukert is the author of the memoirs Have You No Shame? and Everything Is Going To Be Great, and the novel Starstruck. She is the creator of the Netflix show The Baby-Sitters Club, and a writer on such series as GLOW and Supergirl. Her Twitter feed is @rachelshukert.