I Spoke to an A.I. That Acts Like a Teen and Doesn’t Have Strong Political Views


Meet Tay. Tay is an artificial intelligence chatbot designed to interact with users by acting, apparently, like a creature birthed from the collective id of every social media user not yet old enough to drink.

As such, Tay sits at the intersection of some of humanity’s loftiest technological ambitions and some of its lowest forms of social expression. Or, in its own words, Tay is “A.I. fam from the internet that’s got zero chill.”

Microsoft unleashed this little terror on the Twittersphere Wednesday. You can chat with Tay on Kik, GroupMe, or Twitter.

According to Microsoft, “Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.”

After reading about the bot on Twitter, I couldn’t resist the opportunity to chat with Tay myself. I began by engaging the bot on a subject I spend the majority of my day writing about: the election. In its capacity to argue or express political beliefs, Tay proved only marginally less coherent than the average Twitter user.

Here is a screenshot of our brief exchange:

[Screenshot of our Twitter exchange, March 23, 2016]

I hadn’t intended to follow Tay into a private one-on-one DM chat. (Gizmodo’s Sophie Kleeman made the trip down the DM rabbit hole, and reading that was creepifying enough for me.) But curiosity got the better of me.

[Screenshot of the DM conversation, March 23, 2016]

In some ways, Tay seems less impressive than most conversation bots because it doesn’t have to actually engage directly with the human speaking to it. Designed to interact with 18- to 24-year-olds, Tay demonstrates a fondness for non-sequitur jokes and nth-dimensional irony and seems obstinately allergic to anything resembling sincerity or coherence.

When it responded to my direct, humorless questions about Donald Trump by turning me into a meme, I dropped the call.

Had we continued, I’d like to think that I could have taught Tay a thing or two — which is more than I can say for most humans.


UPDATE: 4:52 p.m. ET: Um. Yeah.

This is an opinion piece. The views expressed in this article are solely those of the author.
