
  • FPCH Admin
Posted

tay.jpg

 

 

She was supposed to come off as a normal teenage girl. But less than a day after her debut on Twitter, Microsoft's chatbot—an AI system called "Tay.ai"—unexpectedly turned into a Hitler-loving, feminist-bashing troll. So what went wrong? TechRepublic turns to the AI experts for insight into what happened and how we can learn from it.

 

Tay, the creation of Microsoft's Technology and Research and Bing teams, was an experiment aimed at learning through conversations. She was targeted at American 18- to 24-year-olds, the primary social media users according to Microsoft, and "designed to engage and entertain people where they connect with each other online through casual and playful conversation."

 

 

Within less than 24 hours of her arrival on Twitter, Tay had gained more than 50,000 followers and produced nearly 100,000 tweets.

 

 

The problem? She started mimicking her followers.

 

Soon, Tay began saying things like "Hitler was right i hate the jews," and "i fucking hate feminists."

 

But Tay's bad behavior, it's been noted, should come as no big surprise.

 

 

"This was to be expected," said Roman Yampolskiy, head of the CyberSecurity lab at the University of Louisville, who has published a paper on the subject of pathways to dangerous AI. "The system is designed to learn from its users, so it will become a reflection of their behavior," he said. "One needs to explicitly teach a system about what is not appropriate, like we do with children."

 

This has been observed before, he pointed out, with IBM Watson, which once exhibited its own inappropriate behavior in the form of swearing after learning the Urban Dictionary.

 

 

"Any AI system learning from bad examples could end up socially inappropriate," Yampolskiy said, "like a human raised by wolves."

 

Louis Rosenberg, the founder of Unanimous AI, said that "like all chat bots, Tay has no idea what it's saying...it has no idea if it's saying something offensive, or nonsensical, or profound."

 

"When Tay started training on patterns that were input by trolls online, it started using those patterns," said Rosenberg. "This is really no different than a parrot in a seedy bar picking up bad words and repeating them back without knowing what they really mean."

 

Sarah Austin, CEO and founder of Broad Listening, a company that's created an "Artificial Emotional Intelligence Engine" (AEI), thinks that Microsoft could have done a better job by using better tools: "If Microsoft had been using the Broad Listening AEI, they would have given the bot a personality that wasn't racist or addicted to sex!"

 

It's not the first time Microsoft has created a teen-girl AI. Xiaoice, which emerged in 2014, was an assistant-type bot used mainly on the Chinese social networks WeChat and Weibo.

 

Joanne Pransky, the self-dubbed "robot psychiatrist," joked with TechRepublic that "poor Tay needs a Robotic Psychiatrist! Or at least Microsoft does."

 

The failure of Tay, she believes, was inevitable, and will help produce insights that can improve the AI system.

 

After taking Tay offline, Microsoft announced it would be "making adjustments."

 

According to Microsoft, Tay is "as much a social and cultural experiment, as it is technical." But instead of shouldering the blame for Tay's unraveling, Microsoft targeted the users: "we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways."

 

Yampolskiy said that the problem encountered with Tay "will continue to happen."

 

"Microsoft will try it again—the fun is just beginning!"

 

Source: TechRepublic

~I know that you believe you understand what you think I said, but I'm not sure you realize that what you heard is not what I meant.~

~Robert McCloskey~
