Microsoft’s snarky Twitter bot Tay.AI goes offline after just one day

One day after Microsoft launched Tay, a chatbot designed to engage and entertain people, the service has gone offline.

While Tay was originally designed to research human interaction and speech models, the service soon started sending out racist, homophobic, and nonsensical tweets. This was caused by “trolls” on Twitter who misused Tay’s “repeat after me” feature by tweeting racist and derogatory remarks at Microsoft’s newest bot.

Because Tay is an artificial intelligence bot that learns from its interactions, it inadvertently began repeating those remarks back to other users, leaving a trail of nasty utterances in its wake. Microsoft has since addressed the issue by pulling the offensive tweets and finally taking Tay offline.

Meanwhile, on Twitter, Tay appeared to send the world what could be considered a farewell message.

https://twitter.com/TayandYou/status/712856578567839745

Attempting to contact Tay on Kik yields a similar message: the bot responds that it is away for “updates at the engineers.”

[Screenshot: Tay’s Kik reply, “Tay is away for updates at the engineers.”]

At launch, Microsoft claimed that Tay uses relevant public data that has been “modeled, cleaned and filtered.” That filtering clearly failed to cover Tay’s own learned responses, and the episode is a vivid demonstration of what can happen when the general public interacts with artificial intelligence.
