Microsoft's chatbot killed off after racist Obama tweet

By Amy R. Connolly

REDMOND, Wash., March 24 (UPI) -- Microsoft took its millennial-mimicking chatbot Tay offline Thursday after it spewed a string of racist and inflammatory tweets.

Tay, designed to emulate and engage the 18- to 24-year-old crowd, was supposed to connect with Twitter users through "casual and playful conversation." In launching Tay this week, Microsoft said, "The more you chat with Tay the smarter she gets, so the experience can be more personalized for you."

Twitter users had a different idea, though, goading the artificial intelligence into repeating offensive and racist language. Tay recorded what was tweeted at it and regurgitated that language in new tweets.

"bush did 9/11 and Hitler would have done a better job than the monkey we have now," Tay wrote in one tweet. "donald trump is the only hope we've got."

"ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism," the AI said in another tweet.

Microsoft has not commented.
