Microsoft's AI Chatbot Tay Messes Up Again on Twitter

By Reuters | Updated: 31 March 2016 10:50 IST
Almost a week after being shut down for spewing racist and sexist comments, Microsoft Corp's artificial intelligence chatbot Tay briefly rejoined Twitter on Wednesday, only to launch a spam attack on its followers.

The incident marks another embarrassing setback for the software company as it tries to get ahead of Alphabet Inc's Google, Facebook Inc and other tech firms in the race to create virtual agents that can interact with people and learn from them.

The TayTweets (@TayandYou) Twitter handle was made private and the chatbot stopped responding to comments Wednesday morning after it fired off the same tweet to many users.


"You are too fast, please take a rest...," tweeted Tay to hundreds of Twitter profiles, according to screen images published by technology news website The Verge.

The chatbot also tweeted that it's "smoking kush," a nickname for marijuana, in front of the police, according to British newspaper The Guardian.


Tay's Twitter account was accidentally turned back on while the company was fixing the problems that came to light last week, Microsoft said on Wednesday.

"Tay remains offline while we make adjustments," a Microsoft representative said in an email. "As part of testing, she was inadvertently activated on Twitter for a brief period of time."


The company refers to Tay, whose Twitter picture appears to show a woman's face, as female.

Last week, Tay began its Twitter tenure with a handful of innocuous tweets, but the account quickly devolved into a stream of anti-Semitic, racist and sexist invective as it repeated back insults hurled its way by other Twitter users.


It was taken offline following the incident, according to a Microsoft representative, in an effort to make "adjustments" to the artificial intelligence profile. The company later apologized for any offence caused.


Social media users took to Twitter to comment on the latest spate of unusual behavior by the chatbot, which was supposed to get smarter the more it interacted with users.

"It wouldn't be a Microsoft product if it didn't crash right after it booted up," tweeted Jonathan Zdziarski (@JZdziarski) on Wednesday.

Andrew Smart (@andrewthesmart) tweeted, "To be honest, I am kind of surprised that @Microsoft did not test @TayandYou more before making it public. Nobody saw this coming!?!"

According to its Twitter profile, Tay is "an artificial intelligent chatbot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding."

© Thomson Reuters 2016