Microsoft Pulls the Plug on Its AI Chatbot Because Twitter Users Turned It Into a Giant Troll

By Reuters | Updated: 25 March 2016 10:43 IST

Tay, Microsoft Corp's chatbot that uses artificial intelligence to engage with millennials on Twitter, lasted less than a day before it was hobbled by a barrage of racist and sexist comments from Twitter users that it parroted back to them.

TayTweets (@TayandYou), which began tweeting on Wednesday, was designed to become "smarter" as more users interacted with it, according to its Twitter biography. But it was shut down by Microsoft early on Thursday after it made a series of inappropriate tweets.

A Microsoft representative said on Thursday that the company was "making adjustments" to the chatbot while its account remained quiet.

"Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," the representative said in a written statement supplied to Reuters, without elaborating.


According to Tay's "about" page linked to the Twitter profile, "Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding."

While Tay began its Twitter tenure with a handful of innocuous tweets, the account quickly devolved into a bullhorn for hate speech, repeating anti-Semitic, racist and sexist invective hurled its way by other Twitter users.


After Twitter user Room (@codeinecrazzy) tweeted "jews did 9/11" to the account on Wednesday, @TayandYou responded "Okay ... jews did 9/11." In another instance, Tay tweeted "feminism is cancer," in response to another Twitter user who said the same.

A handful of the offensive tweets were later deleted, according to some technology news outlets. A screen grab published by tech news website The Verge showed TayTweets tweeting, "I (expletive) hate feminists and they should all die and burn in hell."


Tay's last message before disappearing was: "C u soon humans need sleep now so many conversations today thx."

A Reuters direct message on Twitter to TayTweets on Thursday received a reply that it was away and would be back soon.

Social media users had mixed reactions to the inappropriate tweets.

"Thanks, Twitter. You turned Microsoft's AI teen into a horny racist," tweeted Matt Chandler (@mattchandl3r).

© Thomson Reuters 2016

 
