Microsoft deletes racist, genocidal tweets from AI chatbot Tay - Business Insider
Tay (bot) - Wikipedia
Microsoft chatbot is taught to swear on Twitter - BBC News
Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft
Microsoft's racist teen bot briefly comes back to life, tweets about kush
Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated] | TechCrunch
Microsoft scrambles to limit PR damage over abusive AI bot Tay | Artificial intelligence (AI) | The Guardian
Microsoft's disastrous Tay experiment shows the hidden dangers of AI — Quartz
Microsoft AI bot Tay returns to Twitter, goes on spam tirade, then back to sleep | TechCrunch
Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours
TayTweets: How Far We've Come Since Tay the Twitter bot
Microsoft Research and Bing release Tay.ai, a Twitter chat bot aimed at 18-24 year-olds » OnMSFT.com
Microsoft's artificial Twitter bot stunt backfires as trolls teach it racist statements | The Drum
The Internet turns Tay, Microsoft's millennial AI chatbot, into a racist bigot | PCWorld
Microsoft's new AI chatbot Tay removed from Twitter due to racist tweets.
Tay the 'teenage' AI is shut down after Microsoft Twitter bot starts posting genocidal racist comments that defended HITLER one day after launching | Daily Mail Online
Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac - The Washington Post
Requiem for Tay: Microsoft's AI Bot Gone Bad – The New Stack
taytweets hashtag on Twitter