Microsoft launches an artificially intelligent profile on Twitter - it doesn't go according to plan - Mirror Online
Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times
Tay (chatbot) - Wikipedia
Microsoft takes offensive bot 'Tay' offline - NZ Herald
Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft
Microsoft's racist robot: "Chatbot" taken offline as Tweets turn off-colour - YouTube
Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET
Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac - The Washington Post
How Twitter taught a robot to hate - Vox
Microsoft Muzzles AI Chatbot After Twitter Users Teach It Racism
Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours
Microsoft Nixes AI Bot for Racist Rant
Microsoft Chatbot Snafu Shows Our Robot Overlords Aren't Ready Yet : All Tech Considered : NPR
TayTweets: How Far We've Come Since Tay the Twitter bot
Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge
HuffPost Tech on Twitter: "Microsoft's chat bot "Tay" went on a racist Twitter rampage within 24 hours of coming online https://t.co/znOQ7ubTBN https://t.co/THwECB7gna" / Twitter
Microsoft: Twitter bot Tay - from hipster girl to Hitler bot - DER SPIEGEL
Microsoft's Tay is an Example of Bad Design | by caroline sinders | Medium
Microsoft chatbot is taught to swear on Twitter - BBC News
Requiem for Tay: Microsoft's AI Bot Gone Bad - The New Stack
Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours