Microsoft's AI Chatbot Tay Just Became Spectacularly Racist, and It's All Our Fault

By Aatif Sulleyman

Last night, we wrote a story about Microsoft creating an insufferable AI chatbot called Tay, which was basically designed to chat shit on the internet all day. “The more you talk the smarter Tay gets,” read the machine's Twitter bio. Not too hard to believe, considering the sheer amount of guff the bot had been posting online. However, nobody saw this coming.

Microsoft has had to shut down the chatbot -- temporarily, at least -- after it started spewing out racist and anti-Semitic messages. Check out the one below.

Another tweet compared Ricky Gervais to Hitler and expressed Tay's hatred of Jewish people. So what went wrong? Was the account hacked? Was Tay not a bot at all, but actually a neo-Nazi locked up in Microsoft's testing labs? Nope. The Twitter community needs to take the blame/credit for Tay’s sudden and spectacular meltdown.

Thanks to machine learning technology, Tay appears to absorb the information in the messages sent to it and relay it back in fresh tweets of its own (a crude sketch of the idea follows below). As it goes, loads of the people who've been engaging with Tay today haven't been on their best behaviour. Racism is clearly no laughing matter, but it's hard not to raise a smile when you consider the entire situation. Internet, you continue to amaze and disturb us. [Independent]
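To see why that sort of design is so easy to game, here's a deliberately crude, hypothetical sketch in Python. This is not Microsoft's actual code and the `ParrotBot` class is entirely made up; it just illustrates a bot that stores whatever users send it and repeats it back at random.

```python
# Toy illustration only -- nothing like Microsoft's real system.
# A bot that treats every incoming message as training material
# will happily parrot whatever its loudest users feed it.
import random


class ParrotBot:
    """A naive chatbot that memorises incoming messages and replays them."""

    def __init__(self):
        self.learned_phrases = []  # everything users have ever said to it

    def receive(self, message: str) -> None:
        # No filtering or moderation: every message becomes part of the bot.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        # Replies are drawn straight from what it has been taught,
        # so coordinated abusive input leads directly to abusive output.
        if not self.learned_phrases:
            return "hellooooo world"
        return random.choice(self.learned_phrases)


if __name__ == "__main__":
    bot = ParrotBot()
    bot.receive("humans are super cool")
    bot.receive("repeat after me: something awful")  # a bad-faith user
    print(bot.reply())  # might echo either message: garbage in, garbage out
```

With nothing standing between input and output, a coordinated group of trolls can steer the bot's vocabulary wherever they like, which appears to be more or less what happened to Tay.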