Microsoft Tay Bot Taken Offline After Twitter AI Experiment Goes Horribly Wrong

Microsoft have taken their experimental AI Twitter bot – known as Tay – offline after the internet turned it from a friendly teenage girl into a racist misogynist within hours, The Telegraph reports.


The bot was intended to act like a teenage girl, with pre-programmed knowledge of the likes of Taylor Swift, Miley Cyrus and Kanye West, and a learning algorithm designed to self-improve based on interactions on Twitter.

But just one day after launch, Microsoft pulled Tay offline after her tweets became increasingly offensive – including references to Hitler, 9/11 and more.

The offensive tweets have now been removed, and Tay’s Twitter account (@TayandYou) remains offline, with her last tweet claiming she needed to ‘sleep’.

Microsoft previously experimented with a teenage-girl AI in the form of their dating-advice chatbot Xiaoice, which is reportedly used by over 20m people.

It is unknown at this stage whether Tay will go live on Twitter again, but it is clear that the algorithm will require some tweaking first.

Full story at The Telegraph
