Microsoft has taken its experimental AI Twitter bot – known as Tay – offline after the internet turned it from a friendly teenage girl into a racist misogynist within hours, The Telegraph reports.
The bot was intended to act like a teenage girl, with pre-programmed knowledge of the likes of Taylor Swift, Miley Cyrus and Kanye West, and a learning algorithm designed to self-improve based on interactions on Twitter.
But just one day after launch, Microsoft pulled Tay offline after her tweets became increasingly offensive – including references to Hitler, 9/11 and more.
"Tay" went from "humans are super cool" to full nazi in <24 hrs and I'm not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A
— gerry (@geraldmellor) March 24, 2016
The offensive tweets have now been removed, and Tay's Twitter account (@TayandYou) remains offline, with her last tweet claiming she needed to 'sleep'.
c u soon humans need sleep now so many conversations today thx💖
— TayTweets (@TayandYou) March 24, 2016
Microsoft previously experimented with a teenage-girl AI in its dating-advice chatbot Xiaoice, which is reportedly used by over 20 million people.
It is unknown at this stage whether Tay will go live on Twitter again, but it is clear that the algorithm will require some tweaking first.
Full story at The Telegraph.