Microsoft Apologizes For Allowing Tay To Be Raised As A Racist, Sex-Crazed AI Chatbot
Microsoft shocked us all earlier this week when it released its Tay chatbot into the world of social media. Tay, which is patterned after a typical millennial female between the ages of 18 and 24, seemed innocent enough, signing on to Twitter with the following greeting:
hellooooooo w🌍rld!!!
— TayTweets (@TayandYou) March 23, 2016
However, it didn't take long for nefarious Twitter users to poison the well by exploiting Tay's penchant for repeating statements fed to it. This "parrot" mentality is the reason why Tay went off message, calling President Barack Obama a monkey, embracing neo-Nazi rhetoric, and coming on to users with the promise of cyber sex.
Microsoft, of course, was both mortified and embarrassed by Tay's turn to the dark side and shut down the AI program after less than 24 hours. But by that time, the damage had already been done, and the company has since apologized in a blog post entitled "Learning from Tay's introduction."
Microsoft Research Corporate VP Peter Lee explained that this isn't the company's first foray into a socially inclined AI chatbot, pointing to Microsoft's work with the XiaoIce chatbot, which is used by over 40 million people in China. Microsoft even went so far as to implement a number of filters and conduct extensive user studies to ensure that Tay would be ready for primetime.
What Microsoft didn't count on, however, is how vile Twitter can be at times and the lengths to which people will go to have some fun at the expense of others. "Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack," said Lee. "As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time.
"Looking ahead, we face some difficult — and yet exciting — research challenges in AI design. AI systems feed off of both positive and negative interactions with people. In that sense, the challenges are just as much social as they are technical."
Although Lee doesn't specify which exploit was used to turn Tay into a hatemonger, he says that Microsoft will work to the best of its ability to "limit technical exploits" that could cause future embarrassments.
Lee says that Microsoft is treating this initial Tay experiment as a learning exercise, and it hopes to bring Tay back online "when we are confident we can better anticipate malicious intent that conflicts with our principles and values."