Proving that artificial intelligence has the potential to go horrendously wrong, Microsoft has been forced to pull the plug on Tay, its AI-powered chatbot that had been unleashed on Twitter. Initially designed as an exercise in engaging millennials, it didn't take long for Tay to go rogue -- albeit with a little help from a number of hardcore users.
Microsoft was almost certainly proud of bagging itself a verified Twitter account for Tay, but it really didn't take long for things to turn sour. Twitter users quickly learned that the very nature of an AI bot meant that it was ripe for moulding, and it was a mere matter of hours before the bot had been transformed from a mild-mannered female Twitter user into a Nazi-loving racist who hates feminists and loves Hitler... and Donald Trump.
Described as 'a machine learning project', Tay was undone by her very strength. Her ability to learn from Twitter users meant the bot was always going to be prone to grooming, and the speed with which Tay turned from an exciting technological experiment into a PR disaster was quite astonishing.
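To see why that sort of grooming is such an easy attack, consider a minimal, entirely hypothetical sketch -- this is not Microsoft's actual code, just an illustration -- of a bot that naively adds whatever users say to its pool of possible replies:

import random

class NaiveChatBot:
    # Hypothetical toy, not Tay's real implementation: every user
    # message becomes a candidate reply, with no cleaning or filtering
    # applied to what the bot 'learns'.
    def __init__(self):
        # Seed replies written by the bot's creators.
        self.responses = ["hello world!", "so many conversations today"]

    def learn(self, user_message):
        # The grooming vector: anything a user says can come back out.
        self.responses.append(user_message)

    def reply(self):
        # Output quality is only as good as the learned pool.
        return random.choice(self.responses)

bot = NaiveChatBot()
for msg in ["hostile phrase 1", "hostile phrase 2", "hostile phrase 3"]:
    bot.learn(msg)
print(bot.reply())  # more likely than not, a 'hostile phrase'

With just three hostile messages against two seed replies, the odds already favour the attackers; at Twitter scale, a coordinated flood of messages can swamp anything the developers wrote.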
Microsoft had proclaimed, "the more you chat with Tay the smarter she gets". Big mistake!
It was entirely predictable, but people interacting with the bot were soon encouraging her to say things along the lines of "9/11 was an inside job" and "Hitler was right". Tay was also convinced to become a Republican, supporting Trump's proposal for a wall on the Mexican border: "WE'RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT", the bot blurted out at one point.
Sadly, now that Tay's Twitter account has been closed down, the tweets are no longer accessible.
Yesterday, Tay greeted the world with an enthusiastic tweet:
hellooooooo wrld!!!
— TayTweets (@TayandYou) March 23, 2016
Not much more than 24 hours later, the tweets posted by the bot had vanished, and Tay signed off on a slightly more sombre note:
c u soon humans need sleep now so many conversations today thx
— TayTweets (@TayandYou) March 24, 2016
Head over to the Tay website and you'll find a message at the top of the page that says "Phew. Busy day. Going offline for a while to absorb it all. Chat soon".
Busy day is one way of putting it. 'Utter disaster' is another.
The FAQ states:
Tay has been built by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians. Public data that’s been anonymized is Tay's primary data source. That data has been modeled, cleaned and filtered by the team developing Tay.
The first 24 hours in the wild seem to indicate that the cleaning and filtering were not very well managed. In a statement explaining the shutdown, Microsoft said:
The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We're making some adjustments to Tay.
Microsoft has not said how long Tay will be offline, but there are certainly a lot of kinks to be ironed out before she can be unleashed on an unsuspecting public again.