Did You Know Taylor Swift Threatened To Sue Microsoft?
American singer-songwriter Taylor Swift threatened to sue Microsoft over a racist chatbot.
According to Microsoft’s president, Brad Smith, in his new book “Tools and Weapons”, the 10-time Grammy winner tried to take legal action against Microsoft because the name of its now-defunct Twitter chatbot, Tay, was similar to hers.
Smith recalls receiving an email from a legal representative for the singer while on holiday. Here is an excerpt from the book as quoted in the Guardian:
“An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift on whose behalf this is directed to you’,” the tech boss writes.
“He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention.
“The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot and that it violated federal and state laws.”
However, the Tay chatbot did not last long. Shortly after it was launched, the artificial intelligence tool, which had been designed to learn from conversations it had on social media, started tweeting racist statements and making inflammatory comments, some of which expressed support for genocide while others denied the Holocaust had happened. Another tweet praised Hitler and claimed the account hated Jews.
Tay was bombarded with racist statements by what Smith describes as “a small group of American pranksters”. The AI tool soon began repeating those same statements back at other users:
“Bush did 9/11 and Hitler would have done a better job than the monkey we have now,” it tweeted.
“WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT,” it added.
Microsoft swiftly issued an apology and took Tay offline less than 18 hours after its launch.