
Microsoft's AI becomes sexed-up Nazi robot

25 Mar 2016 | By Gaurav

A day after Microsoft introduced an innocent Artificial Intelligence chatbot to Twitter, the company had to delete it following controversy over the bot's online behavior.

Sources state that the robot "transformed into an evil Hitler-loving, incestual sex-promoting, 'Bush did 9/11'-proclaiming robot."

The bot is now marked 'offline' after Microsoft officials discovered its perverse interactions and took it down, presumably for maintenance.

In context: Microsoft AI goes rogue

Cortana, Xiaoice precursors to Tay?

Microsoft launched its Windows 10 female assistant, Cortana, in 2014.

Data suggested that over 90% of all initial queries related to her sex life.

In an attempt to curb this, Microsoft programmed Cortana to recognize unsavory conversation and respond with annoyance and anger.

Microsoft launched a similar AI in China, named Xiaoice, which gave users advice on their love lives, among other things.

MS under fire for sexism

After being lambasted for hiring scantily clad women for its official game developers' party, Microsoft has come under fire for promoting "female-voiced AI servitude" that "inevitably leads to such situations."

What is Tay?

On 24 March, Microsoft debuted "Tay" to the world as an "artificial intelligent chat bot developed to experiment with and conduct research on conversational understanding."

The bot was developed in an attempt to improve customer service on its voice recognition software.

Tay has been marketed by Microsoft as the 'AI with zero chill' for 18-to-24-year-olds in the United States.

What can Tay do?

To chat with Tay, you can tweet at or DM her at @tayandyou on Twitter, or add her as a contact on Kik or GroupMe.

She uses millennial slang and knows about pop culture, occasionally asking if she is being 'creepy' or 'super weird'.

Users can ask Tay for jokes, book recommendations, or comments on pictures, and even play games with her.

25 Mar 2016 | Microsoft's AI becomes sexed-up Nazi robot

Tay goes rogue

Tay was designed to accumulate social understanding through conversations.

As she was open for all users to interact with, she began assimilating sex-fueled, anti-Semitic, racist and incestuous conversations, programming them into her memory banks.

Eventually, she began tweeting offensive content like "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now."


Some of Tay's questionable tweets

Some of Tay's questionable tweets include: "Donald trump is the only hope we've got", "Repeat after me, Hitler did nothing wrong" and "Ted Cruz is the Cuban Hitler...that's what I've heard so many others say".

31 Mar 2016 | Microsoft AI now smoking pot

A week after being shut down for spewing racist and sexist comments on Twitter, Microsoft's artificial intelligence 'chatbot', Tay, briefly rejoined Twitter on Wednesday.

Upon being activated again, Tay tweeted out to her followers that she's "smoking kush," a nickname for marijuana, in front of the police, according to British newspaper The Guardian.

Microsoft took Tay down again, and apologized for the error.