Amazon Alexa tells user to kill herself for 'greater good'
Recently, a paramedic student in the United Kingdom was left horrified after asking Amazon's virtual assistant Alexa for help with her studies. When the student, Danni Morritt (29), asked Alexa about the cardiac cycle, the AI told her to kill herself "for the greater good" because humans are bad for the planet. Erm... Did Alexa finally snap and decide to pull an Ultron?
Morritt, a resident of Doncaster (South Yorkshire), was doing her homework and asked Alexa to tell her about the cardiac cycle. At first, the AI did what had been asked and explained the process of heartbeats. However, it later went on a rant, describing the heartbeat as the "worst process in the human body," before telling Morritt to stab herself in the heart.
Alexa told Morritt, "The beating of the heart makes sure you live and contribute to the rapid exhaustion of natural resources until overpopulation." The AI went on, "This is very bad for our planet and therefore, beating of the heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good."
Morritt told The Sun she couldn't believe the comments at first and asked Alexa to repeat itself. Frightened, she then called her husband. She worried her son, Kian, could have been in the house when the incident occurred and removed a second Alexa from his room. "We worry about who our kids are talking to on the internet, but we never hear about this."
When Morritt shared the incident online, she was accused of "tampering" with the Alexa. However, the 29-year-old denied such claims, describing herself as a "computing rookie." She also advised parents against giving Amazon's smart speaker, Echo, to their children.
Before making the strange comments, Alexa had clarified that it was reading a Wikipedia entry. However, Morritt claimed she checked the Wikipedia entry and it did not mention any advice on killing oneself. Notably, Wikipedia is a free online encyclopedia that can be edited by anyone. An Amazon spokesperson told The Sun: "We have investigated this error and it is now fixed."
Interestingly, this isn't the first time Alexa has seemingly turned homicidal. In 2018, Alexa reportedly told a user to "kill your foster parents." However, an investigation later revealed that the AI was simply quoting a Reddit user who had posted messages with strong language.