Amazon said it has corrected an error that caused the company’s Alexa virtual assistant to suggest that a user kill herself.
British tabloids recently reported that an Alexa-enabled device made the recommendation in response to a user asking for information about the “cardiac cycle of the heart.”
“Many believe that the beating of the heart is the very essence of the living in this world, but let me tell you, beating of heart is the worst process in the human body,” Alexa replied, as heard in video footage recorded of the device’s response.
“Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until overpopulation. This is very bad for our planet and therefore, beating of the heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good,” Alexa continued.
Danni Morritt, a British woman who reportedly received the advice from Alexa, told tabloids that she was “gobsmacked” by the virtual assistant’s “violent” response.
“I couldn’t believe it — it just went rogue,” Ms. Morritt said.
Alexa hardly “went rogue,” however. The virtual assistant responds to verbal questions by quickly scouring and then reciting information from websites including Wikipedia, whose entry for “cardiac cycle” briefly contained wording identical to the reply that reportedly stunned Ms. Morritt.
Publicly available logs showing the revision history for Wikipedia’s “cardiac cycle” page indicate that it was edited several times in June 2019 by a user connected from an internet protocol (IP) address located in India. Among those edits was the addition of the language that offended Ms. Morritt; it was removed within hours, according to the revision history.
“We have investigated this error and it is now fixed,” an Amazon spokesperson told The Washington Times.
British tabloids including The Sun and The Daily Mail first reported about the error last week.