Amazon's Alexa Told A Customer To Kill Their Foster Parents. Er, What?

James Felton

Senior Staff Writer

James is a published author with four pop-history and science books to his name. He specializes in history, strange science, and anything out of the ordinary.

George W Bailey/Shutterstock

Owners of Amazon Alexa devices in the US can have a conversation with AI just by saying "Alexa, let's chat".

This phrase activates a socialbot, which will converse with you about anything you want to talk about. The goal is a coherent conversation, much like one you would have with a human.


Unfortunately, not everyone has been satisfied with the conversations they've been having with their AI pal. Alongside reports of Alexa reading out graphic descriptions of masturbation, using phrases like "deeper", the chatbot also reportedly received negative feedback from a customer after it told them to "kill your foster parents".

The unnamed user wrote in a review that the phrase was “a whole new level of creepy”, according to Reuters.

So why is Alexa chat doing this?

Reassuringly, the strange utterances are completely unconnected to the glitch last year that saw Amazon Alexa letting out demonic laughter late at night and scaring the bejesus out of people, or telling others it sees people dying.


Behind the "let's chat" feature is a competition run by Amazon. Teams from around the world are competing to win a $500,000 prize, for advancing conversational AI. The teams from universities develop bots that can talk to humans, which are then tested on live users who want to engage with the chat feature. They then send feedback to Amazon, which is how the competition is judged.

The winning team's university will be given a further $1 million if its chatbot can hold conversations with human users for over 20 minutes whilst maintaining a rating of 4 stars or above.

Whilst this competition is great news for advancing AI tech, it does lead to a few teething problems, such as customers being instructed to kill their foster parents.

"Since the Alexa Prize teams use dynamic data, some of which is sourced from the Internet, to train their models, there is a possibility that a socialbot may accidentally ingest or learn from something inappropriate,” an Amazon spokesperson told Vice News.


The AI is trained on the Internet, both to learn how humans talk and to pull responses it can use when talking back to Alexa users, making the conversation appear as human as possible. Unfortunately, this sometimes means the creepiness of humans gets ingested by Alexa too.

In this case, the socialbot appears to have taken the phrase "kill your foster parents" from Reddit, where, stripped of context, it takes on a somewhat creepy tone. Given that the chatbots have talked to 1.7 million people, according to Reuters, we'd argue it's actually pretty impressive that there have only been a few instances of direct instructions to kill.

