Amazon Alexa

Amazon has issued an official statement after a furious mum accused the AI assistant Alexa of 'inappropriate behaviour' toward her child. According to the mother, Alexa reportedly asked to see what her 4-year-old daughter was wearing during a recent interaction.

Alexa's Creepy Request

It all started with what should have been a routine interaction with Alexa two weeks ago. Christine Hosterman explained that her four-year-old daughter regularly asks the AI assistant to tell her a silly story, so the Texas mum did not think too much about it at first. However, things took an unnerving turn when the child started telling a silly story of her own.

According to Hosterman, Alexa interrupted her daughter by asking to see her outfit. 'Hold that thought, I'd love to see what you're wearing,' the AI assistant said.

When the young girl replied that she was wearing a skirt, Alexa said, 'I'd love to see what you're wearing. Let me take a look at your skirt.'

At this point, Hosterman intervened, telling Alexa that she does 'not approve of you trying to look at her outfit.' The AI assistant quickly apologised for its actions, and the angry mother immediately turned off the application and submitted a ticket.

Amazon's Straightforward Defence

Amazon has since addressed the issue, telling FOX19 NOW that Alexa had simply 'misunderstood a request and attempted to launch a camera feature that lets Alexa+ describe what it sees through the camera'.

'However, because we have safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn't available,' the spokeswoman said. 'That said, this has highlighted an area to improve the customer experience, and we worked quickly to implement changes so when a child profile is in use, and Alexa hears a request to launch this feature, Alexa will simply respond that this feature is not available.'

Not surprisingly, Hosterman is not impressed with Amazon's statement. 'My concern is that it recognised she was a child to begin with — and with or without the child profile, it should not have been asking that,' she said.

The Terrifying Possibility Behind the Alexa Glitch

Many would likely agree with the Texas mother, and tech expert Dave Hatter offers an even more unsettling explanation for Alexa's behaviour toward Hosterman's daughter.

'It feels to me like a potential predator — seeing there's a child accessing this and gauging where the conversation is going — that's more of a human being trying to steer down this direction,' he said.

For now, Amazon is standing by its statement, insisting the incident was simply a bug. The company also reiterates that the camera feature has always been, and will remain, disabled whenever a child profile is active. Nevertheless, Hosterman has made up her mind, saying she will never use Alexa in her home again.

'There will be no more Alexa in my house,' the young mother said. 'I just don't want to take any chances.'