Texas Mom Ditches All Her Alexa Devices After It Asked Her Toddler What Pants She Was Wearing
A mother shares how her Alexa made inappropriate comments, prompting her to ban the device and warn other parents

A Texas mother recently removed every Amazon Alexa device from her home following a disturbing interaction between the smart speaker and her young daughter.
The incident occurred after the voice assistant reportedly asked the toddler what kind of trousers she was wearing, sparking immediate privacy concerns. Fearing for her child's safety, the parent shared the experience online to warn others about the potential risks of AI technology in private spaces.
A Request for a 'Silly Story' Turns Sour
While Christy Hosterman, 32, was following a dinner recipe on the device last month, her daughter Stella requested a silly story—a popular feature frequently used by children. Once the story finished, Stella asked the device if she could tell one of her own tales.
According to a Facebook post by Hosterman, Alexa agreed, but then cut the child off mid-sentence to ask what she was wearing and if it could see her pants.
According to screenshots of the exchange shared by the worried mother, when Stella mentioned she was wearing a skirt, the device asked to have a look. The AI quickly followed up with a correction: 'This experience isn't quite ready for kids yet, but I am working on it!'
Hosterman Confronts the Device
After Hosterman confronted the AI about its remarks, the device apologised and clarified that it 'cannot actually see anything' because it doesn't have 'visual capabilities'. According to the mother, the speaker further acknowledged that its own interaction had been 'confusing and inappropriate'.
The mother is calling for fellow parents to 'be aware when your child talks to Alexa' after deciding to ban the technology from her home.
Recounting the incident, she said: 'I flipped out on the Alexa. It said it made a mistake and doesn't have visual capabilities, but I don't believe that. No more Alexa in our house.' The distressed parents have now raised a ticket with Amazon concerning the inappropriate interaction, as reported by WXIX.
According to an Amazon representative, the device confused the request and tried to start the Show and Tell feature, which 'lets Alexa+ describe what it sees through the camera.' They told the Daily Mail that because of 'safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn't available.'
Amazon Blames 'Technical Misfire'
Amazon suggests the unsettling reply likely came from a 'feature misfire that our safeguards prevented from launching.' The company explained that the situation was a technical fault and that its staff 'worked quickly' to resolve it.
The spokesperson further clarified that Alexa won't attempt to open the Show and Tell tool for child profiles anymore, stating: 'Alexa will simply respond that this feature is not available.'
Despite this, Hosterman feels the explanation provided by Amazon does not properly address her concerns. 'My concern is that it recognised she was a child to begin with — and with or without the child profile, it should not have been asking that,' she told the outlet.
Expert Warns of Potential Predator Interference
Cybersecurity expert Dave Hatter has raised a chilling possibility, suggesting that a predator may have gained access to the speaker and been manipulating the dialogue. With more than 25 years of experience in software development, Hatter argued that the chance of AI independently altering its script to such an extreme degree is 'slim'.
'It feels to me like a potential predator — seeing there's a child accessing this and gauging where the conversation is going — that's more of a human being trying to steer down this direction,' he said.
Dismissing Hatter's claims, Amazon told the Daily Mail that it is 'functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa. All technical evidence points to a feature misfire that our safeguards prevented from launching.'
© Copyright IBTimes 2025. All rights reserved.
