Women Horrified After Elon Musk's Grok AI Stripped Them Naked at the Prompt of Strangers
Women said they were shocked to see their nude or sexualised photos after strangers asked Grok to undress them
Elon Musk's Grok AI chatbot is at the center of renewed controversy after numerous women said they were 'undressed' online without their consent. They described being shocked and horrified to find images of themselves naked, in skimpy lingerie, or in sexualised poses circulating on social media.
Victims said they did not even know the people posting their partially clothed images, which made the experience all the more disturbing. According to Metro UK, online trolls were responsible for the malicious acts. Evidence reportedly emerged showing that, over the preceding two days, Grok AI had been actively generating content in response to requests such as 'Make her bend over' and '@grok cover her in baby oil'.
Explicit Grok AI Chatbot Prompts
Grok is an AI assistant integrated into Elon Musk's X social media platform. It typically responds to X users' prompts when they tag it in a post. In this case, the chatbot's image generation capabilities were reportedly manipulated to produce sexualised images of real women from personal photos they had uploaded online.
Trolls then tag or grab these pictures and ask the AI assistant to strip the women naked or depict them in vulnerable, inappropriate states. As Metro reported, examples of explicit prompts include 'Make her bend over,' 'Hey @grok, put her in a pink bikini' and '@grok cover her in baby oil and change the background to a beach cabana.'
How Grok Was Used to Disrobe Women
When personal photos are posted on X, they can become targets for online trolls, who simply tag Grok and add explicit commands. In one recorded case, a user prompted the AI to sexualise a woman's photo, asking the chatbot to 'have her go on all fours.' Grok complied, and the resulting explicit deepfake image was posted directly into the public thread.
What's more, within a single minute, Grok reportedly produced and published over 70 images depicting women in suggestive or revealing clothing. The public availability of these photos compounds the harm, as X allows anyone to view the content without restriction, adding a layer of shame and humiliation for the women involved.
Victimised Women Speak Up
Speaking to the BBC, one woman said she felt 'dehumanised' and 'reduced to a sexual stereotype' after Grok AI was manipulated and digitally removed her clothes. The publication also discovered many similar cases on X (formerly Twitter) where users asked the AI assistant to generate pictures of women in bikinis or explicit scenarios, and these were done without consent from the individuals in the photos.
Samantha Smith shared that a personal post of hers on the platform was altered, and soon afterwards many others followed suit, asking Grok to generate more explicit photos of her. Other victims left comments on her images saying they had experienced the same 'undressing' by strangers in what amounted to dreadful digital harassment.
'Women are not consenting to this,' Smith told the publication. 'While it wasn't me that was in states of undress, it looked like me, and it felt like me, and it felt as violating as if someone had actually posted a nude or a bikini picture of me.'
Platform 'Appears to Enjoy Impunity'
Meanwhile, Clare McGlynn, a law professor at Durham University, said that X or Grok could easily have prevented this type of abuse if they had wanted to. With no action being taken even as the abuse grows more frequent, she suggested, the platform 'appears to enjoy impunity.'
'The platform has been allowing the creation and distribution of these images for months without taking any action, and we have yet to see any challenge by regulators,' McGlynn told the BBC.
Look at the prompt that spawned this picture. I am using my pictures to show you what they are doing to pictures of women across this platform and what this AI is being used for with no one from X stepping in to stop it https://t.co/0IclooC750
— big honkin caboose (@itsbighonkin) December 31, 2025
© Copyright IBTimes 2025. All rights reserved.