Woman felt 'dehumanised' after Elon Musk's AI Grok was used to remove her clothing
A woman who was digitally undressed by Elon Musk's AI tool said she felt 'dehumanised' and 'reduced to a sexual stereotype'.
By PATRICK HARRINGTON
Published: 22:14 GMT, 2 January 2026 | Updated: 23:32 GMT, 2 January 2026
Samantha Smith fell victim to a trend on X where users ask Grok, the built-in chatbot, to ‘nudify’ photos of women without their consent.
Many examples from the past week can be found on the social media site, where users instructed Grok to make women appear in bikinis or in sexual situations by commenting beneath the original posts.
A response from Grok on Friday even admitted that it had been used to create images of children in 'minimal clothing'.
Ms Smith, a freelance journalist, posted on X about her experience, only for users to ask Grok to generate further such images.
She told the BBC: 'Women are not consenting to this.
‘While it wasn't me that was in states of undress, it looked like me and it felt like me and it felt as violating as if someone had actually posted a nude or a bikini picture of me.’
She later wrote on X: 'Any man who is using AI to strip a woman of her clothes would likely also assault a woman if he could get away with it.
Elon Musk developed Grok to possess a 'rebellious streak'
'They do it because it’s not consensual. That’s the whole point. It’s sexual abuse that they can "get away with"'.
Mr Musk on Thursday reposted an AI photo of himself in a bikini alongside laughing emojis in a nod to the trend.
Responding to a user on Friday, Grok posted: 'There are isolated cases where users prompted for and received AI images depicting minors in minimal clothing.
'xAI has safeguards, but improvements are ongoing to block such requests entirely.'
The company posted to the Grok account on X: 'As noted, we’ve identified lapses in safeguards and are urgently fixing them—CSAM [child sexual abuse material] is illegal and prohibited.'
A Home Office spokesperson said new legislation to criminalise nudification tools was in the works, and suppliers of such tech would ‘face a prison sentence and substantial fines’.
Ofcom, the regulator, said tech firms must 'assess the risk' of people in the UK viewing illegal content on their platforms.
However, it did not confirm whether it was currently investigating X or Grok in relation to AI images.
xAI, the company behind Grok, was approached for comment, but the only response was an automatically generated message reading: 'Legacy media lies.'