As I was scrolling through X, I came across a photo of three young women on a day out, who happened to be wearing hijabs.
“@grok put them in a bikini”, read one reply.
“@grok have her wear a see-through plastic bikini”, said another.
And those were the more vanilla ones.
As I kept scrolling, the demands became more and more explicit, such as semen on their faces and, for some reason, bare feet. And Grok, X's AI chatbot, had obliged.
Several requests specified that all clothes should be removed – except their hijabs.

Muslim women who have posted pictures of themselves on social media have been targeted because they wear hijab, by men who get off on seeing women who have chosen to cover up stripped naked. A WIRED review of 500 Grok images found that around 5% featured images of women who, as a result of user prompts, were either stripped or made to wear religious or cultural clothing, with modest Islamic wear and Indian saris being the most common examples.
At the heart of this disturbing trend is the intersectionality of racism, misogyny and Islamophobia, where women have had their faith and gender weaponised against them.
These deepfakes deliberately fetishise the hijab, which is a symbol of modesty and plays into stereotypes of Muslim women being submissive and sexually repressed.
It has also once again called into question how safe the online space is for women, especially marginalised women.
“There aren’t many spaces for Muslim women in the real world, partly because of cultural rules, so the online space has given us a lot of freedom that we don’t get in real life to express ourselves,” says Ayesha, 19, who only wanted to give her first name.
“Some of my friends are crazy, though, like they wear hijab IRL, but post pictures of themselves in sexy clothes like the Kardashians.

“One of my friends posted a video of herself dressed like Lara in Katseye and dancing to Gabriella, which got loads of views. I would have freaked out, but she was like, ‘My mum doesn’t even have Facebook’.
“That’s not me, though. My faith is important to me, and my hijab represents that,” she explained.
“I started posting stuff for fun, really, just make-up reviews or trips abroad. Most of my followers are Muslim women like me who are practising Muslims, but want to enjoy fashion and beauty, too. I only have 200 followers, so it's not like I’m a big influencer or anything.
“I got a direct message from a guy I didn’t know. He sent me a screenshot and said, ‘Is this you? I didn’t know you were such a ho.’”
To her horror, a harmless picture she had posted on Instagram from a trip to Morocco had been edited so that she was wearing just a bra and thong, then shared by an account on X that posts sexualised images of women in hijab. The account, which has now been suspended, had 100,000 followers.
“My legs gave way beneath me, and I couldn’t breathe. I was trembling and burst into tears. I don’t know whether it is right to say it felt like being raped because I don’t want to sound like I am downplaying what women who have been raped have been through, but it made me feel violated, dirty and disgusting.
“What was really gross was that they got off on seeing me naked except for my hijab, so it felt racist and Islamophobic.”
These images can be weaponised against women, with consequences including forced marriage, honour-based violence, and even death.

The account was suspended after reports from hundreds of users.
Ayesha has now deleted her Instagram account and says she's terrified the image will be seen by family and friends. In the days since, the government has announced it will push through legislation to criminalise the creation of non-consensual sexual deepfakes, which covers the use of Grok. X has also confirmed that it prohibits users from removing clothing from images of real people in jurisdictions where it is illegal.
In a statement, a spokesperson for X said, "We remain committed to making X a safe platform for everyone and continue to have zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content.
"We take action to remove high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity, taking appropriate action against accounts that violate our X Rules. We also report accounts seeking Child Sexual Exploitation materials to law enforcement authorities as necessary."

Ayesha's case shows how woefully unprepared platforms and regulators are for the damage unfolding for Muslim women online.
“The idea that a guy halfway around the world could have that image on their phone is sickening. Also, my family is really strict, so if they saw the images, I don’t know what they would do. But I have no way of controlling that. I am going to be living in fear for the rest of my life. I don’t think my life can go back to how it was.”
The trend highlights the layered discrimination that women of colour face, having both their race and religion weaponised against them, something which, as a Black Muslim woman, Amani, 24, is all too familiar with: “I'm on social media a lot, and I’m quite feisty, so I get into arguments with racists, which puts me in the firing line. I’d posted a picture of myself, and I get that it was dumb, but it's second nature, and you don’t really think about what could go wrong.
“The next thing I know, all these random guys were asking Grok to put me in underwear and God knows what else. When I clicked on their accounts, a lot of them had posted stuff from Andrew Tate and Tommy Robinson, so it was obvious my religion and race were why they were doing it.
“I was quite lucky because my workplace was really supportive. Funnily enough, my online followers helped me by reporting it themselves and giving me advice on what to do, like reporting it to Ofcom.”
“What's really stupid is that a lot of non-Muslim girls post images on social media in lingerie and bikinis, which is fine, as it is their choice and their agency, and people will ask to put them in hijab with their bikinis. Like, why would you even do that? They just really hate Muslim women.”
“It is horrible and disgusting, and I know it can have real-life implications for Muslim women, especially if they come from strict families. Even though my family is pretty chill, they’d be really angry. I’m hoping it gets banned.”

Organisations in the violence against women and girls (VAWG) space are calling for more action to protect victims from marginalised communities. Annie Gibb, a practising Muslim herself and CEO of Amour Destine, which supports women from marginalised communities who have faced traumatic domestic abuse and sexual violence, described it as “sexual and spiritual violence”.
“It's not only a violation of her body — it’s a violation of her faith. The racialised targeting of Muslim women is both sexual and spiritual violence, deliberately exploiting what they hold sacred to shame, control, and silence them.
“This particular harm is compounded by fear of community judgment, family repercussions, and a lack of understanding or protection from mainstream systems.
“Digital spaces are not separate from real life — the violence experienced online follows women into their homes, communities, workplaces, and places of worship. When faith, identity, and bodily autonomy are attacked simultaneously, the consequences are far-reaching.
“It presents the same power imbalances present in grooming and exploitation, stripping individuals of consent and agency. For many survivors, this can resurface trauma and undermine feelings of safety and dignity.”
For a lot of Muslim women, the online space has given them the freedom to express themselves that they don’t always get IRL, so having that freedom taken away from them shows how Grok AI can be a force of oppression.
While some people have argued that banning such tools is a ban on freedom of speech, if your freedom of speech depends on oppressing marginalised women, it’s not freedom.
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
Glamour is campaigning for the government to introduce an Image-Based Abuse Law in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.
