Deepfake abuse can affect anyone – including middle-aged mothers like me

“I’ve been in the public eye for nearly twenty-five years, but nothing could have prepared me for this.”

This article references deepfake abuse.

The first time I saw one of the images, I felt physically sick. My heart raced. My hands trembled. It was a photograph of me, or rather, a hyper-realistic, AI-generated version of me performing graphic sexual acts I never took part in. I hadn’t posed for them. I hadn’t consented to them. They never happened. But there they were, circulating online and sent to me through anonymous DMs, tagged with sick jokes and leering captions.

The likeness was terrifying. Down to the jewellery I often wear, the way my hair falls, even the background mimicked rooms in my home. And while I knew logically they were fakes, I also knew the truth didn’t matter. Not online. Not in the eyes of trolls. And not even, I feared, in the minds of people who know me.

I’ve reported every single one that I’ve come across to the police. They were supportive and took it seriously. But their hands are often tied. These accounts appear, cause maximum damage and then vanish, deactivating before they can be traced. Despite all the talk of digital footprints, cyber tracking and accountability, the reality is that tech platforms continue to make it easy for perpetrators to act without consequence.


I’ve been in the public eye for nearly twenty-five years, but nothing could have prepared me for this. I’m 52 years old and I’ve been married for 30 years to a man who knows me inside out. He’s held my hand through reality TV fame, racism, trolling, and abuse. But this is something else.


What haunts me more than anything is what this means for my children. I have a 17-year-old daughter and an 18-year-old son. Like every parent, I want to protect them. But how can I tell them to stay safe online when I, a grown woman with a platform and decades of media experience, am being violated with no warning, no control and no justice?

I shouldn’t even have to say this, but I will: I am not in those images. But the nature of deepfakes is so sinister that you start to second-guess whether others will believe that. We go into work and worry whether this will cost us opportunities or respect. Will someone Google us and, within seconds, make a snap judgement out of discomfort?


What’s even more disturbing is that this technology is advancing so rapidly, it’s no longer just young women being targeted. People have written off deepfake porn as something that only affects influencers and teenage girls. But I’m a menopausal, middle-aged woman. I’m a mother and a wife, and I’m telling you, it’s happening to us now, too.

Women like me are not immune. We’re middle-aged women navigating an ever-changing digital world, trying to keep up with a pace of innovation that constantly outstrips regulation and common sense. We didn’t grow up with this; we’re learning as we go. And we’re often doing it while balancing careers, families and caregiving responsibilities. We’re less likely to be taken seriously, more likely to be told to brush it off and often carrying silent trauma beneath a veneer of coping.


As a parent, you want to give your kids the best advice. You want to pass on wisdom based on life experience, because we’ve been around, we’ve seen things and we’ve survived them. But how can any of us advise our children on how to deal with this, when we don’t even fully understand what this is? There’s no blueprint for this. No lived experience to draw from. Just fear, confusion and the creeping sense that we’re all a step behind.

We’ve already seen just how wide-reaching the problem is. A recent investigation by WIRED identified at least 50 bots on Telegram that claim to create explicit photos or videos of people with only a few clicks. The bots list more than 4 million “monthly users” combined. The creators? Often anonymous. The victims? Everyday women whose photos were scraped from the internet and manipulated without their consent. Telegram did not respond to WIRED's request for comment.

A spokesperson for Telegram has previously said, “Illegal pornography and the tools to make it are forbidden by Telegram's terms of service and are removed whenever discovered.”


A recent investigation by 404 Media revealed that more than 24,000 deepfake porn images of women were being created and shared every month on Telegram, many without the subjects even knowing. Telegram did nothing. The groups keep growing, the scale is industrial and the damage is intimate and lasting.

The accounts may vanish, but the damage doesn’t. The images stay burned in your brain. The fear lingers and the humiliation, even if unearned, lives in your bones.

And yet, amidst all this: the anxiety, the fear, the silence, something else has happened too. A wave of solidarity. The messages I’ve received from other women: friends, strangers, colleagues, fellow mothers have moved me beyond words. They’ve said: “I believe you.” They’ve said: “This happened to me too.” They’ve said: “I’m so sorry. We are with you.”

That’s what keeps me going. The knowledge that I am not alone. That none of us are. That in the face of a machine designed to humiliate us into silence, we are still speaking, still fighting, still standing with one another.


I’m lucky in many ways. I have a voice. I can write this piece and speak out without fear of losing everything. But I still worry and I still feel afraid. I still dread opening my inbox. And if this can happen to someone like me, what hope is there for the millions of women and girls, and increasingly older women, who suffer in silence?

Tech companies need to stop pretending they’re surprised by this. They’ve allowed it, enabled it and profited from it. Every moment they delay implementing meaningful safeguards, they are complicit in our trauma.

Deepfake porn is not just a violation of privacy. It’s a violation of personhood. It takes your image: your body, your face, your identity and turns it into something grotesque and weaponised. And it tells you that you no longer own yourself.

I refuse to accept that. If I make one promise to myself, it is that I am going to live my life with no excuses and no fear.

GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.

Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.