This article references deepfake abuse.
When I was growing up, the concept of deepfake abuse didn’t exist. Technology hadn’t advanced to the point where someone could create a hyper-realistic, AI-generated image or video of anything, let alone a person. Back then, the idea that my face or body could be weaponised in this way would have sounded like something from a dystopian novel. Today, however, this nightmare is a reality for countless women and girls, and it’s one we can no longer afford to ignore.
This week’s announcement from the Ministry of Justice marks a significant step forward in tackling intimate image abuse in the UK. I applaud the government for committing to consent-based offences for the taking of intimate images, including the introduction of laws to criminalise so-called “downblousing” and other forms of voyeurism that were previously not covered. These changes reflect a recognition of the need to protect women, who have historically been disproportionately affected by these types of crimes, in an increasingly digital world.
While these updates are welcome, I can’t help but feel disappointed by the lack of clarity around the government’s plans to address deepfake abuse, something I was a victim of at the hands of my best friend in 2021.
Deepfake abuse, or synthetic intimate image abuse, is a rapidly growing problem. It doesn’t even require a woman to have sent an intimate image herself. With nothing more than one photograph, which can be stolen without consent or knowledge from social media, AI can generate realistic images or videos that make it appear as though the victim is engaging in any kind of intimate or sexual act.
It’s not pornography. It’s sexual abuse.
And right now, it’s happening in the shadows of the law.
The government’s commitment to criminalising the creation of deepfake images is an important step. But unless this legislation is consent-based, it risks failing the very people it is meant to protect. A consent-based framework is essential because it focuses on the harm done to the victim rather than requiring them to prove the perpetrator’s intent, a bar that is not only re-traumatising but often impossible to meet.
I know how harrowing this experience can be because I lived through it, having to prove via text messages and screenshots that my perpetrator knew the distress and harm he was causing me before the police and CPS would make a charging decision.
I’m encouraged by the inclusion of consent-based language in the laws criminalising the taking of intimate images, but why isn’t the same clarity being applied to deepfakes?
I’m also disheartened that the government has decided not to include a specific clause criminalising solicitation. In my case, my perpetrator asked someone else to create the faked images for him rather than creating them himself.
Under the proposed legislation, I’m left questioning whether I would have even been able to seek justice. Solicitation is a key part of tackling this issue, particularly when only one party is identifiable, or when someone is instructing the creation of these images in a jurisdiction where it isn’t illegal. Critically, in the online world, perpetrators often hide behind anonymity, making it all the more essential that we close any loopholes. In cases where we do know who is responsible, we simply cannot afford for the law to let them walk free.
We cannot afford to make the same mistakes we’ve seen in other areas of legislation, where vague language or loopholes leave survivors fighting for justice long after the laws have passed.
When I first discovered that someone close to me had solicited the creation of deepfake images and videos of me, it shattered my sense of self and safety. It was a devastating reminder that this abuse can be perpetrated by anyone, for any reason. And while my personal experience has driven my fight for change, the issue goes far beyond my story.
This is not just about protecting the women who are suffering now, it’s about ensuring that future generations don’t have to live in fear of how their images might be weaponised against them. If the government is going to take its time to draft this legislation, then it must stand the test of time. It must be comprehensive, unambiguous, and capable of addressing not only today’s threats but those of tomorrow.
We’ve already seen what happens when legislation doesn’t keep up with the pace of technology. Much of the Online Safety Act, which promised to make the internet safer, was outdated before it even came into force. We cannot let that happen again.
It’s also crucial to ensure that any new legislation works in tandem with measures to hold tech companies accountable. These platforms profit from hosting and distributing this abuse, yet too often, they face little to no consequences. Survivors are left to fight uphill battles to have content removed, while perpetrators exploit the lack of oversight to harm more women.
The government must take a holistic approach:
• Explicitly criminalising solicitation to prevent dangerous loopholes.
• Defining “intimate images” broadly enough to include non-nude manipulated images.
• Funding specialist support services for survivors.
• Launching education campaigns to tackle the societal stigma that unfairly places shame on women instead of perpetrators.
This announcement is a step in the right direction, but it’s not the end of the road. Until this legislation is passed, deepfake abuse remains legal, and more women will be harmed every day. The government must act urgently, but it must also act thoroughly.
For me, this fight isn’t just about changing the law – it’s about changing the conversation. We need to stop treating intimate image abuse as a private issue or a “risk” women must navigate. This is not about poor choices or bad luck. This is about power, control, and the deliberate exploitation of women’s bodies, by men.
I want to live in a world where no woman has to feel the devastation I felt when I discovered my image had been manipulated without my consent. A world where survivors are supported, not shamed. And a world where the laws don’t just reflect the times, they lead them.
To the government, I say this: Thank you for taking these steps, but don’t stop here. Survivors need you to be bold, decisive, and unrelenting in your commitment to ending this abuse. Let’s make this legislation future-proof, consent-based, and survivor-focused. Anything less would be a betrayal of those who are already suffering and those who will be in the future if we fail to act.
GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.
*Names and some details have been changed to protect victims and survivors' identities and safety.
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.