This article references image-based abuse, rape, and murder.
The creation of sexually explicit deepfakes has allowed millions of men across the world to fulfil their darkest sexual fantasies that once felt out of reach, while thousands of women have seen their bodily autonomy stripped from them in pixel form.
Defenders of deepfake porn argue that it's a form of creative expression, devoid of any real-life consequences for the people whose images are stolen. We know that's not the truth. Last week, it emerged that security guard Gavin Plumb had accessed and downloaded non-consensual deepfake ‘pornography’ of TV presenter Holly Willoughby while plotting her kidnap, rape and murder.
This case has exposed the grim connection between image-based abuse and violence against women and girls in real life.
“Deepfake porn is a danger to women – not an innocent fantasy”
The sinister plot against Holly was exposed when an undercover US police officer infiltrated an online group where Plumb shared his plans and attempted to recruit other users to assist him. During their investigation, police uncovered over 10,000 images of Holly in Plumb’s possession, including non-consensual explicit deepfakes. This disturbing discovery laid bare what campaigners and victims have been arguing in their calls for legal reform: that deepfake ‘porn’ is a danger to women and should not be excused as an innocent fantasy.
We need comprehensive image-based abuse laws – and we need them now.

When I began investigating sexually explicit deepfakes for the BBC back in 2022, I spoke with creators of the technology who shrugged off any suggestion that they should require permission, telling me: “It’s a fantasy, it’s not real. I don’t really feel that consent is required”.
This belief is shared by the men who consume the content, who flood the comment section of any of my TikTok videos that discuss deepfakes with views like “I’d argue there isn’t a victim since it’s fake” and “Dude can’t even imagine anymore”.
I have spent a depressing number of hours trawling the pits of manosphere forums where deepfake requests, images and videos are posted and celebrated, the women reduced to stacking dolls, their heads placed one on top of the other in a game of objectification that men are always destined to win.
In these forums, fantasies that are off-limits in the real world come to life, with sons requesting deepfakes of their mothers, teachers being stripped by their pupils, and Hollywood actors transported into an adult movie scene.
Celebrities are seen as fair game in the deepfake community, with members justifying the creation of these images with the belief that these women have consented to being in the public eye. These attitudes reflect misogynistic rape myths that blame women for male violence.
One post simply requested ‘F*cked on her back please’ alongside an image of a woman fully clothed, holding a baby.

On the most popular deepfake ‘porn’ website, the terms and conditions state which celebrities are, in its eyes, allowed to be deepfaked. Using follower counts and job roles as indicators of fame and eligibility, it lists the figures – 100k on YouTube, 120k on Instagram – and the roles – movie stars, musicians, politicians and CEOs – proudly displaying the thought that has gone into its carefully curated guidelines on who will make the cut as a deepfake ‘porn’ star.
But did they spare a second to think of how the women are more than follower counts and paparazzi shots? How they’re mothers, daughters, little girls who had big dreams or university-educated scholars? Women who enjoy snuggling in on a Saturday night with a glass of wine and their favourite Netflix show, or maybe someone who likes to take their dog for long walks and watch the seasons transform the trees. Perhaps they are a woman who makes a bad-ass lasagne – but has to wear goggles when cutting onions – or one who takes centre stage on the dancefloor at every family wedding.
The men who digitally transform these women into their graphic sexual fantasies do not think of their humanity at all because, to them, these women are objects to own. This is the dangerous entitlement Gavin Plumb embodied when he set out to fulfil the sick fantasy of his celebrity crush, fuelled by the explicit deepfakes he was able to access on his screen.
Plumb was sentenced to life in prison for his plot to kidnap, rape and murder Holly Willoughby, with a minimum term of 16 years minus the 280 days he had already served.
We're calling on the government to take urgent action.

This case should act as a warning to the government that we cannot drop the ball when it comes to online misogyny and the dangerous beliefs it fuels. It highlights the urgent need for legal reform through GLAMOUR’s campaign in partnership with the End Violence Against Women Coalition (EVAW), Not Your Porn and Professor Clare McGlynn, which calls for the introduction of a dedicated, comprehensive Image-Based Abuse law to protect women and girls.
“Online abuse is often part of a pattern of offending that can include other forms of violence against women, including kidnap, rape and murder”
Speaking to GLAMOUR, Professor Clare McGlynn shared how the findings had left her disturbed but not surprised: “Deepfake sexual abuse has real repercussions. This case gives the lie to the justifications of deepfake creators that this is ‘fantasy’ material and consent does not matter. I hope it means we never again hear the excuse that deepfake abuse of celebrities is ok, as they are ‘just’ celebrities, and this is what comes with the territory.”
She went on to emphasise why legal reform is a key part of combatting online misogyny: “Criminalising creating sexually explicit deepfakes would target the root cause of this abuse. It would send a message to perpetrators and society that this is wrong and harmful. Importantly, it says to women that we recognise this threat and abuse; we hear you.”
We need comprehensive image-based abuse laws – and we need them now.

Rebecca Hitchen, Head of Policy & Campaigns at the End Violence Against Women Coalition (EVAW), also shared her concerns. “We know that online abuse is often part of a pattern of offending that can include other forms of violence against women, including kidnap, rape and murder," she says. “However, it is deeply harmful in its own right – impacting survivors’ wellbeing, relationships, health, career prospects and their rights and freedoms to take part in public life online, so must be taken extremely seriously.”
Hitchen told GLAMOUR why it is crucial that Parliament act now to prevent tech companies from profiting from this abuse and pushing violent content through their algorithms: “The current online safety law does not go far enough to tackle this, and other criminal laws are patchy and inconsistent. That’s why we’re calling on the government to strengthen the criminal laws around this abuse, but it must also go much further and improve civil laws so that survivors can take action against perpetrators and tech companies, including securing orders to take down abusive content.”
“Any new law must also sustainably fund the specialist services that provide survivors with vital, often life-saving, support; and ensure that relationships and sex education in schools across the country addresses this type of abuse, so young people are informed about consent and equality and understand the consequences of using deepfake apps. Finally, we’re calling for the introduction of an online abuse commission to advocate for victims and hold tech companies accountable for the role they play in enabling and facilitating this abuse.”
“Deepfake sexual abuse threatens our democracy and must be taken more seriously.”

Elena Michael, director at Not Your Porn, adds: "Deepfakes are not just part of an abusive toolkit to harm women, they also promote male entitlement to access women’s bodies without consent and to be violent towards women, both online and offline.
"Gavin Plumb is yet another example of this. We desperately need a system that comprehensively tackles image-based abuse, including deepfakes, that understands its links to other forms of male violence and prevents this behaviour.”
Find out more about GLAMOUR’s campaign in partnership with the End Violence Against Women Coalition (EVAW), Not Your Porn and Professor Clare McGlynn, demanding that the government introduces a dedicated, comprehensive Image-Based Abuse law to protect women and girls.
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
The Cyber Helpline provides free, expert help and advice to people targeted by online crime and harm in the UK and USA.
In partnership with the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.

