Vicky Pattison: My Deepfake Sex Tape: What survivors of image-based abuse want you to know

The documentary sees Vicky Pattison create and release a deepfake sex tape, but some survivors think the stunt is a step too far.

GLAMOUR is calling on the government to protect women and girls from image-based abuse. In partnership with the End Violence Against Women Coalition (EVAW), Not Your Porn, Clare McGlynn, and a brilliant survivor known as Jodie*, we are calling for a dedicated, comprehensive Image-Based Abuse law.

Earlier this year, we had a major win, with the government confirming that it will criminalise the creation of sexually explicit deepfakes. But we know this is just the beginning.

This issue is explored in Channel 4's new documentary, Vicky Pattison: My Deepfake Sex Tape, which sees Vicky create and distribute her own 'deepfake sex tape'. While many have applauded the documentary's bold approach to raising awareness of image-based abuse, some survivors are concerned that the stunt trivialises their trauma.

Here, GLAMOUR's Purpose Editor Lucy Morgan explores the strengths and weaknesses of the documentary, and speaks to survivors of image-based abuse about how the documentary's decision to release a consensual sexually explicit deepfake (and the subsequent backlash) made them feel.


Last night, Vicky Pattison: My Deepfake Sex Tape aired on Channel 4. The documentary, fronted by broadcaster and author Vicky Pattison, examines the “proliferation of deepfake porn and the impact it is having on women and girls.”

In order to “immerse herself in this rapidly evolving violation of privacy” Vicky directs, produces and distributes her own “fully consensual” AI-generated, sexually explicit video.

The documentary follows Channel 4's groundbreaking investigation into so-called ‘deepfake porn’, finding that almost 4,000 celebrities are victims of deepfake abuse, meaning their images were artificially mapped onto pornographic content without their consent.

Channel 4's latest documentary opens with Vicky and her husband, Ercan Ramadan, looking at a laptop in their kitchen. On the screen is a blurred, deepfaked image of the actor Emma Watson engaged in an explicit sexual act. Ercan describes the image as a "very young version" of the actor.

He asks, “Is yours going to be like this?” Vicky, blinking back tears, responds, “No, mine isn't going to be like this – thank God.”

In that moment, Vicky expresses a relief that most survivors of deepfake abuse, including Emma Watson, will never know. She gets to choose when enough is enough, how far is too far.


Less than five minutes later, an unblurred deepfake image of Scarlett Johansson wearing lingerie is shown, followed by a deepfaked video of Margot Robbie appearing to say, “Let's f*cking do it,” in a different accent.

The crux of the documentary is Vicky's decision to release her own sexually explicit deepfake, the idea being to expose how easy it is to create these images, and the impact it can have. While some survivors of deepfake abuse, including Channel 4's own newsreader Cathy Newman and politician Cara Hunter, supported this decision as a “bold way to shine a much-needed spotlight on the issue”, there are also many survivors who feel the stunt is a step too far.


Jodie*, a survivor of deepfake abuse and one of GLAMOUR's Stop Image-Based Abuse campaign partners, described feeling “deeply offended, let down, and disappointed” when she learned from pre-broadcast publicity of Channel 4's decision to create and distribute a “deepfake” video of Vicky Pattison for the documentary.

Last year, Jodie shared her story of being deepfaked by her best friend to help raise awareness of how the laws in the UK must change to better support survivors of image-based abuse. GLAMOUR worked with Jodie (as well as EVAW, Not Your Porn, and Professor Clare McGlynn) to create a petition calling on the government to introduce a dedicated Image-Based Abuse Law, which currently has almost 70k signatures.

Georgia*, another survivor of deepfake abuse, said, on hearing publicity about the planned stunt, that while she believes it is an attempt to “do the right thing”, the idea is “insensitive to survivors of deepfake abuse”.

Georgia, who hadn't seen the documentary when we spoke, says, “You can't experience the initial shock when you find out about the deepfake. You don't know who's created it or why they've done it. It doesn't evoke any of the same feelings […] And then there's the following months of turmoil, chasing the police and tech companies, not knowing how it will impact your work and your relationships.

“There's a whole list of things that add to your trauma, which cannot be replicated if you consent with a TV channel supporting you, supporting your work, supporting your relationship, supporting you emotionally. You can't replicate what real victims have felt.”

For many survivors, the crucial problem is that Vicky consented to the deepfake being made and shared, which is entirely at odds with the realities of deepfake abuse. Research by Deep Trace Labs estimates that 96% of deepfake videos online are non-consensual, and a 2024 report identifies a “lack of consent or control over how his or her likeness is used” as “the key factor in determining harmfulness” of deepfakes.

Ahead of the documentary's release, Vicky shared a statement about how she had “wrestled with the decision” to release a “deepfake sex tape online.” She said, “Whilst I know this doesn’t compare to the distress and horror actual victims feel when they discover this content of themselves, I hope it will give some insight into what they go through.”

There's also a moving moment in the documentary when Vicky reflects on deepfake abuse, noting, “This is a global issue of women having their consent taken away – and there's not a thing we can do about it.”


In her previous documentaries on women's issues, Vicky has explored premenstrual dysphoric disorder and spoken up against online sexist abuse. Throughout My Deepfake Sex Tape, she acknowledges that her experience of consensually releasing a deepfake cannot compare to that of genuine deepfake abuse victims. While the documentary doesn't shy away from addressing this disparity, it never gets close to resolving it.

At one point, Baroness Charlotte Owen, who along with GLAMOUR, is lobbying the government to adopt consent-based legislation around deepfake abuse, advises Vicky not to release the deepfake. Vicky appears to be considering it. The documentary then shows Vicky meeting with an intimacy coordinator to plan the logistics and aesthetics of the tape.

“I've watched a lot of this deepfake stuff in the run-up to this,” Vicky says, “and it seems to be like an overarching theme that it's quite graphic and degrading – and actually, I don't want [my deepfake] to be like that at all.”

Genuine survivors of deepfake abuse, it's worth noting, do not have access to an intimacy coordinator.


“Vicky created and released the sex tape herself […] She was able to choose exactly how she wished for her ‘likeness’ to be presented,” says Daria*, a survivor of image-based abuse, who spoke with GLAMOUR after hearing publicity about the planned stunt.

“By contrast, true victims of sexually explicit deepfakes/sexual digital forgeries are simply not afforded the same choices. Rather, they have no choice in the matter, as the films/images have been created and shared entirely without their consent.

“True victims of deepfakes do not have a choice in relation to the sexual activities their ‘likeness’ is engaged in […]

“The right to consent is cruelly snatched from true victims of sexually explicit deepfakes/synthetic nudes, both in the creation and sharing of them. And the harm caused to survivors of such are life-destroying and life-shattering.”


Vicky is clearly passionate about this issue, but there are moments in the documentary where her care gets lost in translation. Take the scene where she runs an image of her husband Ercan through a so-called ‘nudify’ app – with his consent – to expose how deepfake technology is predominantly weaponised against women. The result? An image of Ercan with breasts and female genitalia.

This scene demonstrates how the technology used to deepfake people generally only works on female bodies, suggesting there is a far greater demand for deepfakes of women than men. It's an utterly fascinating issue, which surely warrants further investigation.


The documentary fails to explicitly highlight that another woman's body has likely been used to create the image of Ercan's apparent breasts and genitalia. Nor is this explored earlier in the documentary when Vicky runs an image of herself through a ‘nudify’ app, which Channel 4 doesn't name. There is no interrogation of how the nudify app obtained these images. Many, if not most, nudify apps scrape them from pornographic websites or adult content creators' sites – without the pictured women's consent.

This is a battle that sex workers have been fighting for years. As journalist Camille Sojit Pejcha wrote for Document in 2023, “The problem with AI-generated imagery lies in the fact that to create something “new,” machine learning models must be trained on massive data sets of existing images – meaning that in the case of NSFW images, the faces and bodies of real people are being used to generate artificial porn, and they’re not getting paid for it.”

GLAMOUR reached out to Vicky to ask if she's aware that these apps often use nude images of women scraped from the internet without their consent. She said, “I’m painfully aware of how these apps operate, and it’s horrifying. The technology behind these apps and deepfake pornography is built on exploitation. I firmly believe there are two victims in this horrid abuse.”

She continues: “Often, these tools rely on non-consensual images of women, taken or scraped without their permission, to train algorithms. This is exactly why this documentary exists: to highlight how unchecked technology is enabling this kind of abuse and why we urgently need better safeguards, stricter laws, and accountability from tech companies.

“When creating the fake sex tape for this documentary, we made sure to work ethically and consensually. Everything was directed, and produced with actors. No actual nude images were used, and every step of the process was controlled to ensure that consent remained at the heart of what we were doing.

“It’s vital that we address how these apps and technologies operate and demand that governments and tech companies take action to protect people, particularly women, from this kind of violation.”

On 19 January, a clip from Vicky’s deepfake created for the programme was posted on X. This clip was created using deepfake technology to artificially transpose Vicky's likeness onto the body of adult performer Ruby Mae, who consented to participate in the documentary in this way. On 21 January, Vicky posted to her Instagram Stories, directing those “curious to see the results” and “just how terrifying this technology has advanced already” to search for the clip.


In the wake of the video being released, GLAMOUR spoke to Madelaine Thomas, a survivor of image-based abuse and campaigner. Madelaine is the creator of Image Angel, a software that platforms can use to stop non-consensual image-sharing. “I find it deeply offensive to see the process of violation I endured being [consensually] recreated for television,” she tells GLAMOUR.

“It feels like a betrayal to take something so personal, so traumatic, and reduce it to a storyline. No amount of digital manipulation could ever replicate the pain, fear, and sense of violation that I, and so many others, continue to live through […] To see them turned into a spectacle dismisses the reality of what we’ve been through and the lasting impact it has on our lives.”

My thoughts return to a moment in the documentary after Vicky has directed the simulated sex scene that will eventually be released as a deepfake. She reflects, “I'm aware that the situation I'm in is very different to the situation that the victims are in […] I have consented and the victims just don't have that same luxury.”

And therein lies the fundamental problem with this documentary: the stunt was justified on the basis that Vicky would be “fully immersing” herself in this technology – a radical proposition indeed. But understandably, Vicky chooses to have agency over the aesthetics of her own deepfake video. This decision distances Vicky not only from the experiences of genuine survivors, but also from the documentary’s stated mission to “understand the depth of shock and violation victims of image-based abuse feel” (per a Channel 4 press release).

Despite the failure to address this, there are several genuinely compelling scenes within the documentary, such as Vicky's interview with a survivor, Sophie, a mum-of-two who experienced deepfake abuse, as well as a horrific form of image-based abuse in which the perpetrator uploaded ‘semen images’ of her online. Semen images are created when a perpetrator takes a person's image and either physically masturbates on it or digitally adds AI-generated semen to it.

Sophie described the experience as “so degrading”, adding, “whoever they decide is their sexual fantasy… literally at the touch of their fingertips, they've got access to that software.”

She reflects on the police's decision to drop charges against the perpetrator, saying, “I think I broke into a thousand pieces.”

The interview with Sophie appears to have been navigated with sensitivity and care. Her testimony alone would have made for a compelling documentary.

Another powerful moment comes when Vicky visits a “takedown company” which specialises in removing harmful content from the internet. She is informed there are 1,700 links to harmful images of her online – and it's not just the deepfake video. Vicky discovers in real time that she has been the victim of another form of image-based abuse, in which a perpetrator shared a video of himself masturbating over a real image of her.

Vicky's reaction is moving and recognisable to any survivor of image-based abuse. She says, “My first thought was that it's my fault because I put pictures of myself in a bikini on my Instagram – even after everything I know, all these brilliant and amazing powerful women I've spoke to, I went and f*cking blamed myself first. That's how deep the conditioning is.” It is a harrowing revelation – and I sorely wish the documentary devoted more time to exploring this little-known form of abuse that impacts so many women.

Vicky's right: the conditioning runs deep. But real change happens when survivors can share their stories in a safe, non-judgmental space. As Madelaine tells GLAMOUR, “Instead of recreating, or seeking to experience it for television, resources should have been used to educate the public about the real-life consequences of image-based abuse and the systemic changes needed to prevent it.”


Since the documentary was announced, Vicky has engaged with survivors' criticism. She tells GLAMOUR that she's had messages from “loads of women,” including parents, survivors, and women who were told they “should be flattered” when reporting deepfake abuse to the police.

“Victims have come forward and have told me they are grateful for shining a light on this issue and talking about it on a much bigger stage,” Vicky says.

“I understand that some survivors out there may feel my decision to create a consensual deepfake sex tape could be seen as trivialising their trauma. That was never my intention, and I want to be clear: the purpose of this project is to amplify their voices and draw attention to the devastating impact of this abuse.

“The fake sex tape was created to spark a conversation and help people understand how deeply disturbing and violating it is to have intimate content shared or fabricated without consent. While I know that my experience cannot compare to the real-life horror survivors face, I hope the documentary will raise awareness, push for action, and help create a society that protects and supports victims.”

Channel 4 has also addressed the backlash, telling The Standard, “The documentary hears directly from victims of image-based abuse, to ensure that victims’ stories are at the forefront of the purpose of the programme.

“In doing so, audiences will see first-hand testimony from those who have lived through this issue, thus raising the profile of the problem and the ways in which we must support victims, through better processes enforced on tech companies, government legislation, better education and activism.”

“Vicky aims to demonstrate how simple it is to create explicit deepfake content and raise awareness of how accessible this content is and how it can proliferate online.

“She also wanted to experience, as closely as possible, the feelings, vulnerabilities and concerns that arise when one becomes a victim of deepfake porn.

“The documentary recreates this, in a controlled environment, to exemplify the ease with which this kind of content can spread online and to raise awareness by demonstrating actions people can take should they find themselves a victim of this crime. The choice to make original footage ensures that everyone involved was able to fully consent from the outset.”

*Names have been changed to protect survivors' anonymity and safety.

GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.

Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
