The government has dropped its controversial amendment on criminalising the creation of sexually explicit deepfakes, following pressure from Baroness Charlotte Owen, survivors of image-based abuse, and yes, GLAMOUR.
The campaign to #StopImageBasedAbuse – a coalition between GLAMOUR, Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn – has long been calling for the government to criminalise those who make so-called ‘deepfake porn’ of other people without their consent.
While we're encouraged by the government's commitment to criminalising such images, we must ensure the legislation itself actually works for survivors.
Last week, the Ministry of Justice announced an amendment to the Data (Use and Access) Bill, which would require survivors to prove that the perpetrator intended to cause them “alarm, humiliation, or distress” and/or that they created it for the “purpose of sexual gratification.” This is known as an intent-based approach.
A consent-based approach, on the other hand – which campaigners are calling for – would hinge on whether or not the person consented to the image being made, regardless of the perpetrator's intentions.
Baroness Charlotte Owen had previously tabled her own amendment to the Data (Use and Access) Bill, adopting a consent-based approach, which received cross-party support from the likes of Lord Browne of Ladyton, Baroness Kidron and Liberal Democrat peer Lord Clement-Jones.
GLAMOUR understands that the government tabled their own intent-based amendment, having concluded that Baroness Owen's consent-based amendment was incompatible with Article 10 of the European Convention on Human Rights (ECHR), which protects freedom of expression.
Survivors and campaigners alike called out the government's intent-based approach. Cally Jane Beech, a survivor of deepfake abuse and GLAMOUR's Activist of the Year, said the government has “completely missed the mark”, adding, “If they listened to what the survivors have been campaigning for, they would understand that intent to cause harm leaves perpetrators still able to use images of others without their consent.”
Jodie*, a survivor of deepfake abuse and women's rights campaigner, described it as a “missed opportunity to prioritise victims and their lived experiences.”
And it seems that the government has listened, sort of: A government source confirmed to The Sunday Times that it would be dropping the intent-based amendment, saying, “The government position was that a consent-based deepfake offence would not be compliant with ECHR on Article 10.
“Ministers went out and spoke to peers, but in the process received further evidence, including from Baroness Owen. Based on that evidence, the government position is now that a consent-based approach would be compliant with the ECHR. That means we will not put our amendment for an intent-based offence forward for a vote this week. And we are minded to pursue a consent-based deepfake offence. We are now working through how to make that work.”
Baroness Charlotte Owen tells GLAMOUR, “I am delighted that the government has changed course and decided to pull its amendment that would have required victims to prove the motivation of their abusers. Something that is almost impossible for them to know, let alone prove.
“We are now one step closer to comprehensive law that protects women from this horrific abuse. I hope the government will back my amendment to the data bill that is vitally consent-based and includes solicitation.”
GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
For more from Glamour UK's Lucy Morgan, follow her on Instagram @lucyalexxandra.