Survivors and campaigners are disappointed by the government's latest update on how it will tackle deepfake abuse.
Yesterday (22 January), the government tabled an amendment to the Data (Use and Access) Bill that will criminalise intentionally creating a sexually explicit deepfake without consent. On the surface, this appears to be a step in the right direction. But if you look closer, the proposals leave a lot to be desired.
Why? Well, to secure a conviction against someone who has created a deepfake of you without your consent, under the new proposals, you must prove that the perpetrator intended to cause you “alarm, humiliation, or distress” and/or that they created it for the “purpose of sexual gratification.”
But why should survivors, who know they didn't consent to this horrific imagery being made, have to prove the perpetrator's motivation? Shouldn't their lack of consent be enough to warrant a conviction?
Under the latest proposed legislation, those found guilty of creating deepfake images without consent – with intent to cause harm or for sexual gratification – face an unlimited fine, rather than jail time.
Jodie*, a survivor of deepfake abuse and women's rights campaigner, describes the legislation as a “missed opportunity to prioritise victims and their lived experiences”. She argues that Baroness Charlotte Owen's Private Member's Bill, previously dismissed by the government, was a “consent-based proposal” which recognised that it's “often impossible for survivors to prove a perpetrator's intent”.
She says the proposed amendment will “leave countless victims without the justice they desperately need and deserve.”
Jodie's statement in full:
“The government’s amendment to the Data Bill is a missed opportunity to prioritise victims and their lived experiences. The consent-based proposal from Baroness Owen, which is centred on survivor experiences, offers far greater protections, recognising that it is often impossible for survivors to prove a perpetrator’s intent. A motivation-based, rather than consent-based, law, which the government is now proposing, will leave countless victims without the justice they desperately need and deserve.
There is also an urgent need for clear guidance on solicitation offences. Too often, perpetrators outsource the creation of deepfake images, and under this proposed legislation, victims could face cases being dismissed by the police and CPS. The law must be clear – anything less risks leaving victims vulnerable.
Equally concerning is the decision to reduce this crime to a fine. Deepfake abuse is a form of sexual violence that causes profound harm to its victims. It should be treated with the seriousness it warrants, including the possibility of prison sentences for the most severe cases. Justice demands that we prioritise survivors, uphold their autonomy, and enact laws that deter this devastating abuse.
Tackling deepfake abuse requires more than just legislation. It demands a holistic approach. This includes better funding for support services like the Revenge Porn Helpline, which plays a critical role in helping victims remove damaging content. We also need comprehensive preventative education in schools and targeted public awareness campaigns to reach those outside formal education.
It’s crucial we avoid over-criminalising, particularly young people, but we mustn’t shy away from recognising this crime for what it is – sexual abuse. Laws should reflect the severity of the harm caused while ensuring survivors have access to the justice and support they deserve.”
Cally Jane Beech, a survivor of deepfake abuse and GLAMOUR's Activist of the Year, said the government has “completely missed the mark”, adding, “If they listened to what the survivors have been campaigning for, they would understand that intent to cause harm leaves perpetrators still able to use images of others without their consent.”
She continued: “Consent has been at the forefront of the hard work of the survivors, whether you can prove it was intended to cause harm or not, and as a deepfake survivor, I feel that this is not good enough – it's, yet again, leaving loopholes within the law and leaving women and girls unprotected and still vulnerable to image-based abuse.”
The government's amendment comes after Baroness Charlotte Owen submitted an amendment of her own, which calls for the creation of sexually explicit deepfakes to be a consent-based offence, i.e. one which doesn't rely on proving the perpetrator's motivations. The amendment has cross-party support in the House of Lords, including from the senior Labour figure Lord Browne of Ladyton (Des Browne).
“It is too great a burden on police, prosecutors and survivors to prove an individual's intention to humiliate, harm, distress or gain sexual gratification,” Baroness Owen tells GLAMOUR. “It should be enough that a woman did not consent to her image being used in this way. The non-consensual creation of sexually explicit content is an act of abuse.”
Women are tired of waiting for legislation to protect them from image-based abuse.

The Law Commission's report on intimate image-based abuse recommends implementing a base offence in which it's illegal to intentionally take or share an intimate image without consent – regardless of the perpetrator's motives. The fact that it was taken or shared without consent should be considered “sufficiently wrongful and harmful to warrant criminalisation.”
The report further recommends that a perpetrator could be charged with a more serious offence if they took or shared the image to cause harm and/or for their sexual gratification. But, under a consent-based law, the victim could still push for a conviction even if they can't prove the perpetrator's motivation.
The government is well aware of the distinction between consent-based and intent-based offences. When former GLAMOUR Woman of the Year Georgia Harrison testified in court against Stephen Bear for sharing a sexual video of her online without her consent, she also had to prove that he intended to cause her distress. “Anyone sharing unconsented sexually explicit footage should know it will cause distress – because it does,” Georgia previously told GLAMOUR.
Thanks to Georgia's campaigning – along with the vital and oft-overlooked efforts of organisations like the Revenge Porn Helpline, EVAW, and Not Your Porn – the government eventually removed this requirement, meaning that sharing intimate images without consent is a criminal act, regardless of motive. So why are survivors and campaigners having to fight for the same logic to be applied to the creation of deepfake images?
GLAMOUR understands the government is concerned that a consent-based offence may inadvertently criminalise children who create such content accidentally or without an informed understanding of their actions, and that intent-based offences, such as theft and homicide, already exist in law and can be prosecuted more effectively. We also understand that new legislation will be introduced to ensure that those who ask people in other countries – where creating deepfake images is not illegal – to make such content will still be charged under the UK's deepfake laws.
The government says the new offence has been “carefully designed to apply to new developments in technology and to provide the best protection to victims”.
Justice Minister Sarah Sackman KC says, “Sexually explicit images created without consent constitute a fundamental violation of women’s autonomy and dignity.
“As part of its Plan for Change, this government is determined to clamp down on this degrading and chauvinistic behaviour.”
GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
For more from Glamour UK's Lucy Morgan, follow her on Instagram @lucyalexxandra.
Next up? A comprehensive Image-Based Abuse Law.