Baroness Charlotte Owen of Alderley Edge has introduced a private member's bill to criminalise the creation and solicitation of sexually explicit digital forgeries – or ‘deepfakes’ – without consent.
Today (Friday, 13th December), the bill has its second reading in the House of Lords, which is an opportunity for the Lords to debate the main principles of the bill – and flag any potential concerns.
The government has already confirmed that it will criminalise the creation of deepfakes “as quickly as possible”; however, it has yet to set out what this legislation would look like, when it would be introduced, and, crucially, whether it would be consent-based.
GLAMOUR is currently campaigning for a dedicated, comprehensive Image-Based Abuse Law, which would – as a starting point – criminalise the creation of sexually explicit digital forgeries without consent. That's why we're following both the government's progress in this area and Baroness Owen's private member's bill.
In an exclusive essay for GLAMOUR, Baroness Charlotte Owen writes about her motivation for introducing the bill, the government's response so far, and what it means for all survivors of image-based abuse.
Image-based sexual abuse is the new frontier of violence against women. It is rapidly proliferating and disproportionately gendered: 99% of sexually explicit deepfakes are of women. They are created using generative AI through easily accessible online platforms and so-called “nudification” apps readily available on the App Store.
I have been deeply concerned about deepfake abuse for several years and first raised the issue in Lords questions back in February, after being shocked that the Law Commission did not believe the level of harm caused was serious enough to warrant criminalisation.
I firmly believe that every woman should have the right to choose who owns a naked image of her. However, the gaping omissions in our patchwork of legislation have meant that whilst sharing sexually explicit deepfake content is illegal, shockingly, the creation itself and the solicitation are not.
Analysis by ‘My Image, My Choice’ found that 80% of these apps were launched in the last 12 months alone, demonstrating just how rapidly this abusive market is growing. One app processed 600,000 images in its first three weeks after launch.
Taking and creating an image or video without a woman’s consent is abuse, and I firmly stand with the 91% of GLAMOUR readers who think this technology poses a threat to women's safety.
I have seen successive governments commit to criminalising the creation of this content. Yet no legislation was detailed in the King's Speech, which set out the government's agenda for the next Parliament.
In September this year, tired of waiting for the machine of government to realise that tackling image-based sexual abuse cannot wait any longer, I introduced a private member's bill to the House of Lords that would make it an offence to take, create or solicit the creation of sexually explicit images and videos without a person's consent.
My bill is comprehensive, victim-centred legislation that seeks not only to close the gaps in the law but also to futureproof against the evolution of these harms.
Vitally, unlike previous iterations of proposed legislation, my bill is consent-based, which removes the traumatising and unnecessary burden on the victim of having to prove the intent of the perpetrator.
The last government's proposal in this area meant that victims would have to prove not only that they did not consent but also that the perpetrator created the content for the purpose of harm, humiliation, distress or sexual gratification. This meant that perpetrators could easily find loopholes, such as claiming artistic expression.
Further, the bill will criminalise the solicitation of sexually explicit images and videos created without consent. This clause is wholly inspired by Jodie*, an inspirational survivor who first came to speak to me back in April this year and whose case shows just how important it is. She later shared her story with GLAMOUR, too.
Jodie discovered that her private Instagram photos had been posted onto Reddit and other online forums, with the uploader asking users to ‘fake’ her face into pornographic situations. The photos she found of herself were sexually depraved, depicting her in student-teacher scenarios.
Jodie endured this abuse for five years, finding hundreds of pictures of herself, her friends and many other young women. Given the borderless nature of the internet, it is vital that no one can circumvent these laws by asking for content to be created in jurisdictions that have not yet legislated.
Under the bill, a person found guilty of non-consensually taking or creating an image or video will be forced to delete the content. The brilliant Revenge Porn Helpline demonstrated to me how important it is to have clearly defined laws in this area. They worked with a young woman who bravely brought her perpetrator to justice for the non-consensual sharing of her intimate images; after conviction, the police handed him back his devices with the content still on them. No woman should have to live in fear that her convicted abuser still owns intimate images of her.
Sophie Compton’s thought-provoking documentary Another Body is vital viewing for everyone to understand this issue. The film taught me about the disturbing problem of semen images, where men are taking women’s images and either physically masturbating on them or artificially putting AI semen on them.
Rather sickeningly, this practice is referred to in the online community as a ‘tribute’. Clearly, this is a fundamental violation of a woman's sexual autonomy, but shockingly, under our current law, if the person is not naked or engaged in an intimate act – for instance, if this happened to one of their Instagram photos – their only recourse would be a communications or harassment offence. This is not good enough.
Calling on the wise legal counsel of Professor Clare McGlynn KC, along with the Parliamentary Bill Office, we drafted a clause that seeks not only to bring this appalling abuse within the scope of the Sexual Offences Act but also to futureproof against the ways in which these forms of abuse evolve, by adding to the Act's definition the phrase “something else depicting the person that a reasonable person would consider to be sexual because of its nature.”
The bill has the backing of Refuge; Revenge Porn Helpline; My Image, My Choice; Not Your Porn; Jodie Campaigns; End Violence Against Women Coalition and Professor Clare McGlynn KC. It has its second reading in the House of Lords today (Friday, 13th December), and I am delighted that it already has the support of a large number of cross-party peers.
However, I have been hugely disappointed to find that despite this legislation already being before Parliament and being advocated for cross-party in both the Lords and the Commons, the government has indicated that it will not support it. Quite simply, this means that if my bill reaches the Commons, the government may seek to prevent it from becoming law.
The government has committed only to making it illegal to create sexually explicit content, and it has not committed to a time frame for doing so.
Given the comprehensive nature of this bill already before Parliament, which the government may not support, victim-survivors must get assurances that any future legislation will:
(1) Be consent-based, removing the need to prove intent;
(2) Criminalise solicitation, so that those who ask others to create this content are also liable;
(3) Include forced deletion provisions;
(4) Futureproof against the evolution of these harms;
(5) Include a single taking offence, removing the need to prove intent and updating the outdated voyeurism and upskirting offences.
Women are sick and tired of waiting for legislation to protect them from this abuse. The Home Secretary committed to using “every tool available” to take power from abusers and hand it to victims. Why not this one?