It will soon be illegal to create sexual digital forgeries – or ‘deepfakes’ – of other people, according to new plans laid out by the government.
A Ministry of Justice spokesperson told GLAMOUR, “Sexually explicit deepfakes are degrading, harmful, and, more often than not – misogynistic.
“We refuse to tolerate the violence against women and girls that stains our society, which is why we’re looking at options to ban their creation as quickly as possible.”
Deepfake abuse refers to when a real person’s likeness is artificially mapped onto an image of another real person’s nude body, often engaged in a sexual act. In 98% of cases, neither of these people consent to the digital forgery being made. And in 99% of cases, the victims are women. No wonder a GLAMOUR survey of over 3000 people found that 91% thought deepfake technology poses a threat to women's safety.
Over the past year, GLAMOUR has been calling for the government to act on the scourge of deepfake abuse by making it a criminal offence to create – or ask someone else to create – a sexual digital forgery of someone without their consent.
But we didn't stop there; we partnered with Jodie*, a survivor of deepfake abuse; the End Violence Against Women Coalition; Not Your Porn; and Professor Clare McGlynn to demand the introduction of a comprehensive Image-Based Abuse Law covering all forms of image-based abuse. Our petition, which you can sign here, already has 66k signatures – and counting.
Jodie, who has previously shared her experience of being deepfaked with GLAMOUR, says, “I welcome the news that the government plans to introduce a creation offence for deepfake abuse.
“However, for this legislation to truly protect victims, it must be comprehensive. This means including provisions for solicitation, forced deletion of abusive content, future-proofed language to account for evolving technology, and a consent-based approach so survivors aren’t burdened with proving intent to harm.
“Legislation alone is not enough. We also need robust civil laws to provide survivors with tools to seek justice, better preventive education to stop this abuse before it starts, and increased funding to support specialist services for victims. I urge the government to ensure that any new legislation leaves no gaps and delivers the justice, support, and safety that survivors deserve.”
Professor Clare McGlynn, a leading legal expert in online safety and a GLAMOUR campaign partner, adds, “It is vital that any new law is comprehensive and consent-based to ensure that all abusive images are covered.
“Experience has shown that requiring proof of specific motives makes prosecutions more difficult and excludes many cases of abuse. We must focus on the harms to victims, not the motives of perpetrators.”
Baroness Owen of Alderley Edge, the youngest member of the House of Lords, has also tabled a private member's bill to criminalise the creation (and solicitation) of sexual digital forgeries. She says (via The Telegraph), “The problem of sexually explicit deepfakes is one that is inherently sexist and rapidly proliferating.
“They have been described as the new frontier of violence against women. The content is created using generative AI and can be made in a matter of seconds with easily downloadable nudification apps or online platforms.”
Under the Online Safety Act, it is an offence to share sexual digital forgeries without consent, but crucially, it's not an offence to create this material in the first place. Hopefully, that's about to change.
Names have been changed.
GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
The Cyber Helpline provides free, expert help and advice to people targeted by online crime and harm in the UK and USA.