We're only a week into 2025, and already, it's shaping up to be a good one. Today, the government has announced new measures to crack down on deepfake abuse, starting by making it illegal to create (not just share) digitally altered, sexually explicit images, known as ‘deepfakes’.
In a government press release, Alex Davies-Jones, Parliamentary Under-Secretary of State for Violence Against Women and Girls at the Ministry of Justice, said, “We are putting offenders on notice – they will face the full force of the law.”
While most survivors and experts alike agree the news is an encouraging step forward, there are still serious concerns about whether or not the offence will be consent-based.
When the previous government announced similar legislation (which did not become law due to the timing of the general election), the offence was determined by whether the perpetrator intended to cause harm or distress to the victim. Campaigners noted that perpetrators could exploit this loophole by claiming they didn't intend to cause harm by creating the image or by saying it was just a joke. A consent-based approach would mean the offence is determined by whether or not the victim consented to the image being made – regardless of the perpetrator's intention.
We're also lacking detail about when the new offence (which will be added to the Crime and Policing Bill) will actually become law, whether it criminalises asking or paying someone else to create deepfake images, and how it will hold tech companies accountable for hosting these images.
Here, GLAMOUR spoke to Justice Minister Alex Davies-Jones about what the new laws on deepfake abuse actually mean for women...
GLAMOUR: Hi Alex! It's so exciting to see that the government is criminalising the creation of explicit deepfakes. Why were you keen to get this over the line so early on in the year?
Alex Davies-Jones: We are signalling to victims and survivors that enough is enough, but we're not just talking the talk; we're walking the walk. We're backing that up with action, fulfilling our manifesto commitment because we recognise the impact this is having on women, girls, and victims more broadly. It is so humiliating. It is degrading, and we will not stand for it. This isn't just banter; this isn't just pornography. This is having a significant impact on women and girls, and the government is taking action.
Can you clarify whether this new offence will be determined by whether or not the victim consented, or – as was the case under the last government's proposal – by perpetrator intent?
Alex Davies-Jones: We're announcing that we are recommitting to our intention to criminalise the creation of deepfake images. This was a manifesto commitment for us. It's something we are taking very seriously, and the policy detail is still being worked out, but we can confirm that it will be brought forward in the Crime and Policing Bill later this year.
What we're announcing today is that we're making it a criminal offence to take intimate images without consent. And if you take them for the purposes of sexual gratification or with the intention to degrade or humiliate the individual you are taking that image of, then that will also be a criminal offence.
So, specifically regarding the deepfake announcement, we don't have any information on how that will be determined yet?
No, we're still working out all the details on that, and as soon as we have them, I will let you know. But we are recommitting: it will be brought forward in the Crime and Policing Bill, and we're still working through all the policy on it.
Jodie* is a survivor and campaigner whose best friend asked someone else to create deepfake images of her. Will this new offence cover the solicitation of explicit deepfakes so that survivors like Jodie can access justice?
So, solicitation is already illegal for these offences. It's already been covered under existing legislation. There's no need for us to create legislation there and replicate what is already in existence.
What our new offences will do is clarify the law. For the taking of images, we are repealing the existing voyeurism offences, making them easier to prosecute, strengthening them, and going further than before with the new offences. But yes, soliciting the imagery is already covered.
So, is the solicitation of deepfakes already covered under the law?
Yes, it's already covered. Because we're making [deepfakes] illegal, if you solicit such an image or basically procure somebody else to commit a criminal offence on your behalf, it will be captured under the law. Once we create these new offences, it will become illegal to procure or solicit deepfakes.
The new offence will be included in the Crime and Policing Bill, which will be introduced when “parliamentary time allows”. Can you be any more specific about when that will be?
I can't tell you that, but it will be soon. This is a priority for this government: tackling violence against women and girls, halving it over the course of a decade, and really going after the prolific offenders. That is why we're putting it in this flagship piece of legislation, which will be brought forward soon.
Will it be this year?
Yes.
In the government's announcement, Baroness Jones, Technology Minister, said: “Tech companies need to step up too – platforms hosting this content will face tougher scrutiny and significant penalties.” Can you offer any insight into what that scrutiny and those penalties will consist of?
Yeah, of course. This is just one element of it. We can create the law, but ultimately, we need these platforms to step up and stop the abuse from proliferating on their sites. The Online Safety Act makes it explicitly clear that illegal intimate images are considered 'priority' content. Ofcom will have the power to act if the platforms fail to remove this imagery from their sites, so if they fail to comply with their terms and conditions as set out by the regulator, they can be fined up to £18 million or 10% of their annual turnover, whichever is higher.
There can also be criminal sanctions for senior managers who are liable, to ensure that they comply. I know that the regulator and the government will not fail to ensure that these platforms are subject to their duties under the Online Safety Act.
Will this new offence of creating sexually explicit deepfakes be listed as a ‘priority offence’ in the Online Safety Act?
Intimate images are listed as a priority offence under the Online Safety Act. Once [deepfake images] are made illegal, then yes, the platforms will have to act to remove that image from the site. And if they don't, then they fall foul of Ofcom's guidance.
As you know, GLAMOUR is campaigning for a comprehensive Image-Based Abuse Bill, which would cover changes to the civil law, funding for specialist services, and the creation of an Online Abuse Commission to hold tech companies accountable for this kind of abuse. Is this something you'll be working on in the near future?
We are committed to creating these new offences to really tackle the issue of intimate image abuse. That's why this announcement is so significant. We're going further than we committed to in our manifesto.
We see this as a priority. We want to keep women and girls safe wherever they are: at school, in the workplace, online and offline. This signals our specific intent to do that. We will be legislating for that.
GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.
*Names and some details have been changed to protect victims and survivors' identities and safety.
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
For more from Glamour UK's Lucy Morgan, follow her on Instagram @lucyalexxandra.
Next up? A comprehensive Image-Based Abuse Law.


