Search engines are showing guides for creating deepfakes, websites hosting non-consensual porn and suggestions for NSFW AI apps, despite new UK rules and tech platforms moving to tackle abusive deepfakes.
A GLAMOUR investigation has found that Google, Microsoft’s Bing and Yahoo search – the most-used search engines in the UK according to Statista – are funnelling users to pages of tutorials explaining how to make deepfake porn using face-swapping technology, purportedly explicit AI software and “undressing” apps in just a few clicks.
As deepfake abuse has increasingly broken into the mainstream, digital platforms and the UK government have introduced policies and practices to curb abusive content. But online abuse experts and survivors’ advocates say changes have been too slow, and the investigation’s findings show that these actions are falling short of properly tackling non-consensual porn created and shared by a web of actors online.
“It’s too little, too late,” said Professor Clare McGlynn, a leading expert on tech-facilitated abuse. “There is an entire ecosystem around the creation and distribution of sexually explicit deepfakes that has been allowed to proliferate.”
Search engines are included under the Online Safety Act, which criminalises the sharing and distribution of deepfake porn and can result in a two-year sentence and being added to the Sex Offenders register – if intent to cause harm or distress is proven. But it’s so far unclear how the regulator Ofcom will treat them, McGlynn said.
The previous government also announced an amendment to the Criminal Justice Bill in April, making the creation of deepfakes punishable with a criminal record and unlimited fines. This prompted two popular deepfake porn sites to block access in the UK. However, they are still accessible via search engines with a VPN. And after the general election was called, the government's amendment fell – we've yet to see whether Labour will revisit it.
Hundreds of listings across the search engines for free deepfake porn-making software, NSFW AI apps, guides and forums are also easily found with simple searches. Not all uses will be abusive, but the tools make it possible for anyone to mock up explicit images of a celebrity, an ex-girlfriend or anyone else from simple imagery of their face – with or without their consent.
Google, Microsoft and Yahoo don’t allow the sharing or creation of sexually intimate images of someone without their permission and have reporting and take-down procedures for victims.
Google announced new measures to crack down on deepfakes in July, prioritising news articles and non-explicit content in results over explicit AI-generated content.
Microsoft also outlined its approach to combating abusive deepfakes in July, and in September announced a partnership with StopNCII to give people a tool to protect their images on Bing.
But GLAMOUR’s investigation and the continued ease of finding deepfake material via search engines highlight “gaps and errors” in these efforts, said McGlynn. The companies are reacting on an “ad hoc basis to claims being made and arguments for them to change things,” she said. “The genie is out of the bottle and they’re responsible for that,” McGlynn added.
Searches on 19 November for two specific websites known for hosting deepfake ‘porn’, including one that has blocked access in the UK but is visible with a VPN, surfaced the sites – boasting AI celebrity porn and nudes – at the top of all the search engines’ listings.
Searches for deepfake porn more generally surfaced news reports on the phenomenon, its threat and its consequences for victims on all three platforms – illustrating the tech companies’ moves to act on the issue since it hit the mainstream with high-profile cases such as Taylor Swift’s.
However, sites purportedly showing celebrity deepfake porn also appeared on pages two, three and six of Google searches for “deepfake porn” on 19 November.
Searches for creating deepfakes also showed tools and guides for making them on all three platforms, including 12 free “deepfake porn maker” tools on one link alone.
Some deepfake software requires computing knowledge and processing power, taking weeks or even months to master. And of course, not all deepfakes are explicit, non-consensual or unlawful. But the search engines make how-to guides for all deepfakes more accessible.
Search engines promoting access to the tools play a “crucial role” in facilitating deepfake abuse creation and its audiences, said Elena Michael, co-founder of NotYourPorn, which campaigns against online image-based sexual abuse along with survivors.
“Nobody stands a chance in stamping out deepfakes when there are hundreds of listings,” said Michael of the many tools and guides to make and view deepfakes, sites recommending the tech and forums discussing deepfake abuse. “These listings are accessible to anyone and everyone,” she added.
Recent comments by technology secretary Peter Kyle that tech giants including Google and Microsoft should be treated like nation states fall short of demanding accountability from platforms, Michael added.
“The messaging to survivors and to women is that a company’s right (search engines included) to make money is more important than your right to exist freely and safely both offline and online.”
A government spokesperson said: “Under the Online Safety Act, it is already an offence to share or threaten to share intimate images, including deepfakes, without consent. Earlier this month we strengthened the Act to make it clear that platforms will have to prioritise tackling deepfake intimate image abuse, proactively remove more of this material, and stop it from appearing in the first place.
“We are committed to strengthening the safety of women and girls on and offline which is why we are determined to deliver on the manifesto commitment to ban their creation as quickly as possible.”
A spokesperson for Google said: “We’ve had long-standing policies to address deepfakes in Search, and we recently rolled out industry-leading updates to make it easier for people to remove this content from Google and to help keep it from appearing high up in results. We’re continuing to engage with victim-survivors and we’re actively developing new solutions to help people affected by this content.”
A spokesperson for Microsoft said: “When content is reported to Microsoft, the company investigates and takes appropriate action. Microsoft also realises more needs to be done to address the challenge of synthetic non-consensual intimate imagery, and remains committed to working with others across the public and private sector to address this harm.”
A Yahoo spokesperson said: “Today, our search results are powered by a number of sources, including Microsoft Bing, and we regularly work with our search partners and other experts to take actions against results that contain harmful content.”
If you have had your intimate images shared without your consent, remember that you are not alone, and there is help available. Get in touch with the Revenge Porn Helpline at help@revengepornhelpline.org.uk. There is also a step-by-step guide on notyourporn.com, which should be followed before taking any action.
GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn and Professor Clare McGlynn.

