'I've seen boys request fake nudes of their teachers and mothers': How nudify apps are violating women and girls in the UK

We need comprehensive image-based abuse laws – and we need them now.
'Nudify' apps are still being weaponised against women and girls in the UK

“It was a lazy Sunday afternoon and I was brewing some tea and decided to check my emails. I’d just received an email with the subject line ‘Becca – Photoshoot’. I assumed the email was linked to [a work] project and clicked.

The first image that appeared was an old photo of me, one I recognised from my literary agent's website. However, something seemed off. As I scrolled down, it appeared I was wearing nothing at all. I distinctly remembered wearing a long-sleeved black top in the original photo. The bare shoulders I was now seeing made no sense.”

Becca, a British tech journalist in her thirties, is one of many women who’ve had their photos run through a nudify app without consent. The email contained AI-generated nudes that had been created using an “undress” app. A second email followed, threatening to distribute the images to friends, family and professional contacts unless she paid a hefty sum in Bitcoin within 12 hours.

“The message highlighted the potential damage to my personal and professional reputation as well as my mental health. I was very confused and couldn’t quite make sense of what I was seeing. I immediately wanted to cry at seeing my face, my happy photos being used in this way. I’ve had scam emails before, but this felt way more threatening coupled with the photos.

“That was my face, but definitely not my body,” Becca tells GLAMOUR.

Read More
We're calling on the next government to protect women and girls from image-based abuse

In partnership with the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.


Nudify apps are currently legal to download and use in the UK, and they’re still too easy to access.

Research by McAfee shows that in the last 12 months, 27% of internet users have had their likeness used to create sexually explicit content and that people are increasingly concerned about their images being used without consent. Indeed, 91% of GLAMOUR readers agree that deepfake pornography poses a threat to women's safety.

“We're always learning of new ways these evolving forms of abuse are used against victims.”

A quick Google search throws up countless iterations of “undress” apps with taglines promising to “make men’s dreams come true”. And there’s no subterfuge here: these apps have a clear purpose and make no effort to disguise it. In fact, they encourage users to upload photos and “undress anyone you like in seconds”.

“We are always learning of new and horrific ways these evolving forms of abuse are used against victims. Despite this, I was entirely unprepared for the deep despair I felt on learning how easily nudify apps facilitate the creation of digitally altered nudes and sexually explicit deepfakes,” says Umaymah, a campaigner at Not Your Porn.

While the App Store and Google Play Store do make these apps hard to access – both set strict rules regarding pornography and objectification – nudify apps can still easily be downloaded using plug-ins or via each app’s own website.

While some sites are blocked in the UK, one need only download a VPN to get around this. Some apps also give users the option to create images depicting bondage and sex acts.

It costs less than £10 to create 25 images on Clothoff, the app that recently made headlines after being used by students at schools in Spain and the USA to create and share nonconsensual images of classmates. While the site is blocked in the UK, the landing page contains the following message: “Oops 👀 Access to the service in your country is blocked. But if you’d login from other country using special services, magic is right here 😉.”


In the UK, sharing deepfake content of this kind is a criminal offence under the Online Safety Act, punishable by up to two years in prison, but creating these images and storing them on a personal device is still entirely legal.

Still, there were hopes that creating synthetic nudes and sexually explicit deepfakes would be made illegal in the United Kingdom. In April, it was announced that the Criminal Justice Bill would be amended to include new offences for creating this kind of content. However, the amendment was criticised for hinging on the perpetrator’s intent to cause harm rather than on the victim’s lack of consent.

Regardless, any regulation of nudify apps and the creation of sexually explicit synthetic content in the UK has been stopped in its tracks, following the dissolution of Parliament on 24 May 2024 after Prime Minister Rishi Sunak called a general election. The amendment to the bill, and any subsequent changes to the law around intimate image abuse, are now paused, which is why GLAMOUR is campaigning for the next government to introduce an Image-Based Abuse Act. You can read more about the campaign here.

Read More
Deepfake technology is a threat to all women – not just celebrities

We're calling on the government to take urgent action.


“I first learnt about these apps when I was searching for my own leaked content in a forum online and saw requests for deepfake nudes in there,” says presenter and women’s safety campaigner, Jess Davies.

“This was back in 2021, and already the apps have improved the quality of the images they produce. As they are run on AI technology, the more people use them, the more accurate they become. One nudify app boasts of having over 100,000 users a day, so you can see how the technology is improving.”

Research from Home Security Heroes revealed a 550% surge in deepfake videos online in 2023. Nudify apps, which are easy to download and cheap to use, are at the centre of this surge, and there is currently no way of monitoring consent. Some “undress” apps state in their terms and conditions that users must obtain the explicit consent of the individual depicted before using the service, but not a single one of these apps requires users to prove they have obtained that consent.

What’s more, sites like X and TikTok regularly host ads for these apps in the UK, with ads appearing in users’ feeds even when they have never engaged with any related content. Elsewhere on the internet, Reddit, Facebook and Quora host threads where users compare nudify apps – debating features, efficacy and value for money – and freely share links and critiques with one another.

And despite having removed all AI-generated content from its site, the digital behemoth Pornhub still proffers search results like “Nudify porn”, “deep nude app videos” and “watch free nudify videos online”. Pornhub has also come under fire for hosting videos by creators advertising apps like Clothoff, but it confirmed that it removes any content of this kind and that these ads are not permitted on its sites.


Whether promoting or platforming these apps in the UK, tech companies are essentially enabling sextortion and intimate image abuse. A 2023 report from the Revenge Porn Helpline shows that sextortion cases have increased by 54% compared to the previous year, with 28 times more women affected than men.

“The Helpline has observed a growing trend with the emergence of AI technology in publicly accessible apps, allowing users to create realistic synthetic images quickly and easily. This harmful use of technology presents a fresh risk and form of intimate image abuse, demanding proactive measures to prevent the exploitation of AI technology for such purposes,” said a spokesperson.

“It’s concerning, as it’s easier than ever to access these sorts of apps and more and more are being created,” says Becca. “But tech platforms need to take responsibility: they may not be making these deepfake tools, but they are giving people a way to discover and use them. This kind of blackmail, especially sextortion, really thrives on silence and shame. So I’d advise people to talk to someone, anyone – if not a close friend or family member, then one of the helplines that now exist for this kind of scam.”

“You might know it isn’t a real image but that doesn’t mean anyone else will believe it is fake.”

Becca chose to take to social media and voluntarily shared the images attached to the email she’d received from her blackmailer. “I wanted to take any of the power away from the scammer. I didn’t like feeling threatened and felt like I had two choices: hide away or say f*ck you.”

“I also wanted to show other people that this can happen. The more I’ve learned about other people’s experiences, the happier I am that I shared and talked about it right away. I’ve had people tell me they’re glad I shared because they’ve had conversations with their kids and teens, and others have said that seeing me go through it would make it feel a little less scary and shocking if anything ever happened to them,” she tells GLAMOUR.

“Society stigmatises women for their sexuality; therefore, the damage these images can do can extend to things like being fired, being prevented from getting hired, or fall-outs with partners and parents. You might know it isn’t a real image, but that doesn’t mean anyone else will believe it is fake,” explains Jess.

“We cannot afford for our laws to fall so far behind when it comes to technology because it is women and girls who are harmed. The internet is a gendered experience and we need laws that protect women and girls online.”

Data from the Home Security Heroes report also showed that 98% of all deepfake content in 2023 was of a sexual nature and 99% of the people targeted in that content were women. This is an important detail when it comes to “undress” apps. The technology used to generate these images is trained to create nude images with breasts and vulvas, so feeding the apps a photo of a clothed cisgender man will still result in an AI-generated nude with a vulva and breasts.

“We are facing a future where every woman will have a fake nude image of herself online.”

“We are facing a future where every woman will have a fake nude image of herself online, and that should never be normalised,” says Jess. “I have seen boys request fake nudes of their teachers and mothers online. The ease of access to this technology means men and boys can see anyone they desire naked, and I worry about the entitlement over women’s bodies that could spill over into our physical world.”

Umaymah explained to GLAMOUR: “By no means does the term ‘deepfakes’ reflect the seriousness of the creation of such images. A better term is ‘sexually explicit digital forgeries’.

“The term ‘forgery’ has been used both to reflect the non-consensual nature of such images and to communicate the fraudulent nature of their creation. The language we use to describe this content is vital for victims (and society) to understand the true nature of this situation.”

So what can be done if the law doesn’t prohibit the use of these apps and the apps themselves don’t monitor consent? “In answer to the inevitable ‘just don’t post photos of yourself online then!’ – this is victim blaming,” says Becca. “Almost everyone has a social media presence, and it’s a personal choice, of course, but I don’t want to hide.”

“If you receive a threat that images will be created, it may still be wise to report the matter to the police,” explains Umaymah. “We know the creation of the images isn’t yet a crime, but in some cases threats to create such images may be accompanied by conduct that amounts to other crimes, including harassment or blackmail.”

Ruth Peters, director at Olliers Solicitors, recommends contacting the apps directly to raise a complaint.

“The app’s complaints policy should be available on its website, and it is likely that an individual who creates an image of you without your consent would be in breach of those terms and conditions. If you have made a complaint directly and are dissatisfied, you can then notify Ofcom. Ofcom is the regulator for online safety in the UK under the Online Safety Act, and its job is to make sure that UK-established video-sharing platforms have appropriate measures in place to protect users from harmful videos.”

However, given how little responsibility these apps take for the wellbeing of those affected – and given their choice of promotional messaging – it’s hard to say whether any direct complaint would be taken seriously.

Meta, Pornhub, OnlyFans and TikTok have joined initiatives with charities and organisations that focus on women’s online safety, and Reddit says it has hired more staff to remove harmful content. Similarly, Meta, Google, X, TikTok, Snap and Microsoft have teamed up with OpenAI to launch a ‘Tech Accord’, creating tools such as watermarks and detection techniques to spot and label deepfakes and AI-manipulated images, video and audio. But there is no blanket ban by any means: the content remains searchable and shareable.

As long as “undress” apps remain legal, anyone can create content with them. Until UK law is grounded in consent, and until both AI developers and web-hosting platforms are held to account, it’s unclear how pervasive the use of this technology might become.

Read More
I was violated by deepfake pornography. But I won't be shamed for it

Cally Jane Beech reflects on her deepfaking ordeal (and how to prevent it from happening to anyone else)


The Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK; you can call them on 0345 6000 459. The Cyber Helpline provides free, expert help and advice to people targeted by online crime and harm in the UK and USA. The Internet Watch Foundation (IWF) can help identify and remove online child sexual abuse imagery globally.

GLAMOUR has contacted X for comment. Pornhub declined our request to comment on this article.