From Meta Glasses to smartwatches, the AI revolution is every woman’s worst nightmare

From location tracking and monitoring software to wearable tech like Meta’s AI Glasses and smartwatches, each advancement in tech innovation births another tool for perpetrators to abuse women. Jess Davies investigates for Glamour.
Getty Images; Collage: Nicola Neville

This article references image-based abuse and domestic abuse.

Grasping red plastic cups that swirled with cheap supermarket vodka and flat lemonade, we fired up Chatroulette on an old, clunky Dell laptop in the kitchen of our student halls. A game of chance. Red or black. Penis or no penis. Each click of the space bar instantly connected us, a group of first-year students, with a stranger’s webcam from across the globe. Some were pre-gaming before a night out, like us. Others asked curious questions about university life in the UK. Rumours swirled that some lucky users had randomly connected with the Jonas Brothers on the platform. But sometimes – pretty much all the time – the space bar teleported us into a dingy, poorly lit room where a man in a headset was manically masturbating.

A technology designed to connect people had been hijacked by men whose arousal was spurred on by a lack of consent. Fuelled by a sense of power and control. It was 2011, the digital revolution was in full swing, and over the next fifteen years, my naivety around technology was replaced with anger as each development in the tech world formed a new tool of abuse for perpetrators. Every ‘miracle’ in computing would inevitably, eventually, become every woman’s nightmare.

Refuge, the UK’s largest specialist domestic violence organisation, has revealed that referrals to its Technology-Facilitated Abuse and Economic Empowerment team rose by more than 62% in 2025 compared with the previous year. The charity has seen a rise in the weaponisation of wearable technology, including Meta Glasses and smartwatches, alongside the misuse of location-tracking apps to monitor and control women. The final three months of 2025 marked their highest quarterly total of tech-facilitated abuse referrals on record, highlighting how rapidly evolving technology is being exploited to perpetuate harm.

Charlotte Meijer, the founder of Open Justice for All, is a campaigner-survivor of coercive control and tech-facilitated abuse. Her ex-partner introduced Find My, Apple’s built-in location-sharing and device-tracking feature, early on in their relationship under the guise of it being a “fun” way for the couple to stay connected. “Because I could see where he was as well, it was like, you know, just so we know where we’re both at,” she explained.

But after Charlotte changed jobs and relocated to a different office, she began to feel increasingly uncomfortable about the obsessive tracking by her perpetrator:

“He would track me while I was at work and then question why I wasn’t in the office”, she said. Her ex would often accuse her of hanging out in a pub with male work colleagues; something she “wasn’t allowed to do.” Charlotte shared how the app's tracking system would sometimes incorrectly identify her location, which led to more harm:

“He basically said, ‘You're not in the office, you're in a pub, who are you with?’ But I was still in the office. It became a massive argument because he would say, ‘Your location says you're not there’, but I worked in one of those buildings which had like six floors and multiple areas and stuff, so sometimes it would place me just outside it, where there were a bunch of pubs.”

The technology's inaccuracy had serious consequences for Charlotte. “In the small margins of error, there could be really grand consequences for people. The way that I was punished because he thought I was somewhere I wasn't supposed to be is a clear example of that.”

The surveillance of Charlotte continued, with her perpetrator questioning her every move: why she was visiting friends, where she was out for dinner and drinks, and reminding her that he always knew her location. Explaining how the abuse altered her daily routine, she shared:

“It made me feel very uncomfortable because I felt like I was just being watched and monitored the whole time. It made me anxious and scared because if I was somewhere where he didn't think I should be, or if he thought I was with someone who he didn't approve of, I knew I'd get the wrath of that later.” She adds, “By that time, I definitely couldn't remove it [the app]. I very much had to have it on my phone because I'd face the consequences if I didn't.”


Charlotte Archibald manages a frontline domestic abuse and sexual violence service in the Vale of Glamorgan, which has seen a 30% increase in survivors disclosing tech as a means of abuse over the last six months. Explaining how these new forms of harm often slot into classic patterns of coercive control, she said, “Abusive partners have always used tactics like following, checking mileage, or interrogating someone about where they’ve been. Location sharing, AirTags, car apps, and account logins automate that monitoring. The abuser no longer has to be physically present to track movements, which reinforces the message: ‘I can see you anytime.’ That creates hypervigilance and discourages independence.”

Mina, a survivor of tech-facilitated abuse, was tracked by her abuser after leaving her smartwatch behind. Her perpetrator used the cloud accounts linked to it to trace her to the emergency accommodation she had fled to. Speaking to Refuge, Mina shared:

“Realising I was being stalked through Cloud accounts linked to my smartwatch was deeply shocking and frightening. I felt suddenly exposed and unsafe, knowing that my location was being tracked without my consent. It created a constant sense of paranoia; I couldn’t relax, sleep properly, or feel settled anywhere because I knew my movements weren’t private.”


As technology evolves at breakneck speed, so too do tools for tracking and surveillance. Spyware, also referred to as stalkerware, is malicious software installed on a device without the user’s knowledge to monitor and record their activity, which is then shared with a third party. This can include location tracking, screenshots and access to private messages. When I searched the term “spyware track” on Google, the top result was a sponsored warning from the National Crime Agency outlining the criminal consequences of using spyware without consent. Encouraging, I thought. But directly beneath it sat three sponsored ads for independent spyware companies, reading:

“See Who They’re Texting: Like having their phone.”

“Track phone without permission.”

“5 Best Untraceable Spying Apps – It’s like holding their phone.”

Google confirmed that it investigated the above ads and took enforcement action when evidence of a policy violation was found.

In the UK, installing spyware on someone’s device without informed consent is illegal under the Computer Misuse Act 1990, and perpetrators who use it as a tool of coercive control can also be prosecuted for stalking and harassment. While many spyware products claim to be designed for parental monitoring – with parents legally able to consent on behalf of children – the Information Commissioner’s Office still recommends that children are informed when they are being monitored. Crucially, there are no meaningful safeguards preventing this technology from being used to stalk or control adult victims.

On at least two spyware websites I visited, the software was openly marketed to “couples”, framed as “accountability apps” designed to “catch a cheater”. In 2025, a security researcher compromised three popular stalkerware apps, exposing the data of more than two million users. Many of them paying customers. Digital stalking, it turns out, is a lucrative business with deeply troubling ethics. One spyware website that appeared in Google’s sponsored results received nearly 264,000 visits in December 2025 alone, an increase of 70,000 since November. The ad spend, it seems, is paying off.

A Google spokesperson told Glamour, “Protecting users is our top priority, and we have strict ads policies that govern the types of ads and advertisers we allow on our platforms. We continue to invest significant resources to stop bad actors, and we are constantly evaluating and updating our policies and improving our technology.”


While some perpetrators operate in the shadows of malicious software, others are increasingly brazen in public. In recent months, social media has been flooded with videos of women filmed non-consensually by men wearing Meta’s AI Glasses. With a camera built into the frame, the glasses allow for discreet recording without the obvious presence of a phone. When recording, the glasses activate an LED capture light designed to make people aware they’re being filmed, but a quick search on Reddit and YouTube provides step-by-step tips on how to cover the light, hide it or disable it altogether.

One Reddit user seeking advice wrote, “My horny friend’s interested in spying on girls that are out of his league. What’s the best way to hide the flash when he’s horny and wants to start recording?” Another attempted to justify covering the light with black nail polish, claiming, “The ‘privacy’ issue doesn’t exist; there is no expectation of privacy in any public place.” The sentiment is echoed – largely by men – in the comments beneath TikTok videos calling out non-consensual filming. As with the use of AI tools to digitally undress women without consent, the harm is easy to dismiss when you’re not the one being targeted.

Molly, whose name has been changed for anonymity, was filmed non-consensually in the street by a stranger wearing Meta Glasses, who approached her while she was feeding her dog. Recalling how the interaction seemed innocent enough, Molly, a full-time content creator, told me she had shared her Instagram username with the man when he asked, “out of politeness more than anything.”

She was unaware that the interaction was being filmed until a male user tagged her in a TikTok video the next day:

“I watched the interaction back and felt this real cold chill over me. I felt hot and cold at the same time. Even though the interaction itself was absolutely fine, I still felt as though something had been taken from me. It felt really violating.” Molly explained how the video included private details about her location that the man had now broadcast publicly, “What was worse was that I had told the guy where I lived, as he had asked me. There were comments from men saying he should have followed me home, telling people to look out for me in that area and that everyone should move to the area I mentioned.”

When Molly called out the perpetrator in the comments section for filming her without consent, she was met with a slew of abuse from other TikTok users who accused her of lying, saying “there’s obviously a flashlight you’d have seen” and stating she “probably had a boyfriend and had been caught out.”


Anika experienced a similar interaction when a man approached her on a night out while she was standing outside a venue with a friend. They were unaware that the man was filming them as he commented on what they were wearing, an interaction Anika described as “pretty creepy”. Days later, someone they knew came across a video of the incident on Instagram. “It immediately made me incredibly uncomfortable; I was being used for content without even knowing the video existed,” Anika said.

They didn’t notice any recording light during the incident; Anika believes the man intentionally covered it to prevent them from noticing they were being filmed. Explaining the content they discovered on the Instagram account, they said:

“Every video on his Instagram page is of young women and femme-presenting people outside bars, nightclubs, and music venues who are too intoxicated or distracted to notice he's recording as he hits on them. A large number of the videos are focused on the subjects' clothing or bodies, with some clips even zooming in on their chests or commenting on how physically attractive they are. In my opinion, this is clearly a way to exploit women's bodies for views, likes, and other engagement online, without their consent.”

When Molly reached out to the man who filmed her to ask him to remove the video, his behaviour changed:

“Suddenly, he snapped into his online persona; he was different to the guy I spoke to. I asked him to please remove it as I had no idea he was filming, and I was worried for my safety based on the comments around where I lived. His response was, ‘Don’t pretend like you’re not keen’.” Soon, the perpetrator followed up on his message, saying a viral meme account had reached out and asked to repost the video, telling her, “I said yes I need the clout baby sorry x”. He was trying to cash in on her lack of consent.

Molly, alongside her friends, reported the video to TikTok but said she was “met with a dead end”, with the app failing to take down the video or respond to her reports. After Glamour made TikTok aware of the account in question, it was banned, and its videos were removed for violating the platform’s Community Guidelines on bullying and harassment.

A spokesperson for Meta told Glamour: “Our glasses have an LED light that activates whenever someone captures content, so it’s clear to others that the device is recording and features tamper detection technology to prevent people from covering that light.

“Our terms of service clearly state that users are responsible for complying with all applicable laws and should not tamper with the product. As with any recording device, including phones, people should use smart glasses in a safe, respectful manner, which includes not engaging in harmful activities like harassment, infringing privacy rights, or capturing sensitive information.

“We are aware that there are small numbers of users who choose to misuse our products, despite the measures we have put in place. We are dedicated to delivering valuable, safe, and innovative products for people and continually review opportunities to enhance our AI glasses, informed by customer feedback and ongoing research.”


Describing the lasting impact tech-facilitated abuse can have, campaigner-survivor Charlotte Meijer said:

“I don't think people quite understand what it's like to be watched at all times. It's almost like, if you're being watched in a house, everyone can understand that you can't leave the house because he's watching you. Being tracked is very much the same thing because all eyes are still on you. He knows where you are, what you're doing, who you're with, and it is exactly the same. It's just in your pocket.”

Charlotte reported her perpetrator to the police. He was arrested and charged with coercive control, which included the tech-facilitated abuse she experienced, but he was found not guilty in court. As part of Open Justice for All, Charlotte recently successfully lobbied the government to allow all victims of sexual assault free access to their sentencing remarks.

Dr Charlotte Proudman, a family law barrister and author of He Said, She Said, explained how perpetrators are utilising technology to carry out their harm:

“I see tech-facilitated abuse as a direct extension of coercive control. Perpetrators can use tracking devices and apps to maintain power and control post-separation, to intimidate, to ‘check up’ and to remind women that they are never truly free from surveillance”.

Reflecting on the legislation around tech-facilitated abuse, she adds, “Law and the courts are slow to catch up with the reality that control no longer requires physical proximity.”


As a content creator, Molly can see the positives of tech innovation, but she is left feeling deflated by the constant hijacking of tech to harm and humiliate women:

“It’s just such a shame because I actually think this tech is so cool. When I first saw it, I was like, ‘Wow, imagine the amazing vlog footage you can get, the family holiday videos.’ But it’s just a huge fucking eye roll that once again it’s men – because it really is only men I’m seeing this from currently – who are ruining something that was a great piece of tech.”

Reflecting on this pattern of harm, she adds, “Anything can become a weapon in the wrong hands, and it’s always women who bear the risk because we live in a skewed patriarchal society whereby men still hold more power than women, and this proves it once more.”

It’s a frustration echoed by Anika, who feels disappointed at the dismissive attitudes around this harm:

“The people I see being put the most at risk by this technology right now are young women and femme-presenting people. People who have been unknowingly filmed in this way, including myself, face objectification and sexual harassment as an inherent quality and result of this technology paired with the current culture online”, adding “The fact that so many people refuse to take this seriously is really discouraging; dismissing this issue will only make it worse.”

Charlotte Archibald, the manager at Vale Domestic Abuse Services, believes it’s time tech companies and policymakers treat tech-facilitated abuse as a genuine, escalating safety issue:

“The same tools that help people connect, navigate, and share their lives are now routinely weaponised for surveillance, coercion, and control, especially in abusive relationships. Just because abuse is online and isn’t physical violence, it doesn’t make it less serious. This needs to be treated as a national emergency.”

As tech bros shake hands on billion-pound deals and race to stake their claim on the future of AI, it is women and marginalised genders who are left to absorb the real-world fallout. Tech innovation remains one of the fastest routes to immense profit, but when growth is prioritised over safeguards, tech-facilitated abuse becomes collateral damage. In the rush to build the next big thing, women’s safety is not being overlooked; it's being traded.

For more information about emotional abuse and domestic abuse, you can call the Freephone National Domestic Abuse Helpline, run by Refuge, on 0808 2000 247.

Refuge’s National Domestic Abuse Helpline (0808 2000 247) is available 24 hours a day, 7 days a week, for free, confidential specialist support. Or visit www.nationaldahelpline.org.uk to fill in a webform and request a safe time to be contacted, or to access live chat (available 3pm-10pm, Monday to Friday). For support with tech abuse, visit refugetechsafety.org.