The internet has never been a particularly safe space for women and girls, but in the past decade or so, it's taken a turn for the worse.
We've all felt it. Last year, Cambridge academic Dr Ally Louks was bombarded with online abuse, including rape and death threats. Her crime? Posting a selfie with her dissertation on X. “You are the dumbest f*cking bitch I have ever seen on the internet,” one user commented. “Imagine thinking you deserve taxpayer money for writing up that useless piece of sh*t thesis that nobody will ever read.”
Once the preserve of fringe online forums, incel rhetoric has been steadily bleeding into the mainstream, emboldened by the rise of misogyny influencers like Andrew Tate (not to mention the return of President Donald Trump) and the deregulation of tech giants like Meta and X.
Online abuse against women is not inevitable. Since the Online Safety Act became law in 2023, tech companies (from search services to social media apps) have a responsibility to protect people in the UK from online harms, including those that disproportionately impact women and girls, such as domestic abuse, deepfake abuse, and online misogyny. And it's up to Ofcom, the UK's communications regulator, to publish guidance and hold tech companies accountable.
Earlier this week, Ofcom released guidance on how tech companies can protect women and girls online. Right on cue, the internet was awash with angry and abusive men describing the guidance as sexist.
One wrote, “Only focusing on misogyny is disgraceful sexism,” while another went as far as saying, “Ban women and children from accessing digital networks, allowing access only through a fully certified, accredited male gatekeeper. Simple.” As broadcaster and campaigner Jess Davies points out, “Women’s ‘mean’ comments about men do not even begin to skim the surface of the vitriol and deep-rooted misogyny which exists online that is putting women in real harm's way.”
We tend to think abuse only happens on the ‘dark web’, but that's not the case.

The guidance identifies nine areas where tech firms must do more to improve women and girls’ online safety, promoting a “safety-by-design approach” that incorporates safety into the “operation and design of their services, as well as their features and functionalities.”
Ofcom has also set out a range of practical measures that tech companies can take, including embracing technology to prevent intimate image abuse, training moderation teams to deal with domestic abuse, and adding user prompts to encourage people to reconsider before posting harmful content.
For image-based abuse specifically, Ofcom's Jessica Smith says, “Tech firms should sign up to a technology called hash-matching, which is basically a database of images which enables any image to be identified at scale wherever it is shared on a platform.
“It is really innovative technology. What that means is it does not have to be reported every time it is uploaded. It means it is reported once and wherever it exists it is identified.”
Cally Jane Beech, a survivor of deepfake abuse and GLAMOUR's Activist of the Year, supported the guidance, noting, "I want things to be better, for my daughter, and for women and girls all over the UK. We should all be in control of our own online experience so we can enjoy the good things about it. Tech companies need to be made more accountable for things being hosted on their sites.”
In a dream world, tech giants would be falling over themselves to make their platforms safer for women and girls. Sadly, we live in a world where the founder and CEO of Meta, Mark Zuckerberg, has called for more “masculine energy” in the workplace; where Elon Musk, CEO of X and senior advisor to President Donald Trump, calls Jess Phillips, Minister for Safeguarding and Violence Against Women & Girls, a “rape genocide apologist”; and where “tech bro” culture is forcing women, particularly women of colour, out of the industry.
Racheal Alake, a solicitor at Bolt Burdon Kemp, says, "While Ofcom’s draft guidance is a positive step, its voluntary nature risks allowing companies to shirk accountability."
Andrea Simon, Director of the End Violence Against Women Coalition (EVAW), echoes these concerns, noting, “Ofcom is hamstrung by the fact that the proposals are voluntary only, with no actual requirement on tech companies to put in place any of the recommended good practice. Key to this work will be the routes through which the regulator will incentivise and track take-up of the guidance.”
She continues, “In a landscape where protections for users are being eroded, with a general trend of tech providers delivering the bare minimum when it comes to safety, any next steps from the new government in securing an internet that is safer and freer for women and girls must be to introduce these recommendations into a code of practice.”
Dame Melanie Dawes, Ofcom's Chief Executive, said: “No woman should have to think twice before expressing herself online, worry about an abuser tracking her location, or face the trauma of a deepfake intimate image of herself being shared without her consent.
“Yet these are some of the very real online risks that women and girls face today – and many tech companies are failing to act.
“Our practical guidance is a call to action for online services - setting a new and ambitious standard for women and girls’ online safety. There’s not only a moral imperative for tech firms to protect the interests of female users, but it also makes sound commercial sense – fostering greater trust and engagement with a significant proportion of their customer base.”
For more from Glamour UK's Lucy Morgan, follow her on Instagram @lucyalexxandra.


