Three years ago, I published How to Stay Safe Online, a book that explored how to build boundaries, resilience, and community in the digital world. I wrote it as a survival guide, but also as a manifesto for collective care. As I celebrate its anniversary, I’ve found myself returning to the same question that inspired it: why are we still being told that safety must come at the expense of privacy?
That false choice continues to shape how online safety is debated, especially in the wake of the UK’s new age-verification policies. The outrage wasn’t really about privacy; it was about friction. People were furious at the idea of having to pause before accessing porn, as if the inconvenience itself were an attack on liberty. Yet this outrage plays out in a context where violent porn is being algorithmically amplified and normalised, feeding a culture of violent misogyny that young people then internalise.
I worry about what it says when we’re more comfortable defending “frictionless” access to porn than defending children’s right to safety. If we refuse to evolve, we’re asking the next generation to navigate a Wild West Web alone — one built on profit, not principle. Digital citizenship, to me, has always been about collective responsibility: a social contract between platforms, policymakers, and people.

When I spoke with Carolina Are, an academic and content creator who’s long advocated for sex workers’ digital rights, she reminded me that “safety” isn’t a universal experience — especially when your livelihood depends on being visible online.
“Sex workers,” she said, “often exist at the intersection of surveillance and safety. We’re told these systems are for our protection, but they end up policing our bodies instead.” Carolina’s words stayed with me — because they expose the blind spot in so many online safety policies: that women’s safety is too often built on someone else’s control.
Azmina’s reflection echoed this perfectly:
“My experience in early debates over freedom of expression and online violence taught me how easily women’s voices can get sidelined when rights are framed as competing rather than connected... We must hold both truths at once, and co-design policies that see privacy as part of safety.”
Listening to her, I thought about how the Online Safety Act was sold as a fix for everything — from child protection to terrorism — yet still left so many people, especially women and survivors, feeling unheard. It’s not that regulation isn’t needed; it’s that when “safety” is defined without us, it’s rarely built for us.
When Sabrina and I spoke, we kept circling back to one idea: if you frame privacy as a compliance issue, you miss its human impact. We’ve built an online world where “consent” is a box-ticking exercise — scroll, click ‘accept’, move on. But what if privacy were part of a public-health model for digital life? One built around prevention, intervention, and recovery rather than paperwork and punishment.
Sabrina said it best: “The system treats breaches like isolated illnesses, when really they’re symptoms of the same environment — an unhealthy internet.” I found myself nodding, thinking about the times I’ve delivered digital-citizenship workshops where young people described burnout from constant exposure, not just harm from a single incident.
Julie Dawson from Yoti, where I served for six years as an advisor on its ethics board, and Jarek Sygitowicz from Authologic offered two sides of the same coin. Julie argued that ethical design has to start with dignity, not data. Jarek described how the next generation of e-IDs can prove you’re over 18 without revealing who you are. Both saw the potential for safety and privacy to reinforce each other, not compete.
Still, I keep hearing policymakers talk about “compliance” like it’s care. It’s not. Real care is making sure that when anyone, especially those the system leaves most vulnerable, reports abuse or a data leak, they’re met with empathy, not bureaucracy.

When I asked David Babbs, long-time campaigner and civic-tech leader, whether the growing number of “ethical tech” pledges were making any real difference, he smiled and said: “Not if the business model stays the same.” He’s right. For all the glossy sustainability reports and pastel mission statements, most tech platforms still rely on outrage and attention — the very things that make us all feel unsafe online.
Ethical tech is easy to say. Safe tech costs money. And that’s the crux of it: responsibility without redistribution is just PR. David argued that as long as ad-driven profits depend on engagement, tech companies will keep building for addiction, not wellbeing.
That tension came through again in my conversation with powerhouse writer Lovette Jallow, who spoke about being neurodivergent online. “The digital world is built for speed, not for sensitivity,” she said. “When you’re neurodivergent, the pace of the internet isn’t just overwhelming — it’s alienating.” Her words reminded me that conversations about safety can’t stop at content moderation; they have to include design itself.
So much of what we call “safety” is really about friction — slowing down harmful interactions before they spiral. But in an economy where attention equals revenue, friction is treated like failure. We can do better than that: co-designing safety frameworks around care, not just harm reduction.

In one of my conversations, Hannah Swirsky from the Internet Watch Foundation told me: “The biggest mistake we make is assuming everyone experiences risk the same way.” She’s right. Safety frameworks tend to flatten differences, turning nuance into checkbox diversity.
Madhuri, co-founder of the content-moderation tool Welivedit.AI, expanded on this: “Policies rarely account for sensory overload, language barriers, or the way some of us process harm.” Safety isn’t just about blocking content — it’s about designing systems that don’t exclude people by default.
I’ve been thinking about how movements led by women, queer people, and disabled creators are already modelling this. They build safety through community, not code; through joy, not just justice. Carolina Are called this “hope as infrastructure,” and I love that phrase. Because the most radical thing we can do in tech right now might not be inventing the next algorithm — it might be resourcing the people who keep others safe.

Every conversation I’ve had over the last few months — whether with policy leads, sex workers, technologists or parents — kept coming back to the same truth: online safety isn’t a destination, it’s an evolving relationship. Between users and platforms, between communities and policymakers, between privacy and accountability.
I don’t want to live in a world where safety means surveillance, or where privacy means isolation. As digital citizens, we deserve both. And that’s the future worth fighting for: one where regulation protects without punishing, and connection doesn’t come at the cost of dignity.
Because the truth is, no one wins in the false choice between safety and freedom. But if we centre care, equity, and imagination, we might just build an internet that lets us all breathe easier.

