Seyi Akiwowo: ‘We need to rethink how we define women’s freedom and expression online’

The stakes have never been higher in the fight for online safety.
Photo by Tanda Kabanda

The digital world is becoming an increasingly dangerous and uncertain space, particularly for women. We probably don't need to remind you of the potential pitfalls of spending time online: research by the Open University found that 15% of women have experienced online violence.

We know just how easy it is for people to track us down from our social media posts. In 2023, Instagram failed to tackle a network of accounts seeking underage sexual content. Pornographic deepfake technology is already a massive threat to all women, whether they have an online presence or not. One of this year's most talked-about shows, Adolescence, also served as a reminder that unregulated online spaces can quickly become locales for far-right, misogynistic radicalisation of young boys and men.

But, of course, technology is an essential part of our day-to-day lives. As pleasant as going off-grid may sound, it's probably not practical. We need technology. But we also need it to be safe.

Seyi Akiwowo is a tech and democracy advisor whose work focuses on creating a safe digital world. Author of How to Stay Safe Online, founder of Glitch, the UK charity that helped shape the Online Safety Act, and creator of the Digital Citizens Circle, she offers real hope in what can often feel like an insurmountable challenge. This June, she was a speaker at the Cannes Lions Festival, taking part in discussions and roundtables on youth safety and responsible investment, “challenging the room to move beyond performative safety and toward industry-wide accountability.”

This year, Akiwowo is thinking big picture. “It’s time to consolidate and work in partnership with global changemakers, power players, and leaders committed to systemic impact,” she said, adding, “Safety, joy, and justice online aren't nice-to-haves, they're essential. My belief didn't come from a textbook, but from experiencing both the infinite possibilities of the internet and enduring its hardest harms.”

We spoke to Akiwowo about how she's pushing for a safer digital world and why the stakes in the fight for online safety are so high for women, particularly marginalised women.


What are the main challenges facing women, particularly Black women, in today's digital landscape?

Over the last 8 to 10 years, we’ve seen some progress from tech platforms: improvements to content moderation, policy changes, and more partnerships with marginalised communities. But in 2025, the real challenge isn’t just drafting better policies. It’s how these policies are enforced, the standards behind that enforcement, and whether it’s genuinely about safety — or just a box-ticking exercise ahead of regulation.

For Black women especially, the stakes couldn’t be higher. We sit at the sharp intersection of racism and misogyny, what many call misogynoir — and too often, that reality is ignored when safety tools are designed or moderation policies are rolled out. Regulators are only just starting to catch up with how online misogyny affects different women and gender-expansive communities, including through the new Code of Practice on Violence Against Women and Girls in the UK’s Online Safety Act.

Nuance is crucial. Without it, safety measures can become another form of harm — policing Black voices online, silencing joy, protest, or cultural expression in the name of protection.

Take the recent discourse around the Diddy case. It exposed serious fault lines: how unchecked misogynoir, banter culture, and meme-driven abuse flourish even in spaces created by and for Black people. So we have to ask: are platforms and regulators really ready to deal with that complexity? Are AI tools and human moderators being properly trained to spot harmful narratives without reinforcing bias? Are we building systems that understand Black communities — or simply ones that monitor us? Until we stop treating Black women’s safety as a side note — both online and offline — we’ll keep mistaking control for care.


What do joy and justice look like online?

Joy and justice online happen when trust and safety are treated as business fundamentals — not afterthoughts. When users feel safe to express themselves, to be fully human, to make mistakes, to be inspired by healthy role models and uplifting conversation — they stay longer, spend more, and become brand advocates.

Platforms that optimise for time well spent rather than raw watch-hours tend to see higher retention and lower churn. Pinterest, for example, reduced self-harm searches by 80% after shifting to wellbeing-first metrics — and daily active users increased. The takeaway? Healthier feeds mean healthier audiences that stay.

In 2021, as Founder of Glitch, I partnered with BT Sport and EE to launch the award-winning Draw the Line campaign. We combined real-time abuse detection with practical user actions: Spot. Report. Support. The campaign generated nearly £1 million in earned media and led to a 25% drop in abusive tweets within 48 hours of launch.

Justice online means moderating both context and ideology. AI models trained with dialect-rich datasets can reduce false positives for Black British vernacular — safeguarding freedom of expression while still tackling harmful content and negative stereotypes. I personally long for the day algorithms stop perpetuating misogynoir and misogynistic content — either by failing to take it down or, worse, amplifying the “strong, angry Black woman” trope. Balanced systems avoid over-policing marginalised communities and help brands steer clear of the reputational crises that follow when safety misfires.

How do we protect wellbeing?

Shift the success metrics. Track “positive engagement” like saves and meaningful comments, not just rage clicks.

Budget for care. Ring-fence at least 5% of campaign spend for safety tooling, moderator training, and community education.

Design deliberate pauses. Introduce friction points like daily scroll limits and wellbeing nudges that encourage rest without harming revenue.


Co-create with users. Bring those most affected by online harms into your product labs. Their insight reduces rework and reputational risk.

Audit. Publish. Improve. Treat wellbeing metrics like carbon data: disclose them, benchmark progress, and iterate transparently.

When brands treat digital wellbeing as a growth lever, joy and justice stop being nice-to-haves. They become strategic advantages — powering more resilient platforms, stronger user communities, and a healthier internet for the next generation.

What needs to change for women — especially Black and marginalised women — to participate as freely as men?

First, I think we need to be curious about the question itself. If the benchmark is to be as free as men, we’re aiming too low. Patriarchy doesn’t just restrict women, it also distorts how masculinity and gender-expansive identities are expressed online. It punishes vulnerability, narrows self-expression, and fuels harmful dynamics across the board.

So the long-term goal isn’t simply inclusion in a flawed system. It’s a fundamental redesign, one grounded in care, autonomy, humanity, and accountability.

That shift starts with changing who gets to build, moderate, and monetise digital spaces. We need more values-driven people — especially those with lived experience — shaping product design, trust and safety roles, investment committees, and boardrooms. Not as tokens, but as decision-makers.

And just as importantly, we need to rethink how we define women’s freedom and expression online. It’s not about copying how men operate under patriarchal systems. It’s about creating digital environments where all identities — especially those historically marginalised — can show up without fear, performativity, or erasure.

That’s the bar. And it’s time we raised it.


Seyi Akiwowo is an award-winning online safety strategist, speaker, and author of How to Stay Safe Online. As the founder of Glitch and creator of the Digital Citizens Circle, she advises global brands and platforms on building responsible tech and supporting safer online communities.

After eight years of setting up #FixTheGlitch and founding Glitch the charity, Seyi transitioned on 1 January to set up 21/20 Studios — a creative critical friend providing strategic advice to brands, platforms, and policymakers wanting to shape a responsible tech future.

Find out more at seyiakiwowo.com.