This article references rape and sexual abuse.
For a decade, Dominique Pélicot orchestrated the drugging and rape of his wife, Gisèle Pélicot, by 72 men he recruited through the notorious Coco website (now shut down). Fifty of them were identified from the 20,000 videos he meticulously filmed. And we learned all of this through a public trial, in which all 51 defendants, including him, were found guilty.
As an activist who started Chayn, a non-profit working against gender-based violence and, specifically, technology-facilitated gender-based violence, I am pleased to see this trial shatter the myth that these crimes happen only on the dark web and are committed only by “monsters”.
While lawmakers and law enforcement have been going after the 4chans of the world, hundreds of websites like Coco have flourished right under our gaze, slipping through antiquated legal cracks that absolve platforms of any responsibility to moderate and prevent these crimes. Labelling the men who partake in such misogynistic and violent websites as monsters – as the press has labelled Dominique Pélicot, the “Monster of Avignon” – might feel comforting (surely these men are few and lonesome, “mad” men), but in this case that myth, too, has been unravelled quite painfully.
The men who chose to rape an unconscious Gisèle Pélicot had all visited the unmoderated Coco website and spoken to Pélicot in a forum called “without her knowledge”. The website, founded in 2003, had long been known to law enforcement, with multiple complaints from NGOs that monitored it.
Between January 2021 and May 2024, over 23,000 legal proceedings were opened against Coco by 480 victims. The platform was unmoderated, and in the instances that a user was banned from the site for breaking the code of conduct, paying a mere €10 would reinstate the account. Meaningful, effectively enforced content moderation and codes of conduct have long been an area of concern and campaigning for civil society and affected groups.
But Coco is just one of many such sites. During the Pélicot trial, a CNN investigation found that even on just one website (not on the dark web), users showing similar patterns were actively discussing rape and sexual abuse. And a recently exposed Telegram chat group – private and unmoderated, with more than 70,000 members – was found to be a hotbed of discussions in which men described raping their sisters and mothers and offered up their wives to be sexually abused by others. The underbelly of the internet isn’t just the dark web; it can be the everyday platforms operating in plain sight.
“The protests all over France showed that Gisèle is not alone. Every victim deserves to have that support.”

Technology-facilitated gender-based violence rarely plays out on a single platform in a single country. It’s more complex: a conversation may start on a Reddit forum and move to a private channel, like Telegram or an obscure messaging app. The users can be within a 30-mile radius, as with Dominique Pélicot, or spread worldwide, making it harder for platforms and law enforcement to track and charge them. This is why a multi-platform, multi-jurisdictional approach is imperative to tackling online violence.
There are signs that policymakers and technologists alike are taking this form of technology-facilitated gender-based violence seriously, with a slew of global and regional laws and regulations tackling specific forms of abuse, from upskirting (it was a French upskirting law that first caught Dominique Pélicot) to image-based abuse (sometimes called ‘revenge porn’) in South Korea, Australia, the UK and the US.
In the UK, Stephen Bear was jailed for 21 months in 2023 after he was found guilty of uploading footage of himself and Love Island personality and GLAMOUR Woman of the Year Georgia Harrison to OnlyFans without her consent. On the other side of the world, the punishment can be much more severe: a man in Pakistan was recently sentenced to nine years in prison on three counts of sharing intimate images and videos of his former fiancée on Facebook and Instagram.
For a long time, large social media platforms have fought legal battles to deny responsibility for content posted by users (invoking, for example, Section 230 in the US). However, the European Union’s Digital Services Act, the Online Safety Acts in the UK and Australia, and federal and state-level bills in the US have created a powerful shift: companies are now held responsible for ensuring their platforms are safe and moderated, and are liable for fines if they are not.

The fight to distinguish abusive from non-abusive material among the billions of pieces of multilingual user-generated content posted every hour is not easy – for large and small technology companies alike.
From Bumble’s auto-blurring feature (which detects and blurs genitals in pictures sent by direct message), to improved age-verification technology, image-hashing to detect child abuse images across platforms, and TikTok’s algorithms detecting and removing 80% of guideline-violating content – technology is going to be an undeniable part of the solution to tackling the problem at scale.
But there’s still the worry that too much reliance on AI, and the cutting of human content moderators, is counter-productive: context is often the missing link, especially for non-European languages and culturally specific harms in the Global South (most AI is developed in the West and trained on Western data).
These initiatives will fall short unless technology platforms and regulators work closely with survivors and civil society groups like ours. Cases like Dominique Pélicot’s expose realities we often talk about but that rarely make the headlines – giving us hope that action will be prompt and forceful.
Chayn is a global non-profit that supports survivors of gender-based violence through free online resources in many languages. This includes Bloom, which has courses on healing from trauma from domestic abuse, sexual violence and image-based abuse.
For more information about reporting and recovering from rape and sexual abuse, you can contact Rape Crisis on 0808 500 2222.
If you have been sexually assaulted, you can find your nearest Sexual Assault Referral Centre here. You can also find support at your local GP, voluntary organisations such as Rape Crisis, Women's Aid, and Victim Support, and you can report it to the police (if you choose) here.



