Pregnant women have a lot to deal with. That’s a sentiment that surely anyone would find hard to disagree with. There are the bits we can all see: the expanding kettlebell strapped to their front, the unshakable exhaustion, the responsibility to keep their sudden Russian Doll-like body safe and healthy. There are also the bits we don’t: the fluids, the constipation, and the haemorrhoids that come out to say hello as you slide into a hot bath.
And now, we’re learning that artificial intelligence tools from tech companies like Google and Microsoft are rating photos and videos on social media as racy and sexual simply because they feature a pregnant belly.
AI tools were originally developed to protect internet users from seeing content that is violent or pornographic in nature. However, according to a new report from The Guardian, they also “rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved.”
“As a result, the social media companies that leverage these or similar algorithms have suppressed the reach of countless images featuring women’s bodies,” the report adds, suggesting that a “built-in gender bias” may be to blame.
I was pregnant myself a couple of years ago, and it makes my skin crawl to imagine my body being sexualised, whether by real people or technology.
My favourite part of pregnancy – and I’ll be honest, there weren’t many – was feeling like I was never alone. I always had this little person with me; someone to touch, someone to talk to, someone who I could feel existing underneath my skin, under my palm. It was a connection like no other. My body was stretching, nurturing, adapting and sacrificing. What it wasn’t doing was anything even remotely sexual or ‘racy’. To put such labels on it repulses me, and it should repulse anyone.
"When I became pregnant the first time, I actually thought I'd get a break from the hyper-sexualisation and objectification of my body.”
Unfortunately, these horrifying revelations aren’t the only cause for concern. As a result of this suppression, posts featuring certain images, including those showing a pregnant belly, may be ‘shadowbanned’. This is when a platform limits the reach of a post without notifying the account holder that it has done so. There are real-world consequences to this; as the report highlights, “it can hurt female-led businesses – further amplifying societal disparities.”
Ashley James is a presenter and DJ with 334K followers on Instagram. A mum of one, she is in her third trimester with her second child. I ask Ashley whether she believes videos of her pregnant belly have ever been shadowbanned. She is adamant that they have, and a quick scroll through her Instagram Reels backs up her suspicions: a video of her heavily pregnant, in underwear, receives 40.7K views, yet it sits among videos where her stomach is covered that have up to 183K views.
“I find it devastating and disgusting that AI is telling us our bodies are nothing more than sexual objects, including when that body is growing or feeding a child. When I became pregnant the first time, I actually thought I'd get a break from the hyper-sexualisation and objectification of my body,” Ashley tells me.
Ashley is a big advocate for showing women’s bodies as they really are: cellulite, scars, rolls, lumps and all. “I want to remind us all that these are not flaws but very normal body parts, and that we don't need to shrink ourselves to be happy and confident and loveable. I think it helps to fight back against unrealistic beauty standards, Photoshop and diet culture.”
Not only has Ashley seemingly had posts suppressed, but she has also received abuse for the content that people were able to see. “The objectification of our bodies and hypersexualisation of breasts is so ingrained in society, even doing something as pure as feeding a baby is sexualised or seen as something you are doing for men. I'd receive a lot of messages saying I was attention-seeking, including from women.”
The report makes it clear that it is not bare skin alone that flags a supposed issue. When a male Guardian journalist posed bare-chested in jeans, Microsoft’s algorithm scored the image at under 22% for the likelihood of raciness. Yet when he put on a plain black bra, the score jumped to 97%. Holding the bra to his side pushed the score to 99%. The algorithm treated the bra as an inherently racy object, not the regular item of clothing it actually is.
This sexualisation of both bodies and objects, built into these AI tools, reflects real life. As Ashley points out to me, “a topless man can walk through the park on a sunny day, and no one bats an eyelid. We don't assume he's doing it for female attention or to objectify him. We just allow him to exist in his body. When will we stop telling women that their bodies are tempting and that’s our fault, and start to challenge and educate men not to objectify us?”
It’s a brilliant thing that these findings are coming to light; if we know these issues exist, we can hold those whose responsibility it is to tackle them to account and ensure meaningful change is made. A spokesperson for Google told The Guardian, “This is a complex and evolving space”; they’re not wrong. But if these tools can be created, they can also be developed and adapted. We should all be watching closely to ensure that this happens.

