My Instagram and TikTok accounts learned I was pregnant the day I took a positive test. Through my behaviours — like my searches to discover morning sickness remedies beyond potato waffles — I had been drip-feeding my accounts with intimate details about my body. This was fine at first: fuelled by my new dose of pregnancy hormones, I enjoyed tearing up at videos of pregnancy announcements, tiny crocheted cardigans and pictures of tastefully curated nurseries. But as I progressed through my pregnancy, the platforms’ algorithms turned on me.
I don’t think I’d even had my first midwife appointment before both my Instagram Explore Page and TikTok For You feed began to recommend traumatic pregnancy-related posts. I learned about women who had experienced life-threatening births, saw infographics detailing rare pregnancy-related illnesses, and encountered devastating posts from families who had lost a baby. To be clear, people should be allowed to share these stories, and there is demonstrable value in social media for those experiencing trauma and grief. The point I’m trying to make here is that I hadn’t yet experienced any of these things, so why was I being shown them?
If you are a frequent social media user, you will know that platform companies want to learn everything about you, algorithmically recommending what they deem to be relevant content to keep you using their services instead of their competitors’. But the problem is that their technologies aren’t capable — perhaps through lack of advancement, perhaps through lack of trying — of discriminating between helpful and harmful content recommendations.
I know what you’re thinking: clearly I’d been searching for these bad things in the early stages of my pregnancy and my social media accounts were simply showing me what they thought I wanted to see. But here’s the thing: I hadn’t. Instagram and TikTok taught me about things I didn’t even know existed and thus couldn’t possibly have searched for. I had barely missed my period, let alone searched about the warning signs of polyhydramnios (a condition I was diagnosed with at 36 weeks but had already heard about through, you guessed it, a content recommendation). The problem, I came to realise, was that I had lingered on some traumatic posts that were recommended to me, likely out of shock, effectively sealing my fate.
This behaviour — unconsciously scrolling more slowly over certain posts — is known in the search engine optimisation (SEO) world as ‘dwell time’. But crude metrics on people’s dwell time risk glossing over an uncomfortable truth: by lingering, people are not necessarily liking the content they are seeing. In the early stages of my pregnancy, my Instagram and TikTok accounts knew I was only sharing or hitting the ‘like’ button on wholesome content about becoming a mother. The platforms had more than enough information to avoid torturing me; they knew full well what I wanted to see and when I wanted to see it. But they chose to ignore the data they had. They chose to be cruel.
This is not a new phenomenon. As Washington Post writer and then-new dad Geoffrey A. Fowler similarly described, within weeks of posting pictures of his new baby to Instagram, his Explore Page began showing him ‘babies with severe and uncommon health conditions, preying on my new-parent vulnerability to the suffering of children. My baby album was becoming a nightmare machine’. He, too, had lingered. And while I don’t believe social media use can ever be risk-free, I have to ask: isn’t there a danger in recommending content to people who are not engaging with it beyond dwell time?
Social media platforms offer lots of so-called solutions to the problems I describe here. On Instagram, for example, you can reset your content recommendations: they ‘will start to personalise again over time, showing new content based on the content and accounts you interact with’. And on TikTok, you can block yourself from being shown content that contains hashtags and/or keywords of your choosing. But a simple dwell on a post that stuns you into inaction can reverse these careful choices and, of course, you can only block things you’re already aware of.
What I’ll be taking away from this experience is a healthy reminder that social media platforms do not reflect reality. We typically talk about the concept of ‘social media versus reality’ in relation to the content people share, which can depict enviable lifestyles. But we must also apply this notion to the mechanics of the platforms we use. Through their recommendation algorithms, social media platforms take elements of the truth but then distort and reflect them back to us, like a funhouse mirror rather than a normal one. It doesn’t matter so much when platforms do this with trivial subjects, but it matters a great deal when it comes to our health, our bodies, and the things that make us most vulnerable.