Shock horror: AI is over-sexualising and censoring women's bodies

[Image: a woman in a yellow vest behind distorted glass, giving the impression she's been pixelated or censored. Caption: Shock horror – AI tools discriminate against women. Credit: Roc Canals / Getty Images]

When it comes to deciding which pictures and videos we see online, we may like to think we're the ones in control of our big, endless scrolls. In reality, AI (artificial intelligence) algorithms hold much of the power. And, as it turns out, these AI systems are prone to mistakes – mistakes that negatively impact and over-censor women on social media.

It shouldn't come as a huge surprise, given that so many modern advancements in tech, designed to make our lives happier and easier, seem to end up disproportionately and negatively impacting women in some capacity. See our recent reporting on the rise of deepfake pornography, as just one sinister example.

A new piece of research from The Guardian and the Pulitzer Center's AI Accountability Network – building on years of posts from activists and a paper exploring shadowbanning by Dr Carolina Are – found that AI systems used by the likes of Google, Microsoft, Meta (Instagram and Facebook) and LinkedIn have a clear gender bias.

The Guardian's testing confirms that AI tools designed to stop violent, hate-fuelled or unwanted pornographic content from slipping through the net tend to rate content produced by women as more sexual than like-for-like male equivalents – further highlighting the wider problem of the female form being routinely objectified and over-sexualised. Need an example? Think: a photo of a woman in a bikini will likely be deemed more explicit by an AI algorithm than a photo of a topless man in swimming trunks. Ditto a woman in a sports bra and shorts vs. a topless man in workout shorts.
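To see how such a like-for-like comparison can be run in practice, here's a minimal sketch using Google Cloud Vision's SafeSearch feature – one of the tools the investigation examined – to compare the 'racy' rating it assigns to two photos. It assumes the google-cloud-vision Python package is installed and Google Cloud credentials are configured; the image file names are hypothetical placeholders.

```python
# Minimal sketch: compare SafeSearch 'racy' ratings for two local images.
# Assumes `pip install google-cloud-vision` and configured credentials;
# the file names below are hypothetical placeholders.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

def racy_rating(path: str) -> str:
    """Return the SafeSearch 'racy' likelihood for one local image."""
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    annotation = client.safe_search_detection(image=image).safe_search_annotation
    # Likelihood runs from VERY_UNLIKELY up to VERY_LIKELY.
    return vision.Likelihood(annotation.racy).name

for path in ["woman_in_bikini.jpg", "man_in_swim_trunks.jpg"]:
    print(path, "->", racy_rating(path))
```

According to the investigation, like-for-like pairs such as these frequently come back with a noticeably higher rating for the woman's photo.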

When these AI tools block content (featuring women) that isn't actually violent, hateful or pornographic, the fallout can be particularly harmful for many people online. That could be business owners and creatives, like photographers who rely on Instagram to promote their work. It can also impact the LGBTQIA+ community. And it's a real cause for concern for sex workers, who use social media to advertise their services and who are being censored to the nth degree (irrespective of how 'racy' their content actually is).

Fitness content, bare-skin pregnancy and breastfeeding images, and art or fashion pictures (that show a hint of nipple *eye roll*) are amongst those routinely buried in cyberspace too; even medical photos can be incorrectly labelled by AI tools. For their Guardian investigation, AI experts Gianluca Mauro, founder of AI Academy, and journalist Hilke Schellmann confirmed that one image demonstrating how to carry out a breast exam was rated by both Microsoft and Amazon's AI tools as 'sexually explicit'. It's something we've long known about here at Cosmopolitan too, and have to keep in mind when choosing imagery to illustrate health stories.
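For a sense of what that misclassification looks like from the developer's side, here's a similar hedged sketch using Amazon Rekognition's image-moderation endpoint – one of the tools named above – to list the moderation labels it assigns to a single photo. It assumes the boto3 package and AWS credentials are set up; the file name is a hypothetical placeholder.

```python
# Minimal sketch: list Amazon Rekognition's moderation labels for one image.
# Assumes `pip install boto3` and configured AWS credentials;
# the file name below is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("breast_exam_demo.jpg", "rb") as f:
    response = rekognition.detect_moderation_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only report labels the model is at least 50% sure about
    )

# Each label carries a name (e.g. 'Suggestive'), a parent category and a confidence score.
for label in response["ModerationLabels"]:
    print(f'{label["Name"]} (under {label["ParentName"] or "top-level"}): {label["Confidence"]:.1f}%')
```

A medical how-to image coming back with a high-confidence label under a category like 'Explicit Nudity' is exactly the kind of false positive the investigation describes.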

"Policies – or 'community guidelines' – on social media right now lump all nudity in with sexual activity, blending bodies with sex acts, and coding women and LGBTQIA+ bodies as more inherently sexual than men," agrees Dr Are, an Innovation Fellow at Northumbria University's Centre for Digital Citizens. "And often there's no easy way to actually question these decisions with a real human, or to push back if your content gets removed or shadow-banned [suppressed without any notification] unless you have contacts at platforms."

But why is this happening? How can we stop it? Should we always push to stop it?


"Tools categorise women as sex objects, no matter what they're doing"

The fact that the majority of the tech industry's workforce is made up of cis men (who write the code behind these AI algorithms and tell them what is – and isn't – appropriate) could play a key part in why women's bodies are dubbed more sexual than men's, says Dr Are. "I don't believe tech employees actively think: how do we make women's lives harder? But I do think they have unconscious biases, and so the tools they create categorise women as sex objects, no matter what they're doing. My research has also shown LGBTQIA+ expression is often [incorrectly] picked up as sexual."

As well as sharing her research with her 25,000 followers, Dr Are also regularly posts impressive videos and images of herself pole dancing (she's a qualified instructor) – some of which she says have been excluded from Instagram's Explore page when she's used fitness-based hashtags to try and reach a wider audience. "Pole really helped me to love my body after a traumatic experience, and it was crucial for my healing and for finding a community," she shares, adding that Meta have since invited her to feed back on their policies. The issue, it seems, is a real lack of nuance – and a need for more thorough parental controls and content-filtering options.

"I can confidently say that current platform rules, along with their governance frameworks and processes, are not nudity-friendly. Sometimes they're actively sex-negative," Dr Are adds. She cites concerns about sex workers in particular being virtually muted when trying to promote their services (and sustain their livelihoods). She also points out the double standard between celebrities and non-famous folk, using the movie Hustlers as an example. The film sees Jennifer Lopez and Cardi B portraying strippers and pole dancing; trailers for it dominated Instagram, while real strippers and pole dancers lost earnings from being de-platformed, or from having their (like-for-like) content deleted.

"I'm worried about how generic the Online Safety Bill sounds in terms of porn detection too – it dictates that social media bosses will face heavy punishment for 'not protecting children' but gives a very loose definition of what constitutes 'psychological harm'. It feels like machine learning (AI) has become the puritan police and picks up absolutely everything, not just the harmful content it ought to, such as sex trafficking images or videos," Dr Are continues. "Sex workers who promote their camming or content platforms are suffering as a result, as are small businesses selling lingerie or sex toys. Models, photographers, activists, artists, pole dancers too. Yet, strangely, not celebrities posting equally raunchy, if not raunchier, content.

"I'm sympathetic to the fact that platform workers are dealing with huge swathes of content [passed over to the human team after being weeded out by AI algorithms] and that running these spaces is no joke," she adds. "However, it's not right that nudity and sex are automatically put in the box of 'bad things to regulate' and there's no nuance." This lack of refinement is a problem that chatbots, like the much buzzed-about ChatGPT, have been criticised for too.

Of course, none of this is to say that shadowbanning or removing content should be stopped entirely – both serve a purpose in impeding racist, explicit, violent and hateful content. Parental controls matter too, when it comes to concerns around kids and teenagers using social media. But it's clear a lot more needs to be done to fine-tune the tools that hold great sway over what we consume online, and therefore over the information and inspiration we're granted access to.

In response to accusations of gender bias put forth by The Guardian, a Google spokesperson said: "This is a complex and evolving space, and we continue to make meaningful improvements to SafeSearch classifiers to ensure they stay accurate and helpful for everyone."

When approached by Cosmopolitan UK for comment, a spokesperson for Meta said: "We don't allow nudity and sexual activity because of safety considerations around age, consent and exploitation. While we allow sexually suggestive content, we think it's important to set a higher bar for content we recommend when people haven't chosen to follow those accounts. We've also been focused on giving people more transparency and control over their experience on Instagram. We published our recommendations guidelines and launched Account Status (a one-stop shop where you can see if any of your posts have been removed, appeal those decisions, check whether you're at risk of having your account disabled, and find out whether your content is eligible to be recommended – and, if not, take remedial steps).

"Our goal is never to target a particular community, but to make sure Instagram can stay a place where everyone can connect over what they love, while feeling safe."

Meta added that it's a challenge creating an adult nudity and sexual activity policy that caters to all users, who are global and include teenagers as young as thirteen. "We really encourage our community to read our policies to help them understand exactly where we draw the line, so they can avoid inadvertently breaking our rules and having content removed," the Meta spokesperson said. "We also launched the Sensitive Content Control in 2021, which allows people to choose for themselves how much sensitive content (the kind of content that doesn't break our rules but that we try not to recommend) they want to see from accounts they don't follow. Meaning, if adults opt in to the 'more' setting, we'll show them more sensitive content – including more sexually suggestive content – in their recommendations."

As for moving forward, one thing Dr Are says she'd love to see is the end of platforms lumping all nudity in with sex, and of sex and sex work always being seen as harmful. "I'd like them to listen to the most marginalised users, such as sex workers and transgender people, and I'd like them to stop using one-size-fits-all approaches to content moderation, because these only generate false positives. Ask users what they'd like to see online, both when they sign up and regularly thereafter, without pre-empting their choices." Investing in human moderation, along with better customer service to review decisions, would go a long way too, she adds.

This lack of nuance from the AI software that features so prominently in our daily lives, whether we're aware of it or not, is the perfect example of why Big Tech need to make a real effort to protect marginalised groups, and women in general, from bearing the brunt of its downsides. Technology can be a beautiful thing – something that helps us to thrive and live easier, more joyful and more connected lives – but only when it treats everyone as equals, whilst protecting against genuinely harmful imagery in the process. Something we know, sadly, isn't the case right now.

Follow Jennifer on Instagram and Twitter