Digital censorship of intimate health: women's issues deemed undesirable
Women's health topics concern more than 50 percent of the world's population, yet unlike search terms such as "erection", they are systematically censored.
Potency pills yes, tampons no – these often seem to be the rules for online advertising.
(Image: fizkes/Shutterstock.com)
An info post about endometriosis. An ad for menstrual underwear. An influencer campaign on female pleasure. All three were deleted, restricted or algorithmically “throttled” – not because they violated community standards, but because they were classified as “too sexual”, “political” or simply “inappropriate” by algorithms and platform employees.
These are not isolated cases, but part of a larger structural problem: the digital censorship of sexual and reproductive health content, especially when it comes to women, non-binary people and other people with vulvas. Recently, the US Center for Intimacy Justice (CIJ) published a report on the algorithmic suppression of information and ads about female intimate health by major platforms: "The Digital Gag: Suppression of Sexual and Reproductive Health on Meta, TikTok, Amazon, and Google."
Of the 159 nonprofit organizations, influencers, sex coaches and companies surveyed, more than half said that their ads on intimate health had been rejected, shadowbanned or age-restricted – disproportionately often when they were aimed at women or queer target groups.
In its analysis, the CIJ documents how medical content is conflated with pornography – for example, when a post on clitoral anatomy is deleted while ads for Viagra run unhindered. The study also finds that many companies self-censor ("shadow edit") their content on social media to avoid algorithmic blocks – often at the expense of medical accuracy or linguistic clarity.
“This censorship is no coincidence, but the result of historically evolved gender biases in platform guidelines, AI systems and advertising rules,” the report states. Particularly problematic: medical content such as information about endometriosis, contraception, or menopause is also affected. This massively restricts access to relevant health information, especially for marginalized groups who are already less well served.
Known problems in German-speaking countries
Melanie Eichhorn, sexologist at Satisfyer, a manufacturer of sex toys, says when asked: “We have been committed to providing education about sexuality and health for years. Unfortunately, we keep coming up against obstacles: Our content is restricted on social media or not displayed by search engines – even though many people are looking for serious information there.” These digital restrictions hinder important sex education work, which is often still neglected in schools, families, and society.
Katharina C. Trebitsch, co-founder of Nevernot, a manufacturer of soft tampons, also reports repeated rejections by Meta: “Our ads for intimate care products were regularly rejected or restricted. This makes it difficult to be visible and contributes to the fact that the topic continues to be taboo.” From a business perspective, it is almost impossible to scale a D2C brand in this area via Meta – which is why the move towards retail was all the more important for the company.
Roo Waissi, PR Manager Europe at erotic mail order company Lovehoney, is similarly critical of the systemic hurdles: "Our content is often blocked – not because it is explicit, but because we use anatomically correct terms such as 'vulva' or 'clitoris'. Even an Instagram post about endometriosis was deleted by Meta because one slide had the line 'If in doubt, contact your gynecologist'. We now even censor words like 'period' or 'cycle' so that our content is not deleted. At the same time, we see that products for men such as sexual enhancers can be advertised without any problems – even with suggestive language. This shows the double standards very clearly."
Economic impact for companies
The economic consequences of digital censorship are considerable: start-ups lose reach, invest inefficiently in ads and have to take creative detours to remain visible at all. At the same time, big players such as Amazon benefit by offering the same products in a more neutral way – and being favored algorithmically. This is not just about advertising, but also about knowledge, access, and justice. If a post about female sexuality is considered “offensive”, this is not just an issue of platform policy, but an expression of a social bias that wants to continue to standardize and control women's bodies.
For the first time, conditions in Europe could now change: the Digital Services Act (DSA), fully in force since 2024, obliges large platforms to be more transparent about content moderation and to provide effective complaints mechanisms. The CIJ sees this as an opportunity not only to uncover structural censorship, but also to challenge it legally.
In cooperation with the WHO and UNESCO, the CIJ is planning new studies on the digital suppression of sexual education worldwide. In the EU, the NGO is explicitly calling for platforms to disclose their moderation algorithms – especially when it comes to health content aimed at women.
Meta has set up the Oversight Board, an independent body that can be appealed to in disputes. It has already criticized Meta's guidelines on bare breasts on several occasions.
(dmk)