Why Vienna Museums Moved to OnlyFans & I'm Basically One Keyword Away From the Same Fate. Art Censorship in Social Media Exposed. 18+

21 min read · By Demetrio
art history
social media censorship
content creation
shadowbanning
beauty fashion
algorithm moderation
digital art

"Violating Community Guidelines": When Algorithms Can't Tell Art From Harm

Every day, I get up, put on my metaphorical beret & sunglasses, and make short videos about beauty, fashion, style, art, and culture. No gore, no porn, no Trump, no hate speech—just vibes and art history.

And yet: my posts about the 1913 Armory Show, photographers Oliviero Toscani and Diane Arbus, Austrian painter Oskar Kokoschka, filmmaker Pier Paolo Pasolini, and even Michelangelo (the Renaissance one, not the Ninja Turtle) have been flagged as "violating community guidelines," blocked, or quietly buried. My accounts on Instagram, TikTok, and YouTube go through these cursed "shadow‑ban seasons" where reach suddenly dies, hashtags don't show my posts, and videos mysteriously refuse to be boosted.

Eventually I realised it's not just the imagery. Certain words in my speech—"war," "racism," "sexuality," "AIDS," "death," "nude/nudity"—seem to set off some invisible alarm. The algorithm hears "art history" and apparently understands "red flag."

And the plot twist: this is not just a "me problem." It's a whole structural clown show. Only Fools and Horses could write this script.

When Museums Get Censored, You Know It's Not Just You

Before I drag my own account, let's look at who else the platforms are fumbling: literal museums.

  • In 2018, Facebook removed images of the Venus of Willendorf, a 30,000‑year‑old prehistoric figurine in Vienna, for "pornographic" nudity. Yes, a stone fertility figure survived thousands of years only to get cancelled by Face-freakin-book. The museum basically clapped back: "Let the Venus be naked!"[1][2][3][4]

  • The Museum of Art and History in Geneva tried to run ads featuring a semi‑nude Venus and a nude Roman captive for their "Caesar and the Rhône" show. Facebook blocked the ads and told them: "We don't allow ads that depict nudity, even if it isn't sexual in nature. This includes the use of nudity for artistic or educational purposes." Translation: the AI has no chill.[5][6]

  • A French teacher posted Gustave Courbet's L'Origine du monde and got his Facebook account shut down. After years of legal drama, a French court finally decided Facebook was wrong to censor him. Imagine going to court over a 19th‑century painting because the platform can't tell art from Pornhub.[7][8][9]

  • Boston's Museum of Fine Arts posted nude photographs by Imogen Cunningham (BTW, one of my fav photographers ever) from a legit exhibition; Instagram deleted them. The museum's curator said, "We felt this fight was long since over." Apparently, it's not.[10][11][12]

If the algorithm can't distinguish a prehistoric figurine, a Courbet, or a museum promo from "adult content," my little reel about Michelangelo never stood a chance.

Comic Sans Tea Alert

Demetrio Koro

Meta's AI is trained on billions of images, but it still mistakes classical statues for porn because it prioritizes 'brand safety' over context. Rumour has it they even have a secret 'nudity score' that flags anything with a high ratio of skin‑tone pixels, even sculptures—hence why Vienna's museums had to start an OnlyFans to post Rubens and Schiele without drama. Wild.

My "Dangerous" Posts: Art History, But Make It Censored

So here's the stuff of mine that keeps getting slapped with "violates community guidelines" or just dies in silence:

  • A reel on the Armory Show (1913): talking about how it freaked out American audiences with European modernism, war anxieties, and new ways of showing the body.

  • A video on Oliviero Toscani, whose Benetton campaigns tackled racism, AIDS, and death—literally using advertising to critique society.

  • A short on Diane Arbus, whose raw portraits of freaks, transvestites, giants, and mental patients force us to stare at society's outsiders, loaded with themes of sexuality, death, and human weirdness that make normies uncomfortable.

  • A piece on Oskar Kokoschka, full of expressionist drama, violence, and romantic obsession.

  • A piece on Pier Paolo Pasolini, where I mention sexuality, Catholic guilt, fascism, and class.

  • A post about Michelangelo, focusing on the politics of the nude in classical sculpture.

These are all things you'd see in a museum, a film festival, or an art‑history class. But the captions and audio are sprinkled with exactly the kind of words that moderation systems hate: war, racism, sexuality, AIDS, death, nude.

1913: The Armory Show introduced the US to colors and shapes that did not make sense until they did

Oliviero Toscani: Photographer so scandalous this vid got removed 4 times

Diane Arbus: The Lens That Redefined Freak Chic and Authentic Glow

Oskar Kokoschka raw expressionism created the ugly-beautiful style vibe

Pasolini Raw Proletarian Chic: Beauty Born in Rome Slums

Born 1475: Michelangelo Divine Glow Still Slays Today

From my side, I'm like: "It's literally art history, babe." From the algorithm's side, it's like: "Alert: danger, conflict, sex, disease, death, nipples."

Comic Sans Tea Alert

Demetrio Koro

YouTube's demonetization lists are supposedly secret, but creators have reverse‑engineered them: words like 'AIDS', 'Afghanistan', 'Baghdad', and even 'arthritis' get flagged as 'sensitive' alongside porn and slurs. One YouTuber found a boat video demonetized because 'strip' sounded too sexy. The tea? They're not fixing it because advertisers love the overkill.

Want to see the "dangerous" art history content for yourself?

Follow me on Instagram, TikTok, and YouTube to catch all my daily shorts and reels—flagged ones included. You'll see exactly what trips the algorithms and why I refuse to stop talking about the messy, beautiful truth behind beauty and culture.

How Benetton Changed Advertising - Oliviero Toscani Provocative Ad Campaigns

Almanac: When modern art shook New York

Oskar Kokoschka: Bad Boy of Viennese Modernism

Michelangelo David - fine art or porn?

Masters of photography - Diane Arbus (documentary, 1972)

How Moderation Actually Works (Spoiler: It's Mostly Vibes And Keywords)

On Instagram, TikTok, YouTube, and Facebook, moderation isn't one wise curator thinking things through. It's layers of AI panic.

Roughly, it goes like this:

  1. AI watches and reads everything: images, video frames, auto‑captions, titles, hashtags.
  2. It assigns risk scores: nudity/sexual content, hate speech, "sensitive events" (war, terrorism, mass death), self‑harm, etc.[13][14]
  3. Other systems decide:
    • Do we recommend this to strangers or quietly keep it in a corner?
    • Do we give it ads, limited ads, or none?
  4. Humans come in later, maybe, if there's an appeal or a big scandal.
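As a toy illustration of those layers, here's a minimal sketch in Python. The category names, keyword lists, and thresholds are all invented for this example; real platforms use machine‑learning classifiers, not keyword tables, but the routing structure (score first, decide distribution second, context nowhere) is the point.

```python
# Toy sketch of a layered moderation pipeline. Category names, keywords,
# and thresholds are invented; real platforms use ML classifiers.

SENSITIVE_KEYWORDS = {
    "nudity": {"nude", "nudity", "naked"},
    "sensitive_events": {"war", "death", "terrorism"},
    "health": {"aids", "disease"},
}

def risk_scores(caption: str) -> dict:
    """Steps 1-2: read the text and assign a crude 0-1 score per category."""
    words = set(caption.lower().split())
    return {
        category: min(1.0, 0.5 * len(words & keywords))
        for category, keywords in SENSITIVE_KEYWORDS.items()
    }

def route(scores: dict) -> str:
    """Step 3: decide distribution. Note there is no 'art context' branch."""
    worst = max(scores.values())
    if worst >= 1.0:
        return "removed"          # hard block, user gets notified
    if worst >= 0.5:
        return "not_recommended"  # the quiet shadowban tier
    return "recommended"

caption = "Michelangelo and the politics of the nude in classical sculpture"
print(route(risk_scores(caption)))  # → not_recommended
```

One word ("nude") is enough to demote the whole post, and nothing in the pipeline ever asks whether the caption is about a fresco or a centerfold.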

Instagram / Meta: "Art Is Allowed, Except When It's Not"

Instagram's rules say "nudity in photos of paintings and sculptures is OK," while also saying "we don't allow nudity on Instagram," with a few exceptions like breastfeeding. That's already a mess, and enforcement leans hard toward the second part.[15][16][17]

  • Museums and galleries have had nudes removed over and over, despite supposedly fitting the "art" exception.[6][11][12][5][10]
  • Feminist art, body‑positive work, anything with female nipples or pubic hair gets zapped constantly.[16][18][19][20][15]

And then there's the fun secret layer: even if your post isn't removed, it can be tagged internally as "not recommendable." Under the EU's Digital Services Act, Instagram now shows more of this behind‑the‑scenes rating, and artists are discovering that their content has been quietly blocked from reaching non‑followers for ages. That's the fancy, bureaucratic version of a shadowban.[21]

TikTok: Brand‑Safe First, Nuance Later (Maybe)

TikTok's rules around "Sensitive and Mature Themes" ban nudity and semi‑nudity and restrict non‑graphic sexual content from recommendation, with vague exemptions for art and education. The platform is obsessed with staying "family‑friendly."[22][23]

Investigations showed that TikTok:

  • Treated LGBTQ hashtags like "gay," "lesbian," and "transgender" in several languages the same way it treated "terrorist groups, illicit substances and swear words," making content under those tags unsearchable.[24][25][26]
  • Later admitted this was happening and called it a moderation error.[26][24]

So if I talk about sexuality, AIDS, or queer topics in an art context, the system might not understand "context." It just sees "not brand‑safe."

YouTube: The Adpocalypse Brain

YouTube is where you can see the keyword paranoia in HD.

After the "Adpocalypse," advertisers got tools to block entire categories like:

  • "Tragedy and conflict"
  • "Sensitive social issues"
  • "Sexually suggestive content"
  • "Profanity and rough language"[27]

"Sensitive social issues" explicitly includes wars, conflicts, tragedies, and sexual abuse, even in news, documentaries, or educational content. So if your title says "war," "racism," "AIDS," or "death," the system often treats you like you're not ad‑friendly.[27]

Robin Thicke - Blurred Lines (Unrated Version) ft. T.I., Pharrell Williams (aha, you can't see this here)

Creators and researchers noticed a pattern:

  • Once a video is demonetized, its reach from recommendations often falls off a cliff.[27]
  • A boat‑building channel got flagged because the word "strip" in "strip‑built kayak" looked sexual to the system.[27]
  • LGBTQ creators saw videos with words like "gay" or "lesbian" in the title demonetized, then magically fixed when those words were swapped out.[28][29][30]

So yeah—my art content is sharing vocabulary with categories the platforms were literally trained to get rid of.
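The "strip‑built kayak" case is exactly what naive substring matching produces. A toy sketch (the blocklist and second title are invented; the real demonetization systems are undisclosed ML, but creators' reverse‑engineering suggests they behave a lot like this):

```python
# Toy sketch of naive substring flagging; blocklist and titles are invented.
BLOCKLIST = ["strip", "war", "aids"]

def flags(title: str) -> list:
    """No word boundaries, so innocent titles trip the filter."""
    lowered = title.lower()
    return [term for term in BLOCKLIST if term in lowered]

print(flags("Building a strip-built kayak"))       # → ['strip']
print(flags("Edward Hopper and urban awareness"))  # → ['war'] (inside 'awareness')
```

A word‑boundary regex would rescue "awareness", but "strip" would still flag the kayak video, which is why keyword lists alone can never get context right.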

Shadowbanning, But Make It Science

"Shadowbanning" is usually treated like a conspiracy theory, but we now have receipts.

A 2024 investigation by The Markup created a bunch of test accounts on Instagram and posted two types of content under the same custom hashtag:

  • Normal photos (landscapes, objects)
  • Non‑graphic war content (tanks, damaged buildings, no bodies)

Results:

  • War photos were 8.5 times more likely to be hidden from the hashtag page compared to normal photos.[31]
  • Only 9 out of 29 war images ever appeared under the hashtag, versus 158 out of 172 non‑war images.[31]

The users who posted the war images got no warning. Their posts just didn't show up publicly the way they expected. The same investigation found that Instagram sometimes:

  • Deleted captions with certain tags,
  • Made comments visible only to the commenter,
  • Marked political comments as spam with no way to appeal.[31]
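For what it's worth, The Markup's headline ratio checks out against the raw counts, assuming "hidden" means a post never appeared on the hashtag page:

```python
# The Markup's numbers as reported: 9 of 29 war images shown under the
# hashtag, versus 158 of 172 non-war images.
war_shown, war_total = 9, 29
normal_shown, normal_total = 158, 172

war_hidden_rate = (war_total - war_shown) / war_total              # 20/29 ≈ 0.69
normal_hidden_rate = (normal_total - normal_shown) / normal_total  # 14/172 ≈ 0.08

print(round(war_hidden_rate / normal_hidden_rate, 1))  # → 8.5
```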

Meanwhile, under the DSA (Digital Services Act, the EU's 2022 regulation that forces platforms like Instagram, TikTok, and YouTube to get transparent about moderation), artists are seeing new labels like "your content may not be recommended," which suddenly explains years of low reach. So when I say "I think my account is shadow‑banned," I'm basically describing documented behaviour.[21]

Comic Sans Tea Alert

Demetrio Koro

TikTok straight‑up admitted to 'shadow‑banning' LGBTQ hashtags like 'gay', 'lesbian', and 'transgender' in multiple languages, lumping them with 'terrorist groups' and drugs in their internal code. They called it a 'localised' anti‑porn measure, but creators say it's still happening. The real tea: platforms like this won't fix it until regulators force them.

When "Protecting Users" Means Muting Art, Health, and History

This over‑correction doesn't stop at art nudes. It catches health and activism too—exactly the stuff that often shows up in cultural content.

  • A New York clinic, Apicha, tried to run Instagram ads for HIV/PrEP awareness aimed at queer Asian and Pacific Islander men. Instagram rejected them as "political" social‑issue content and couldn't clearly explain why.[32][33]
  • A 2025 report by SMEX shows sexual‑health educators in Arabic having their posts about contraception, consent, or STIs removed as "pornography," so they resort to misspelling the word "sexual" to dodge filters.[34]
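That misspelling trick has a name, "algospeak", and it's depressingly easy to mechanise. A toy sketch (the substitution map is invented; educators improvise their own respellings):

```python
# Toy "algospeak" respeller; the substitution map is invented for illustration.
ALGOSPEAK = {
    "sexual": "s*xual",
    "death": "d3ath",
    "nude": "n*de",
}

def algospeak(caption: str) -> str:
    """Respell flagged words so keyword filters miss them (a sad workaround)."""
    for word, safe in ALGOSPEAK.items():
        caption = caption.replace(word, safe)
    return caption

print(algospeak("a talk about death and the nude in art"))
# → a talk about d3ath and the n*de in art
```

The absurdity is that human readers decode this instantly; only the filter is fooled, which tells you who the caption is really being written for.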

Rights groups call this "discrimination by moderation"—the idea that automated systems and biased flagging end up deleting or burying the speech of marginalised communities, even while actual hate often slips through.[35][13]

So when I talk about racism in fashion imagery, AIDS in Toscani's campaigns, sexuality in Pasolini, or death and war in modern art, I'm walking straight into that same minefield.

Being in a Spencer Tunick Shoot While Platforms Pretend Bodies Don't Exist

A personal highlight: last year the siOsi team was part of a Spencer Tunick photoshoot. Imagine: hundreds of people (actually, around 1,000, which looks almost AI‑generated), all shapes and sizes, nothing sexual about it, just a big collective "this is what human bodies look like without Photoshop."

Offline, it felt freeing and weirdly wholesome. Online, that same artist is repeatedly censored.

Spencer Tunick art installation in Granada, Spain, Retrato Alhambra 1925

On Don't Delete Art, Tunick describes self‑censoring his images—cropping, covering—only to have them deleted anyway, with threats that his whole account could be disabled. His reaction, understandably: "For artists to be threatened with disablement of their entire account, when they are properly self‑censoring their images, is maddening."

Thousands get naked for Spencer Tunick photo shoot on Story Bridge

Spencer Tunick, Dublin Installation

Spencer Tunick - Installation Dusseldorf

Stay Apart Together (Documentary) | Connection Through Isolation

Tunick is now a co‑curator of Don't Delete Art, a project that documents how social media censors art and pushes for saner policies. It's backed by PEN America's Artists at Risk Connection, the National Coalition Against Censorship, Freemuse, Article19, and others.[37][38][39][40]

They:

  • Run a virtual gallery of works banned or restricted by platforms.
  • Publish guides on how artists can reduce their risk of takedowns (cropping, pixelation, captions, appeals).[39][40][41][42][43]
  • Have a manifesto calling social media companies "cultural gatekeepers" with outsized power over what art gets to exist online.[44]
Free the Nipple x Don’t Delete Art | Lina Esco on Censorship, Art & Gender Equality

When I look at their gallery, what hits me is how normal a lot of the images are. If those are dangerous, then so is pretty much everything I talk about.

Why This Matters For Beauty, Fashion, and Culture Creators

On paper, my "violations" are cute:

  • The Armory Show video talks about how war, modernity, and new portrayals of the body shocked people in 1913.
  • The Toscani piece analyses how his Benetton campaigns confronted racism and AIDS instead of pretending everything is pastel sweaters and smiles.
  • The Pasolini content digs into sexuality, fascism, and death as part of film history—not as aesthetic clickbait.
  • The Michelangelo reel is literally Art History 101: why the nude body matters in sculpture.

But in platform logic:

  • "War," "racism," "AIDS," "death," and "sexuality" are sensitive events that advertisers want to avoid.[29][30][28][27]
  • "Nude" is automatically suspicious, even if it's marble or a fresco.[17][15][16]
  • Anything that looks like conflict or protest gets quietly sandbagged in hashtags and recommendations.[31]

So I end up in this weird position where I'm trying to teach people why images matter—while the platforms are trying very hard not to let those images or words be seen.

For beauty and fashion specifically, this has consequences:

  • Bodies that are too real (older, bigger, hairy, sick) get flagged more often than smoothed‑out "aspirational" imagery.
  • Talking honestly about racism, disease, or death in visual culture becomes career‑risking.
  • The algorithm nudges everyone toward soft, glossy content and away from anything that connects aesthetics to actual history.

Basically, social media is trying to make culture safe for advertisers, and everything else is collateral.

The Fear of Art: Reflections on Art Censorship and Banning

Censorship: The Cultural Impact of Silencing Artists

Instagram Censoring Boobs and the New Period Blood Emoji

So… How Do I Exist In This System?

Right now, my survival strategy is a mix of:

  • Learning the "cursed keywords" and sometimes sneaking around them.
  • Testing how far I can go with context, captions, and cropping.
  • Documenting every takedown and shadow‑ban cycle like a nerdy little case study.

But I also don't want to turn my whole feed into euphemism soup. If we can't say "war," "racism," "sexuality," "AIDS," "death," or "nude," we can't actually talk about art, fashion, and beauty in any honest way.

Projects like Don't Delete Art give me some hope. They're collecting evidence, sharing tactics, and publicly calling out platforms for playing museum, curator, and censor all at once.[38][40][41][43][37][39][44]

So I'll keep doing what I do:

  • Posting daily about beauty, fashion, art, and culture.
  • Saying the "wrong" words on purpose.
  • Treating every "violation" and shadow‑ban episode as another data point in how these systems actually work.

If platforms insist on being our new galleries, the least we can do is critique their curatorial taste.

Follow siOsi · Extra Virgin

Follow me on Instagram, TikTok, and YouTube to catch all my daily shorts and reels—flagged ones included. Beauty, fashion, art, culture, music, photography, cinema, and all things siOsi.

Extra Virgin is part of siOsi, an AI makeup lab based on the internet and occasionally in Granada.

Built, written and overthought by the founders of siOsi. Disclaimer: The views and opinions expressed in this article are those of the columnist and do not necessarily reflect the views of siOsi.
