Influencers Say Instagram Is Biased Against Plus-Size Bodies, And They May Be Right

Plus-size influencers have long complained about their posts being flagged on social media, and there are a few reasons why it might be happening.

In 2018, influencer Katana Fatale was in Hawaii in an outdoor shower, and the light was just right.

She set her phone on a chair, set the timer, and took a photo.

She shared it with her thousands of followers on Instagram, where she's been posting and advocating for plus-size bodies since 2014.

"It got amazing traction. I felt so beautiful," she told BuzzFeed News. "I remember I was just on the beach and I went to go check it again and I had this alert pop up that it had been removed."

Although she was nude, she had followed Instagram's community guidelines, which ban female nipples, sexual acts, genitals, and close-ups of nude butts. Anything that avoids those elements is OK to post. The company also allows images of breastfeeding and post-mastectomy scarring, as well as nudity in paintings and sculptures.

So Fatale was confused as to why she was now being told that further violations could see her account taken away, especially when other women seemed to be able to post similar images with zero issues.

Fatale is far from the only plus-size influencer to have this problem. There have been numerous reports of people like Fatale having their pictures and videos flagged and removed from social media.

Even very famous women aren't spared. Recently, Lizzo complained that a TikTok featuring herself in a bikini was pulled, while videos of thinner women did not appear to get the same treatment.

While there's no hard data showing images of plus-size people are flagged more often, there have been so many anecdotes of it that influencers can't help but see a pattern.

"It’s just too many. Where there’s smoke, there’s fire. There’s absolutely something going on where fat people are singled out," said Fatale.

According to experts who spoke with BuzzFeed News, it's very possible they are right. Content moderation on social media apps is usually a mix of artificial intelligence and human moderators, and both methods have a potential bias against larger bodies.

Mathieu Lemay is the cofounder of Lemay.ai, an AI consulting firm. He said the first thing to understand is that AI is far from perfect and, in fact, sort of lazy.

"Technology and discrimination goes way back," he told BuzzFeed News. "Anytime you design a new project or a new prototype you have to think about how it is going to break."

Companies like Facebook build their own proprietary image and video moderation AI. They train it on millions of images so it can identify patterns and learn what is acceptable and what is not. It learns, for example, to identify pornography, or a nipple, or a bikini. As it scans images uploaded by users, it decides how likely each one is to contain banned content. If it's very sure, it can automatically flag the content. If it's only sort of sure, it can forward the image to a human to double-check.
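Lemay's description maps onto a simple two-threshold decision rule. The sketch below is a rough, hypothetical illustration of that flow; the threshold values, labels, and scoring model are assumptions for illustration, not anything Facebook has disclosed.

```python
# A minimal sketch of the two-threshold moderation flow described above.
# The numbers and labels are illustrative assumptions, not Instagram's
# or TikTok's actual values.

AUTO_FLAG = 0.95      # "very sure": act without a human
SEND_TO_HUMAN = 0.60  # "only sort of sure": queue for a moderator


def moderate(banned_content_score: float) -> str:
    """Decide what happens to an image, given a model's estimated
    probability that it contains banned content (nudity, genitals, etc.)."""
    if banned_content_score >= AUTO_FLAG:
        return "remove"        # flagged automatically
    if banned_content_score >= SEND_TO_HUMAN:
        return "human_review"  # forwarded to a person to double-check
    return "allow"


print(moderate(0.97))  # -> remove
print(moderate(0.70))  # -> human_review
print(moderate(0.20))  # -> allow
```

Everything hinges on that score, which means everything hinges on the data the model was trained on.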

The problem is there are so many gray areas, and the AI can only make its guesses based on what it's been taught. That's where the first potential issue arises: if the AI wasn't fed many images of plus-size women, which is plausible given the bias against larger bodies in media, it has less to go on when it encounters them.

The AI, Lemay said, might then be unable to tell the difference between a smaller nude body and a plus-size body in a bikini.

"If you take two models, one plus-size one not plus-size, there's a chance there are more pixels related to skin," he said.

Since the AI doesn't know the context of what it's seeing, this alone could lead to incorrect categorization. And the software isn't the only weak point: these AI systems are built by people, and people are biased.

Shoog McDaniel is a Florida-based artist whose focus is on nude photos of large bodies.

Like Fatale, they know the rules: nipples and genitals are covered, butts are only bare from a distance. And yet there have been stretches when having a post removed was a weekly occurrence.

"This was a trend that I think the community at large has been talking about for a while, which is that when these bots or whatever go in searching for nudity, it’s a percentage of skin compared to the rest of the body," they told BuzzFeed News.

"I think that that is a big part of it, and it’s a part of the systemic fatphobia that we face and it is completely unacceptable, but what can we do?"

McDaniel said they have never successfully appealed an image takedown. Facebook, which owns Instagram, is tight-lipped about how its moderation process works.

“We want our policies to be inclusive and reflect all identities, and we are constantly iterating on our policies to see how we can improve. We remove content that violates our policies, and we train artificial intelligence to proactively find potentially violating content," a spokesperson told BuzzFeed News.

"This technology is not trained to remove content based on a person’s size, it is trained to look for violating elements — such as visible genitalia or text containing hate speech.”

TikTok had a similar message.

"TikTok is an inclusive platform built upon the foundation of creative expression — and of our users are held to the code of conduct outlined in our Community Guidelines," a spokesperson said.

These answers aren't very satisfying for McDaniel.

"As it stands right now, more and more are being censored and work that is vital and life-giving is being taken down at a rapid rate," they said.

McDaniel also knows the human element of this all too well. Their work is often subject to brigades of body-shaming trolls who report their images. Platforms like Instagram are certainly aware of this phenomenon and say the number of reports doesn't automatically trigger anything, but not everyone is so sure it doesn't play a part.

Kat Lo, a researcher who studies content moderation and online harassment, said reports may play some part in the larger machinery of moderation, but so do people.

"That’s what so insidious about technological systems that are so big. There’s so many steps and so many opportunities for even small instances of bias to creep in," she said.

Due to bias present in society, she said, it's very possible that when an image of a larger person falls before a human moderator, they're more likely to mark it as "obscene" or sexualized than an image of a smaller person.

"There’s thousands and thousands of little reasons rather than broader reasons like 'I see a nipple,'" she said.

Because of all these gray areas, and because of the sheer scale of these moderation databases, actually fixing a potential problem like this would be expensive and time-consuming, and companies have very little motivation to do it.

Lo said apps like Instagram or TikTok are under pressure to keep things PG, both to stay available on app stores and because of laws like FOSTA-SESTA, as well as the resources it takes to remove terrorism-related content. It's just easier to err on the side of caution.

And that can leave people like Fatale in limbo.

"It makes me sad that Instagram can afford to totally ignore this issue," she said.

"I shouldn’t be silenced and erased because you are hypersexualizing my body because it’s bigger," she added. "If they think this issue is going to go away, if they think the fat community is going to give up on this issue, they’re in for a long headache."

