Is Instagram biased against plus-size bodies? Influencers think so
Plus-sized influencers have been complaining for a while about their posts being flagged on Instagram
In 2018, Katana Fatale posted a picture she took of herself in Hawaii. Fatale has been posting and advocating for plus-sized bodies on Instagram since 2014.
"It got amazing traction. I felt so beautiful. I remember I was just on the beach and I went to go check it again and I had this alert pop up that it had been removed,” she told BuzzFeed News.
While Fatale was nude in the photo, she had followed Instagram's community guidelines, which ban female nipples, sexual acts, genitals and close-ups of nude butts. As long as a photo does not show any of these things, it's good to go; Instagram also allows images of breastfeeding, post-mastectomy scarring and nudity in paintings and sculptures.
Fatale was confused as to why she was being told that further violations would lead to her account being taken away, especially when there were other women posting similar images with no issues whatsoever.
Fatale is not alone. Numerous other plus-sized influencers have reported having their photos and videos flagged and removed from the platform. Famous women aren't spared either: performer Lizzo complained that TikTok was taking down posts of her in a bikini while thinner women posting similar videos were left alone.
While there is no “hard data” that shows that images of plus-sized people get flagged more often than thin people, there have been enough anecdotes of it happening for influencers to see a pattern.
"It’s just too many. Where there’s smoke, there’s fire. There’s absolutely something going on where fat people are singled out," said Fatale.
Experts told BuzzFeed News that the influencers may well be right. "Content moderation on social media apps is usually a mix of artificial intelligence and human moderators, and both methods have a potential bias against larger bodies," BuzzFeed News points out.
Mathieu Lemay, cofounder of the AI consulting firm Lemay.ai, says the "first thing to understand is that AI is far from perfect and, in fact, sort of lazy".
"Technology and discrimination goes way back. Anytime you design a new project or a new prototype you have to think about how it is going to break,” Lemay said.
Companies like Facebook build their own proprietary image and video moderation AI “by feeding it millions of images so it can identify patterns and learn what is acceptable and what is not”. “It learns, for example, to identify pornography, or a nipple, or a bikini. As it scans images uploaded by users, it decides how likely that image is to contain banned content. If it's very sure, it can automatically flag the content. If it's only sort of sure, it can forward that content along to a human to double-check,” BuzzFeed News explains.
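The routing BuzzFeed News describes — auto-flag when the model is very sure, escalate to a human when it is only sort of sure — can be sketched in a few lines. This is purely illustrative, not Facebook's actual system; the threshold values and function names here are hypothetical.

```python
# Illustrative sketch of confidence-based moderation routing.
# NOT Facebook's actual system: the thresholds and names are hypothetical,
# chosen only to show the "very sure" / "sort of sure" split the
# article describes.

AUTO_FLAG_THRESHOLD = 0.95    # "very sure": remove automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # "sort of sure": send to a human moderator

def route_image(banned_content_score: float) -> str:
    """Decide an image's fate from the model's confidence (0.0-1.0)
    that it contains banned content."""
    if banned_content_score >= AUTO_FLAG_THRESHOLD:
        return "auto_flag"     # model is very sure: flag without review
    if banned_content_score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"  # gray area: a person double-checks
    return "approve"           # model sees nothing banned

print(route_image(0.97))  # auto_flag
print(route_image(0.70))  # human_review
print(route_image(0.10))  # approve
```

The gray zone between the two thresholds is exactly where the training-data problem described below bites: a model that rarely saw larger bodies during training produces unreliable scores for them, so more of those images land in the flag or review buckets.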
“The problem is there are so many gray areas, and the AI can only make its guesses based on what it's been taught. That's where the first potential problem arises. If the AI wasn't fed many images of plus-size women, which is a possibility given the bias against larger bodies in media, that could be the start of a problem.”
“The AI could not know the difference between a smaller, nude body and a plus-size body in a bikini,” Lemay pointed out.
"If you take two models, one plus-size one not plus-size, there's a chance there are more pixels related to skin," he said.
Since the AI doesn't know the context of what it's seeing, “this could lead to incorrect categorisation”. However, more importantly, these AI systems are built by people and people are biased.
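Lemay's point about skin pixels can be made concrete with a toy version of such a heuristic. This is a deliberately crude sketch — real moderation models are far more sophisticated — but it shows how a context-free "fraction of skin pixels" signal could score a larger body in a bikini the same as, or higher than, a smaller nude body. The `is_skin` color test and the 40% threshold are invented for illustration.

```python
# Toy sketch of a context-free "percentage of skin" heuristic.
# Real systems are far more complex; this only illustrates how counting
# skin pixels, with no understanding of clothing or context, can
# penalize images that simply contain more skin.

def is_skin(rgb):
    """Crude warm-tone test on an (r, g, b) tuple. Purely illustrative."""
    r, g, b = rgb
    return r > 95 and g > 40 and b > 20 and r > g and r > b

def skin_fraction(pixels, classify=is_skin):
    """Fraction of pixels the classifier calls skin."""
    return sum(1 for p in pixels if classify(p)) / len(pixels)

def looks_nude(pixels, threshold=0.4):
    # The heuristic cannot tell a bikini on a larger body from actual
    # nudity: it only counts skin against an arbitrary threshold.
    return skin_fraction(pixels) >= threshold

# Two fake "images": lists of pixels with different skin proportions.
image_more_skin = [(200, 150, 120)] * 45 + [(10, 10, 200)] * 55  # 45% skin
image_less_skin = [(200, 150, 120)] * 25 + [(10, 10, 200)] * 75  # 25% skin

print(looks_nude(image_more_skin))  # True  (45% >= 40%)
print(looks_nude(image_less_skin))  # False (25% < 40%)
```

Both images could show equally clothed people; the only difference the heuristic sees is the pixel count.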
Shoog McDaniel is a Florida-based artist who focuses on nude photos of large bodies. Like Fatale, McDaniel knows the rules — "nipples and genitals are covered, butts are only bare from a distance". Yet there have been times when "their posts being removed has been a weekly occurrence".
"This was a trend that I think the community at large has been talking about for a while, which is that when these bots or whatever go in searching for nudity, it’s a percentage of skin compared to the rest of the body," they told BuzzFeed News.
"I think that that is a big part of it, and it’s a part of the systemic fatphobia that we face and it is completely unacceptable, but what can we do?" McDaniel said.
McDaniel said they have never successfully been able to appeal an image takedown. Facebook, which owns Instagram, is tight-lipped about how its moderation process works.
“We want our policies to be inclusive and reflect all identities, and we are constantly iterating on our policies to see how we can improve. We remove content that violates our policies, and we train artificial intelligence to proactively find potentially violating content," a Facebook spokesperson told BuzzFeed News.
"This technology is not trained to remove content based on a person’s size, it is trained to look for violating elements — such as visible genitalia or text containing hate speech,” the spokesperson added.
TikTok had a similar message. "TikTok is an inclusive platform built upon the foundation of creative expression — and all of our users are held to the code of conduct outlined in our Community Guidelines," a spokesperson said.
But McDaniel is not convinced. "As it stands right now, more and more are being censored and work that is vital and life-giving is being taken down at a rapid rate," they said.
McDaniel also knows the “human element” of this “all too well”. Their work is “often subject to brigades of body-shaming trolls who report their images”.
Platforms like Instagram are certainly aware of this phenomenon and say the number of reports doesn't automatically trigger anything, but not everyone is so sure it doesn't play a part.
Kat Lo, a researcher who studies content moderation and online harassment, said “reports may play some part in the larger machinery of moderation, but so do people”.
"That's what's so insidious about technological systems that are so big. There's so many steps and so many opportunities for even small instances of bias to creep in," Lo said.
It's very possible that when an image of a larger person falls before a human moderator, they're more likely to mark it as "obscene" or sexualized than an image of a smaller person, Lo said, explaining that this was due to a bias present in society.
"There’s thousands and thousands of little reasons rather than broader reasons like 'I see a nipple,'" she said.
“Because of all these gray areas, and because of the sheer scale of these moderation databases, actually fixing a potential problem like this would be expensive and time-consuming, and companies have very little motivation to do it,” BuzzFeed News writes.
Lo pointed out that apps like Instagram or TikTok are under pressure to keep things PG, “both to keep themselves available on app stores, but also due to laws like FOSTA-SESTA or the resources it takes to remove terrorism-related content. It's just easier to err on the side of caution”.
And that can leave people like Fatale in limbo. "It makes me sad that Instagram can afford to totally ignore this issue," she said.
"I shouldn’t be silenced and erased because you are hypersexualising my body because it’s bigger. If they think this issue is going to go away, if they think the fat community is going to give up on this issue, they’re in for a long headache," she added.