Instagram's recommendation algorithms push pro-anorexia and disordered eating content to millions of users, including those whose bios identify them as under 13 years old, according to a new report by Fairplay, an advocacy organization focused on children’s digital wellness.
The report states that nearly 20 million Instagram users are “fed content from Instagram’s Pro-Eating Disorder bubble,” and that many of them are teenagers or younger.
To gauge the reach of pro–eating disorder content, researchers identified 153 “seed accounts” that were public, had more than 1,000 followers, and expressly advocated for disordered eating. They calculated that approximately 1.6 million Instagram users followed at least one of these seed accounts, and that 88,655 users followed three or more. Almost 20 million Instagram users followed at least one of those 88,655 accounts, and because of that mutual connection, they might be prompted by Instagram to follow the seed accounts themselves.
“One of the things I was struck by was how profoundly easy it was to identify this pro–eating-disorder bubble,” said Rys Farthing, data policy director at the advocacy group Reset Australia and the leader of the research.
Farthing said that exposure to the content was primarily driven by Instagram’s suggestions about which users to follow. Test accounts that expressed an interest in weight loss or disordered eating were quickly flooded with recommendations from the platform to follow other users with these interests, including accounts that openly encouraged disordered eating.
“All you would have to do is put some guardrails around that ‘follow recommendations’ algorithm and you could burst that bubble,” Farthing said.
In response to questions from BuzzFeed News about the Fairplay report, Meta spokesperson Liza Crenshaw said in a statement: “Reports like this often misunderstand that completely removing content related to people’s journeys with or recovery from eating disorders can exacerbate difficult moments and cut people off from community. Experts and safety organizations have told us it’s important to strike a balance and allow people to share their personal stories while removing any content that encourages or promotes eating disorders.”
Crenshaw said via email that when users search for or post about pro–eating disorder content, the company highlights organizations that can provide help, and users have the option to specifically report content related to eating disorders. She added that accounts sharing self-harm content will not be recommended and Instagram is working to restrict search results related to known self-harm hashtags or accounts.
Researchers, journalists, and advocates have been raising alarms about disordered eating content on Instagram for years, culminating in fall 2021 when internal Facebook documents provided by whistleblower Frances Haugen showed that Instagram led teen girls to feel worse about their bodies. This new report shows that Meta’s struggles to curb this kind of harm are still ongoing.
But Farthing and others hope change may be around the corner: US Sens. Richard Blumenthal and Marsha Blackburn recently introduced the Kids Online Safety Act, which would create a duty for platforms to “act in the best interests of a minor” using their services. The California legislature is considering a similar provision, modeled after the UK’s Age Appropriate Design Code, that would require companies to consider children’s “best interests” when building or modifying their algorithms.
“If we can muster the courage to actually hold tech companies to account, we could get some of this legislation through,” Farthing said. “And maybe when we have this conversation next year, I might actually have put myself out of business.”