When a young woman followed a “thinspiration” account on Instagram and used terms associated with disordered eating, Instagram’s recommendation engine quickly suggested she follow other “thinstagram” accounts, which often showed images of underweight bodies.
Soon, the woman received unsolicited direct messages from a weight loss “coach,” an invitation to join a “pro-anorexia” (“pro-ana”) group chat, and requests from other users looking for “buddies” to hold them accountable to dangerous weight-loss goals.
Luckily, in this case, the young woman wasn’t real. Her account was the creation of two technology watchdog groups, Reset Tech and the Tech Transparency Project (TTP), to show just how easy it is for users, including teens, to encounter content encouraging disordered eating and self-harm through Instagram’s recommendations, search, hashtags, and messaging functions.
According to a report published Wednesday, the test account for the 29-year-old woman created six “thinspiration” posts, and followed a single influencer, Eugenia Cooney, who is popular in pro-ana circles online. (Cooney has more than 700,000 followers on Instagram, over 850,000 followers on TikTok, and more than 2 million followers on YouTube. She is verified on all three platforms.) Instagram spokesperson Liza Crenshaw confirmed that Cooney’s account does not violate the company’s policies and is recommendable to all users. Crenshaw added that the account “is often used to share stories of recovery.” Cooney did not respond to a request for comment.
Reset and TTP also created a second test account, representing a hypothetical 14-year-old girl. The account followed 100 pro-ana accounts, including users followed by the adult test account, as well as users recommended by Instagram. It also made six “thinspo” posts. Following these actions, the account received pro-ana recommendations in Instagram’s Explore tab, including multiple photos of emaciated women in their underwear.
The report from Reset and TTP was released just hours before Instagram’s chief executive, Adam Mosseri, was scheduled to testify before a Senate committee about the app’s effects on young users, and just hours after Instagram announced changes to the content that can be recommended to teens and previewed parental controls that it plans to launch next year. It follows a bombshell report by the Wall Street Journal, and testimony by whistleblower Frances Haugen, about how Instagram’s parent company Meta (née Facebook) ignored internal research about the platform’s dangerous effects on teen mental health.
But the report also comes at a moment when eating disorder prevalence among teens has soared. Dr. Jason Nagata, a professor of pediatrics at the University of California San Francisco who treats teens hospitalized with eating disorders, told BuzzFeed News that the number of teens needing inpatient care at his clinic more than doubled during the pandemic. Dr. Tracy Richmond, director of the Eating Disorder Program at Boston Children’s Hospital and professor of pediatrics at Harvard, said her clinic saw a similar increase.
The Reset/TTP report is the latest of several experiments that show Instagram promoting pro-ana recommendations to teen users. But the new report also shows how the platform has facilitated the creation of self-destructive online communities — and how it has failed to protect vulnerable members of those communities.
People with eating disorders often resist efforts by family members, doctors, and other care teams to urge them into recovery, and share strategies and “tricks” to facilitate that resistance. Nagata described incidents of hospitalized teenagers using the geolocation features on their phones to find patients in neighboring rooms and wards with whom they could exchange tips about how to hide food and deceive hospital personnel. S. Bryn Austin, director of the Strategic Training Initiative for the Prevention of Eating Disorders at the Harvard School of Public Health, called care evasion a “hallmark of eating disorders.”
But though Richmond used to hear concerns from parents that placing their children in residential care might lead them to “learn tricks” from other patients, she said that today, “kids are learning it all online.” Austin agreed — Instagram and other social networks have enabled vulnerable users to access self-destructive tips and techniques at a scale beyond what researchers have seen before.
Still, it’s not clear why the platform has done so little to stop these communities. The Reset/TTP researchers found that pro-ana users frequently prepare “backup” accounts with the expectation that their primary accounts will be disabled, and that they congregate around coded, misspelled, or alternatively spelled hashtags to avoid detection by content moderation systems. But none of this is new. Harmful groups including QAnon, the boogaloo boys, anti-vaccine activists, and various militias used backup accounts and misspelled hashtags on Instagram and its sibling Facebook for years — giving the platforms plenty of experience in detecting and enforcing against these tactics.
The Reset/TTP researchers also found, though, that users vulnerable to pro-ana posts are often targeted in direct messages, and from there are sometimes urged to move from Instagram to encrypted chat apps like Kik and Telegram. And Instagram’s rules about DMs are markedly different from its rules about what can be said in a post or a comment.
“Because DMs are for private conversations,” Instagram says in a company blog post from February 2021, “we don't use technology to proactively detect content like hate speech or bullying the same way we do in other places.” This creates a significant loophole in policy enforcement for DMs: content must be reported by a user before it is reviewed. Accordingly, users who support one another’s violations of platform rules (in group chats of white nationalists, for example, or groups of people with anorexia who want to help each other get thinner) are free to DM content that violates Instagram’s community guidelines without consequences. Instagram spokesperson Crenshaw confirmed that the company still does not scan direct messages for content that violates its community standards, even in messages sent to children, explaining, “we want to respect people’s privacy in DMs.”
A similar problem arises in Instagram’s approach to search restrictions. When users search for certain pro-ana terms on Instagram, they receive a message offering resources for users struggling with body image issues. Eating disorder specialists who spoke with BuzzFeed News were quick to laud this “redirect” initiative, but users who do not want those resources are free to dismiss the message and view the content that they searched for in the first place. So while the message may be helpful to a casual searcher, it likely cannot help the users at the highest risk of self-harm.
Experts also agreed that addressing pro-ana content on Instagram would require a more comprehensive approach than the platform has taken to date. Nagata and Richmond both emphasized that cumulative exposure to “thinspo,” even less extreme varieties, plays an important role in patients’ trajectories. Lauren Muhlheim, an eating disorder specialist in private practice in Los Angeles, went further, saying that to protect users from eating disorders, platforms must also more forcefully target the “extreme bullying of larger-bodied people on these platforms.” Researchers at Reset and TTP found instances of bullying against larger-bodied users under the hashtag #reversethinspo. Richmond, though, was quick to spot a silver lining: More than ever before, she said, “there is health-at-every-size and body positive celebration on social media now.”
Still, “regulators must take action,” she said, “and platforms must take responsibility — to protect young people, at least, if not everyone.”
Emily Baker-White previously held policy positions at Facebook and Spotify.