The 21-page report, titled Designing for Disorder: Instagram's Pro-eating Disorder Bubble, focuses on the platform's algorithms, which help determine what photos, posts and videos are recommended to Instagram users. These algorithms, Fairplay said, shape the creation of "bubbles" that include content promoting eating disorders.
“This bubble is also undeniably harmful. Algorithms are profiling children and teens to serve them images, memes, and videos encouraging restrictive diets and extreme weight loss,” the report said.
Fairplay’s report is the latest in a body of research that suggests Instagram, a photo-and-video service popular among teens, is failing to protect some of its youngest users from harmful content. The advocacy group is pushing for more regulation, noting that bills such as the Kids Online Safety Act and other proposals that address how these platforms are designed “are long overdue.”
Instagram, owned by Facebook's parent company Meta, said it wasn't able to fully address Fairplay's report because the authors declined to share it with the company. "Reports like this often misunderstand that completely removing content related to peoples' journeys with or recovery from eating disorders can exacerbate difficult moments and cut people off from community," a Meta spokeswoman said in a statement. The company said it's trying to strike a balance between removing content that encourages or promotes eating disorders and allowing people to share their own personal stories.
Meta added that it directs people who search for body image-related topics to resources such as helplines, and that users can report content that promotes eating disorders.
As part of the research, Fairplay looked at people who followed 153 public Instagram profiles with more than 1,000 followers that posted content that normalizes, celebrates or promotes eating disorders and extreme weight loss. Some of the accounts posted images of skinny women with protruding bones or posts about eating 300 calories a day. The group estimates about 20 million individuals or unique users followed and received content from Instagram’s pro-eating disorder bubble. The group conducted the research between December 2021 and January 2022.
Even though Instagram says users have to be at least 13 years old to use the platform, Fairplay said it found 21 users in the pro-eating disorder bubble who identified themselves as under 13, including users as young as 9 years old. In the US, the median age of users in Instagram's pro-eating disorder bubble is 20, and one quarter of users in the bubble say in their profiles that they are minors.
Meta said it will use information such as a user’s bio to identify and remove users who are under 13 years old. It’s unclear from the report if the underage users were reported to Instagram.
Using data about the average revenue Facebook makes per person, Fairplay estimates that Meta generates $227.9 million in revenue per year from users who follow this pro-eating disorder bubble. Meta doesn't share how much it makes from Instagram users, so Fairplay notes that it believes its estimates are "conservative."
The report also included testimony from an unidentified 17-year-old high school student in Southern California who says social media platforms such as Instagram and the short-form video app TikTok are filled with content that glorifies eating disorders or unhealthy ways to lose weight. The teenager, called Kelsey in the report, outlines some of the harmful content she's seen on social media, such as trends that encourage users to show their "side profile" and beauty filters that make people look thinner.
“As someone who had grown up with Instagram, it’s hard not to imagine a time when the app didn’t have the sort of content that promotes disordered eating behavior. I felt like my feed was always pushed towards this sort of content from the moment I opened my account,” the teenager said in the report.
Instagram has faced increased pressure to improve child safety after former product manager turned whistleblower Frances Haugen leaked internal research that she said shows the company puts profits over user safety. One of the leaked documents, which was first reported by The Wall Street Journal and viewed by CNET, said that 32% of teen girls who felt bad about their bodies felt that Instagram made them feel worse.
Instagram said its internal research was being mischaracterized and pointed out that the platform can also help teenagers connect with family and friends.
Lawmakers, though, have also conducted their own research that raises questions about how well Instagram is enforcing its rules against promoting, encouraging or providing instructions for “self-injury,” which includes content about eating disorders. Last year, during a hearing about protecting kids online, Sen. Richard Blumenthal, a Connecticut Democrat, said his office created an Instagram account that identified itself as a 13-year-old girl. Instagram recommended eating disorder and self-harm content to the account, he said.
Since then, Instagram has been taking steps it says will give parents more control over the content their teenagers see. In March, the company started rolling out parental controls in the US that allow parents to set limits on the amount of time their teenagers are spending on Instagram. The platform also has resources listed online for people dealing with an eating disorder.