After coming to realize that Instagram’s emphasis on ideal body images was hurting some users’ self-image and mental health, a group of researchers at the company hit on the idea of distracting users with nature images and humorous memes, among other measures. The lighter content might offset the damage, they reasoned.
The researchers also suggested that Instagram could emphasize posts with body-positive hashtags, like #loveyourself, and images of average-sized or plus-sized models. Their work concluded that several core elements of Instagram were making the situation worse, and that if the company wanted to take drastic action, it could consider limiting Likes or comments on a post—or even turning off its collection of photo filters, perhaps the app’s most famous feature. It’s unclear whether Instagram enacted any of these measures, though it obviously did not take the more dramatic ones, such as scrapping the filters.
While the Instagram team seemed undecided on the best course of action, it had reached definite conclusions about the problem at hand. “33% of Instagram users and 11% of Facebook users think the platform makes their own body image issues worse,” the report reads. “Substantial evidence suggests that experiences on Instagram or Facebook make body dissatisfaction worse, particularly viewing attractive images of others, viewing filtered images, posting selfies, and viewing content with certain hashtags.”
This 2020 report has not been previously reported and comes from the documents provided by Facebook whistle-blower Frances Haugen to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. The redacted versions received by Congress were obtained by a consortium of news organizations including Forbes.
Haugen’s documents, which have become known as the Facebook Papers, present an enormously detailed picture of Facebook and its struggles to balance its size and growth with a constellation of problems on its apps. One of the issues made plainly evident by those documents is how Instagram harms the mental health of its users, who see an ecosystem of curated and highly edited images and then despair when their own lives, bodies and surroundings don’t resemble these photos. It is particularly worrisome given Instagram’s popularity with teens, a cohort already at risk of developing body image issues. (7.5% of its users are under 17, according to data from Statista, and over 40% are under 24.) Politicians have seized on this issue, hoping to find something easy for voters to understand that could perhaps lead to new regulation around social media. A Senate subcommittee has already heard testimony from Facebook’s Head of Safety, Antigone Davis, and Haugen, the whistle-blower. On Tuesday, it expanded its investigation further, hosting executives from TikTok, YouTube and Snap, three other apps favored by teens.
The 2020 report on body image jibes with early reporting by The Wall Street Journal, which published the first stories based on Haugen’s documents. The Journal reported on additional internal reports showing that 32% of teen girls said Instagram made them feel worse about their bodies. Overall, 20% of teens told Instagram researchers that the app was making them feel worse about themselves. Facebook has dismissed nearly all of the reporting to emerge from Haugen’s leaks as overly sensational and a misunderstanding of Facebook’s efforts. Speaking to Wall Street analysts on the company’s earnings call last night, CEO Mark Zuckerberg said, “My view on what we are seeing is a coordinated effort to selectively use leaked documents to create a false picture about our company.”
That 2020 report on body image contains several other startling figures. It cited an earlier study that found 33% of respondents thought Instagram made their body image worse and that 66% of teen girls on Instagram reported having some type of body image issue. Nearly a quarter of those surveyed reported being bullied on Instagram and encountering some type of discrimination on the app. Almost 30% said Instagram made ending an in-person relationship worse, presumably because the app offered the ability to keep monitoring the other person.
While offering some possible solutions to the problem, the report also took the opportunity to highlight several ideas that researchers felt wouldn’t alleviate the issues. For example, researchers concluded that promoting positive captions wasn’t especially constructive: Doing so made some average-size users feel better but had little impact on overweight or thin users. They also concluded that attaching warning labels to harmful body-image content wasn’t helpful. This raises questions about the effectiveness of labels applied to content on other Facebook apps, a practice the company has increasingly turned to over the past two years. It has slapped labels on dangerous content related to the 2020 election, the pandemic and coronavirus vaccines, usually directing users to a portal of fact-checked information related to the topic.