According to The Wall Street Journal, Instagram was found recommending sexually explicit Reels to teenagers as young as 13.
Over seven months ending in June 2024, the WSJ and Northeastern University professor Laura Edelson separately carried out tests that involved creating new Instagram accounts with the user age set to 13.
From the moment these accounts were created, moderately sexual videos began appearing in their feeds, with the first racy content arriving just three minutes after the account was opened. This content focused on women's bodies or showed sensual dancing.
When the accounts watched these videos, Reels served recommendations for progressively more explicit content, some arriving within 20 minutes of signup, including women mimicking sex acts and users promising nudes to anyone who commented.
The algorithm behind Instagram's Reels works in part by inferring a user's interests from the videos they spend the most time watching, then recommending similar content.
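To illustrate the general idea, here is a minimal sketch of watch-time-weighted, interest-based recommendation. This is not Meta's actual system, which is proprietary; the function names, data shapes, and scoring formula below are all hypothetical, chosen only to show how engagement can steer what gets recommended next.

```python
# Hypothetical sketch of interest-based video recommendation.
# NOT Meta's implementation: the topic tags and scoring here are
# assumptions used to illustrate how watch time shapes recommendations.
from collections import defaultdict

def build_interest_profile(watch_history):
    """Weight each topic by total watch time: the longer a user watches
    videos carrying a topic, the stronger that inferred interest."""
    profile = defaultdict(float)
    for video in watch_history:
        for topic in video["topics"]:
            profile[topic] += video["watch_seconds"]
    return profile

def score(video, profile):
    """Score a candidate by how much its topics overlap the profile."""
    return sum(profile.get(topic, 0.0) for topic in video["topics"])

def recommend(candidates, watch_history, k=3):
    """Rank candidate videos against the user's inferred interests."""
    profile = build_interest_profile(watch_history)
    return sorted(candidates, key=lambda v: score(v, profile), reverse=True)[:k]

# Toy usage: heavy watch time on one topic pulls similar videos to the top.
history = [
    {"topics": ["dance"], "watch_seconds": 120},
    {"topics": ["dance", "music"], "watch_seconds": 90},
]
candidates = [
    {"id": 1, "topics": ["dance"]},
    {"id": 2, "topics": ["cooking"]},
    {"id": 3, "topics": ["music"]},
]
print([v["id"] for v in recommend(candidates, history)])  # -> [1, 3, 2]
```

Because watched videos feed back into the profile, a loop like this can escalate quickly, which matches what the testers observed: lingering on mildly racy clips was enough to trigger recommendations for more explicit ones.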
Edelson and the WSJ ran the same tests on Snapchat and TikTok, but neither platform recommended sexual content to the teen accounts. Even searching for sexual content and following creators who produce it failed to trigger recommendations for racy videos. Edelson noted that TikTok's adult content appeared far less explicit than what Instagram Reels served to the teen accounts.
Meta’s Already Aware of the Instagram Problem
According to the WSJ, this is not the first time Meta has identified the issue. In 2021, the company's own safety team ran the same test with similar results. At the time, Meta spokesperson Andy Stone downplayed the findings, saying the experiment didn't reflect how teens actually use Instagram. He also claimed that Meta had "established" steps to reduce the mature content teens may encounter on the platform.
In 2022, the WSJ reviewed an internal Meta analysis showing that the social media giant has long known Instagram serves more adult content, such as gore and pornography, to younger users than to adults. According to the analysis, teens saw three times as many prohibited posts containing nudity as users over 30.
Meta’s guidelines state that sexually suggestive content should not be recommended to any users, regardless of age, unless it comes from accounts they’re already following.
Instagram’s Privacy Updates for Teens
In January, the company introduced privacy updates to protect younger users, including placing teen accounts on its most restrictive content control setting by default. Under these settings, teens under 16 shouldn't be shown any sexually explicit content at all.
In April, Instagram began testing a feature that automatically blurs nude photos sent to teens via DM. In May, the platform's Limits feature gained a default setting that restricts teens' interactions to Close Friends, muting other users to reduce bullying and harassment.
The experiments carried out by the WSJ and Edelson continued after these updates took effect, and the test accounts were still served explicit content, which may suggest that Meta still isn't doing enough to keep younger users safe.