Instagram is testing a feature that automatically detects and blurs nude photos in direct messages sent to teens.
Parent company Meta said the nudity protection feature will show warnings on incoming photos and display a message reminding users that they shouldn’t feel pressured to respond. Likewise, users will see messages encouraging kind, responsible behavior if they try to share sensitive images of their own.
The feature will be enabled by default worldwide for teens under 18, and adults will be encouraged to turn it on.
The social media giant frames Instagram nudity protection as a way to shield teens from sextortion and other forms of image abuse. People will be less likely both to see unwanted nudes and to fall for attempts to solicit intimate pictures, Meta added.
The company is also taking steps to prevent sextortion accounts from reaching users in the first place. DM requests from potential sextortion accounts now go directly to the hidden requests folder, so recipients won’t get notifications or otherwise have to see the content. Teens also won’t see the option to message one of these Instagram accounts, even if they’re connected, and will get safety alerts if they’re already mid-conversation.
An additional test hides these accounts in people’s follower, following, and like lists, and makes it more difficult for them to find teen accounts through search.
For those who’ve already engaged with accounts removed for sextortion, Instagram is testing pop-up messages that point users to an anti-sextortion information hub, support helplines, and tools to request takedowns. Child safety helplines are also being added to Instagram’s in-app reporting tools.
The additions follow previous efforts to limit harmful Instagram messages sent to teens. The social network already bars adults from messaging teens they aren’t connected to, and in January it limited DMs for under-16 users to their existing connections.
Meta is under significant pressure to crack down on sextortion and sexual abuse, and has been accused of being slow to respond to problems on Facebook and Instagram. Whistleblower Frances Haugen alleged that Meta often emphasizes user engagement over safety, supposedly implementing only minimal improvements.
Politicians and regulators have closely scrutinized Meta’s behavior in recent years. In addition to testimony before Congress from Haugen and more recent whistleblower Arturo Béjar, US officials have sued the company for allegedly harming teens’ mental health and put forward legislation to limit youth access. Florida, for instance, enacted a law in March requiring parental consent for teens joining social media services.