The US Federal Trade Commission (FTC) has determined that social media giants are engaged in “vast surveillance” of users that threatens privacy and safety, particularly for children.
The staff report found that Meta (owner of Facebook, Instagram, and WhatsApp), TikTok, X, YouTube, Reddit, Snapchat, and Twitch all gather more data than necessary and engage in “broad” data sharing that raises concerns about how that information is handled and overseen.
All the companies’ approaches to gathering, minimizing, and deleting data were “woefully inadequate,” according to the FTC. They would keep some info indefinitely, including material they’d obtained from data brokers. In some cases, the platforms didn’t delete all user data after a request.
The investigators also concluded that social media firms’ monetization of the data was at odds with user privacy, such as hidden pixels on websites that track user activity for ads. Users often had “little or no way” to opt out of data collection for automated systems like feed algorithms or AI training.
None of the social media platforms did enough to protect the privacy and safety of children and teens, the FTC argued. While many of the companies insisted there were no kids on their platforms, they frequently treated teens like adults and let those younger users onto their services with no account restrictions.
The practices could lead to market dominance and “harmful” practices, the Commission said. That, in turn, would leave users with limited alternatives if they don’t like a given service’s policies.
The FTC made several recommendations to companies, politicians, and regulators. It called on Congress to pass federal privacy legislation that limits surveillance and grants people data rights. The agency also urged companies to reduce data collection and sharing, adopt firm policies that are clear to users, and give people more control over how their data gets used.
Social media heavyweights were also told to not “ignore the reality” that children use their platforms, and that the Children’s Online Privacy Protection Act (COPPA) should serve as the bare minimum for child safety, with more efforts needed. Congress should pass a federal law that protects privacy for teens over 13, the FTC added.
We’ve asked Discord, Meta, Reddit, Snap, TikTok, Twitch, and YouTube for comment, and will let you know if we hear back. X (formerly Twitter) no longer has a communications team.
The FTC report is based on December 2020 orders for information from the companies about their data collection, use, and sharing policies.
However, a lot has changed in the ensuing four years. In 2021, Meta whistleblower Frances Haugen shared reams of documents detailing the social media giant’s approaches to privacy and safety, raising concerns that it was putting people (especially children) at risk. That and broader pressure from legislators, researchers, and critics led Meta and other companies to rethink their strategies.
At least some of those companies have since tightened their privacy controls and limited what children and teens can do. Just two days prior to the report, Instagram launched teen accounts that have strict limits enabled by default. As such, the FTC might be making some recommendations that have already been addressed.
There are still issues, however. Age verification has become a sore point, for instance. Some state governments have pushed for laws requiring age checks, but critics and judges have blasted them for allegedly violating free speech. The FTC findings could spur federal laws that govern child access and other areas that haven’t been fully addressed.