EU Investigates Facebook, Instagram Over Possible Addictive Effects On Children

Key Takeaways

  • The EU is investigating Meta over possible addictive qualities in its social networks.
  • Facebook and Instagram may be "rabbit holes," according to regulators.
  • There are also probes into age checks and child safety.

The European Union has launched a formal investigation into whether Meta is fostering addictive behavior in children through Facebook and Instagram.

The European Commission was concerned that Meta’s algorithms and other systems not only “stimulate” addiction but also produce a “rabbit hole” effect that draws kids ever deeper into content. This risks violating children’s rights to mental and physical health, the Commission said.

Officials were also concerned that Meta’s limits on child access, including age checks, might not be “reasonable, proportionate and effective.” They likewise wanted to be sure that Meta was meeting privacy and safety requirements for child accounts.

If the Commission’s investigation finds fault, Meta could be in violation of as many as three sections of the Digital Services Act (DSA). The EU law requires that online intermediaries like Meta both remove content that harms children and limit the collection of their data.

In a statement, a Meta spokesperson said the social media giant wanted children to have “safe, age-appropriate” experiences and had over 50 policies and tools to guard them. The company added it “look[s] forward” to sharing what it does with the Commission.

There have been longstanding worries that social network algorithms from Meta and other companies may encourage addiction by surfacing the most attention-grabbing content rather than taking a more neutral approach. And while Facebook and Instagram officially require that users be at least 13 years old to sign up, it can be relatively easy to dodge that requirement.

The formal probe came just weeks after the Commission began an investigation into Meta over its approach to fighting election disinformation. The regulator feared that Meta didn’t have strong enough moderation to prevent advertisers from deceiving voters.

There’s no guarantee this latest case will lead to action against Meta. However, the Commission warned that it could crack down if it finds DSA violations, including by imposing “interim measures” and securing commitments to fixes. Meta might have to alter its algorithms and child safeguards.