California Attorney General Rob Bonta, leading a bipartisan coalition of 33 attorneys general, has filed a federal lawsuit against Meta Platforms, Inc., alleging the company designed harmful features on Instagram and Facebook that adversely affect children's and teens' mental and physical well-being.
All eyes will be on how the tech giant defends itself, especially considering that six years ago, Facebook told advertisers it could identify when teens were feeling ‘insecure’ and ‘worthless.’
Filed in the U.S. District Court for the Northern District of California, the lawsuit accuses Meta of violating federal and state laws such as the Children’s Online Privacy Protection Act (COPPA) and California’s False Advertising Law (FAL).
The complaint outlines Meta’s focus on maximizing young users’ screen time and employing psychologically manipulative features. It also accuses the company of misleading the public about the safety of these features and failing to adequately address the harm caused by platforms designed to keep kids trapped in the endless scroll.
And Meta is hardly alone in facing claims that it maximizes engagement by exploiting emotions for eyeballs, so other platforms like Twitter and TikTok should watch the case carefully.
Connectivity and Isolation in the Digital Age
A quick look inside the windows of any family home will often reveal both kids and adults scrolling through social media feeds in a trance-like state. Families often sit in the same room, each member engrossed in a digital world of their own, creating a new form of isolation that ironically occurs in shared physical spaces.
Parents may lament their children’s tech addiction, but they too should look in the mirror. The “infinite scroll” is not just a teenage phenomenon; adults are equally guilty, often setting the wrong example for their kids.
The root of the issue extends beyond generational gaps to the design of social media platforms themselves. This is why Meta stands accused of intentionally crafting addictive, psychologically manipulative features that keep users, young and old, hooked to their screens.
The influence of social media on mental well-being varies drastically, especially among younger users navigating critical developmental stages. Research has found that girls between 11 and 13 and boys between 14 and 15 experience a notable decline in life satisfaction following frequent social media use.
Unsurprisingly, adolescents in a sensitive period of brain development are more susceptible to the mental health risks posed by social media. Their brains, particularly the amygdala and prefrontal cortex, undergo changes that can increase sensitivity to social rewards and punishments, amplifying the emotional highs and lows experienced online. The heightened vulnerability of children makes the debate on ethical tech design and parental supervision more urgent than ever. But how did we get to this point?
The journey from B.F. Skinner’s behaviorist experiments to today’s social media addiction is a compelling narrative of how psychological principles have been deployed to capture and retain human attention. Skinner’s box, designed to control animal behavior through rewards and punishments, laid the foundation for a multibillion-dollar attention economy.
Fast forward to today, and we find a disturbing resemblance between Skinner’s work and the shiny slot machines in Vegas. These gambling mechanisms operate on a random reinforcement schedule, doling out small rewards at unexpected intervals to maintain user engagement—similar to how social media platforms like Facebook issue intermittent “likes” to keep users coming back.
The “pull-to-refresh” function and endless scrolling on social media are, in essence, digital slot machine levers designed to trigger the same addictive loop that keeps gamblers rooted to their seats. The very design of social media platforms is rooted in this addiction paradigm. In today’s digital economy, revenue is a direct function of sustained user engagement, translating to more clicks, more time spent on the platform, and more data for targeted advertising.
Facebook’s notification system, for example, is a digital Skinner Box. Likes and comments are issued not according to any set ratio but in a way that keeps users engaged and anxiously awaiting the next hit of social validation.
The architecture of these platforms—replete with “bright dings of pseudo-pleasure” as described by Justin Rosenstein, the creator of Facebook’s “like” button—is tailored to exploit human vulnerability to sustain consumer attention.
When users disengage, notifications and offers entice them back, essentially ‘peppering’ them back into the cycle of addiction. The ramifications of this engineered addiction extend beyond wasted time; they contribute to mounting concerns about mental well-being across all age groups.
Why Focusing Solely on Meta Overlooks the Growing Concern of TikTok Addiction
Although Meta is in the spotlight, there is an inconvenient truth: children are leaving Zuckerberg’s apps and spending more and more time on TikTok. The video-sharing platform employs a highly personalized, algorithm-driven system that feeds users content based on their preferences and activity, creating a virtually irresistible feedback loop that encourages, and practically enforces, continual engagement with the app.
If you have ever lost an hour of your time scrolling through TikTok, make no mistake: this was not an accident. Given the array of potential adverse impacts on mental health—diminished attention span, increased vulnerability to cyberbullying, and exposure to inappropriate content—the risks are exceptionally high for vulnerable groups like children and teenagers. TikTok addiction is therefore not merely a function of excessive screen time; it is a complex, multi-faceted issue requiring nuanced understanding and multi-disciplinary intervention strategies.
The divergent approaches to content on TikTok’s Chinese version, Douyin, and its international counterpart are also striking. Douyin serves educational and cultural needs, limiting children to 40 minutes of screen time and offering enriching content like science experiments and museum tours.
The international version of TikTok has raised red flags for its potential geopolitical influence, a concern highlighted by FBI Director Chris Wray, who warns that the app could serve as a tool for psychological operations against the U.S.
Is this disparity in content moderation and intent merely coincidental, or does it suggest a calculated dual strategy from a platform ultimately answerable to a government whose objectives are at odds with those of Western democracies?
A Developing Story of Mental Health Risks and Rewards
Notable spikes in anxiety and depression are alarming experts, prompting calls for actionable steps. New York has already proposed legislation that aims to curb the adverse effects of social media on young minds by offering parents the tools to opt their children out of algorithmic manipulation. The bill also introduces measures to limit screen time and enforce a digital curfew.
However, social media’s impact isn’t one-dimensional. Its effects on mental health are much more complex and nuanced, varying across individual characteristics and developmental stages. It’s essential to recognize that this is not just a developing story about protecting the mental health of our children. It also represents a wake-up call for everyone to re-examine and reconfigure the role of social media in their lives.
In doing so, we collectively challenge tech companies and ourselves to prioritize well-being over the allure of endless scrolling. Both parents and children must engage in this collective awakening, recognizing the need for healthier online habits and a more ethical digital ecosystem.
The mantra of “move fast and break things” has disrupted traditional business models. But it has also left a lingering impact on the mental well-being of society, particularly our vulnerable youth. As we grapple with lawsuits like the one led by California Attorney General Rob Bonta against Meta, we must acknowledge that social media platforms, designed to keep us endlessly scrolling, are not mere harmless indulgences but potent tools that can manipulate human psychology.
Platforms such as Facebook, Instagram, and TikTok have evolved into digital Skinner Boxes, exploiting our primal urges for social validation and rewards while leaving a trail of mental health issues, susceptibility to cyberbullying, and exposure to divisive or extremist content.
By leveraging algorithmic manipulation, the architecture of these platforms opens the door to targeted propaganda that can have profound implications, especially for young minds still in critical stages of development.
Social media’s utility and ubiquity have been mistaken for necessity, clouding our judgment in recognizing its inherent risks. Activating Monk Mode is a great start. But it’s time for a collective re-evaluation. The ethical responsibility for protecting mental health doesn’t solely rest on tech companies; it also requires active parental involvement and governmental oversight.
As more states, nations, and continents propose new legislation to curb algorithmic manipulation and reduce screen time, the imperative becomes clear: we must recalibrate our relationship with social media platforms to prioritize well-being over engagement metrics. Parents and children alike must become conscious digital citizens, educated not just in the functionality but in the ethics of these platforms, demanding a digital ecosystem that serves humanity, not the other way around.
What does this mean for the future of social media? It’s hard to predict the future when new platforms, from Clubhouse to Threads, so quickly fall out of favor with users. But the belated breakup and regulation of social media apps comes at a time when users of all ages are already falling out of love with promoting a fake version of their lives online, chasing viral posts, and risking being called out for saying something inappropriate. It’s exhausting, and everyone can see through the BS.
The good news is that younger users are not interested in self-censoring in digital spaces frequented by their parents and grandparents. They also want to avoid being bombarded with marketing messages from brands and influencers.
Social media as we know it is already dead, and users will increasingly hang out in group chats far away from algorithms and echo chambers, and that can only be a good thing. For these reasons alone, social media has far more to worry about than being sued by 33 states.