In the United States, dozens of states, school districts, and families have filed lawsuits accusing major social media companies of knowingly designing platforms that harm the mental health of users, particularly children and teenagers. In late 2023 and early 2024, Meta Platforms and TikTok faced mounting legal pressure over claims that their products promote compulsive use, worsen anxiety and depression, and keep young users engaged far longer than they intend.
These cases argue that the harm is not caused by isolated content or bad actors, but by the platforms themselves. Plaintiffs point to internal research, product testing, and design features that allegedly show companies were aware of the risks, yet continued to roll out systems that encourage constant scrolling, repeated checking, and prolonged screen time. Some lawsuits liken these design strategies to tactics historically used by gambling and tobacco industries, where profit depended on habitual consumption.
While outcomes vary — TikTok, for example, has pursued settlements in some cases — the lawsuits collectively signal a shift. Regulators and courts are no longer asking whether social media can be addictive in a casual sense. They are asking whether addiction-like behavior was anticipated, engineered, and monetized.
That question leads directly to the business model behind these platforms.
The system behind the scroll
To understand these lawsuits, it helps to understand the business model they are targeting.
Social media platforms operate in what is often called the attention economy. In simple terms, the longer users stay on a platform, the more ads they see — and the more money the platform makes. Engagement is not just a metric; it is the product.
This creates a powerful incentive structure. Platforms are rewarded for keeping users scrolling, watching, and reacting for as long as possible. Features like infinite scroll remove natural stopping points. Recommendation algorithms learn what content triggers emotional responses — excitement, outrage, validation — and serve more of it. Notifications arrive at carefully timed moments, nudging users to return even when they did not intend to.
In this context, excessive use is not a glitch. It is a predictable outcome.
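To make that incentive structure concrete, here is a minimal Python sketch of an engagement-ranked endless feed. It is illustrative only, not any platform's actual code: the item fields, the scoring weights, and the outrage_score signal are all invented for the example.

```python
# Illustrative sketch only: a toy engagement-ranked "infinite" feed.
# All fields and weights are invented; this is no platform's real code.
import heapq
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    predicted_watch_time: float   # model estimate, in seconds
    predicted_reactions: float    # expected likes, shares, comments
    outrage_score: float          # proxy for a strong emotional response

def engagement_score(item: Item) -> float:
    # The objective is time-on-platform, not well-being: emotionally
    # charged content ranks higher because it holds attention longer.
    return (0.6 * item.predicted_watch_time
            + 0.3 * item.predicted_reactions
            + 0.1 * item.outrage_score)

def infinite_feed(candidates: list[Item]):
    # "Infinite scroll": the generator never signals a stopping point;
    # it just keeps yielding the next highest-scoring item.
    heap = [(-engagement_score(i), i.item_id, i) for i in candidates]
    heapq.heapify(heap)
    while heap:
        _, _, item = heapq.heappop(heap)
        yield item
```

In a real system the scores would come from trained models rather than fixed weights, but the objective function is the point: when watch time and reactions are what get maximized, prolonged use is the design target, not a side effect.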
Why the lawsuits use the word "addiction"
Legally, the term addiction carries weight. It signals loss of control, harm, and corporate responsibility. Sociologically, however, the more precise claim is that these platforms are engineered to be habit-forming.
Users are not simply failing at self-control. They are interacting with systems designed to exploit well-documented psychological tendencies, such as variable rewards and social validation. When a user says they planned to scroll for five minutes and lost an hour, that experience is not unusual — it is expected.
This is why the lawsuits focus on internal company research, product testing, and design decisions. The argument is not that people should never use social media, but that companies knowingly built systems that encourage compulsive use, especially among minors, while downplaying or obscuring the risks.
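The "variable rewards" mechanism mentioned above is simple enough to simulate. The toy sketch below, with an invented reward_probability parameter, shows the slot-machine-like pattern: each check of the app may or may not pay off, and it is that unpredictability, not the reward itself, that behavioral research links to compulsive checking.

```python
# Illustrative sketch only: a toy variable-ratio reward schedule,
# the reinforcement pattern the lawsuits compare to gambling.
import random

def pull_to_refresh(reward_probability: float = 0.3) -> bool:
    # Each refresh MAY deliver new likes or messages. Intermittent,
    # unpredictable rewards sustain a behavior more strongly than
    # guaranteed ones, which is why the pattern is habit-forming.
    return random.random() < reward_probability

def simulate_session(checks: int = 20, seed: int = 42) -> None:
    random.seed(seed)
    for n in range(1, checks + 1):
        hit = pull_to_refresh()
        print(f"check {n:2d}: {'new notifications!' if hit else 'nothing yet'}")

if __name__ == "__main__":
    simulate_session()
```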
From personal responsibility to corporate accountability
For years, public discussion around social media harm has centered on individual behavior: screen time limits, parental controls, and digital detoxes. These lawsuits challenge that framing.
If platforms profit from prolonged engagement, and if that engagement is driven by design features that exploit attention, then responsibility cannot rest solely on users or parents. The legal question becomes whether companies should be held accountable for creating environments that make disengagement difficult by design.
Some proposed remedies go beyond financial penalties. They include greater transparency around algorithms, design changes that introduce friction instead of endless flow, and stronger protections for younger users. Whether courts will mandate such changes remains uncertain, but the cases mark a shift in how social media harm is understood.
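As a rough illustration of what "friction by design" could look like, the sketch below reinserts a stopping point into an otherwise endless feed. The threshold and message are invented; in practice such values would be set by product teams or regulators, not hard-coded.

```python
# Illustrative sketch only: "friction by design" as a remedy.
# After an invented threshold, the feed yields a deliberate break
# prompt instead of flowing straight into the next piece of content.
ITEMS_BEFORE_PROMPT = 30  # hypothetical threshold

def feed_with_friction(items):
    served_since_prompt = 0
    for item in items:
        if served_since_prompt >= ITEMS_BEFORE_PROMPT:
            # Restore the natural stopping point that infinite scroll removes.
            yield {"type": "break_prompt",
                   "message": "You've been scrolling for a while. Keep going?"}
            served_since_prompt = 0
        yield {"type": "content", "item": item}
        served_since_prompt += 1
```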
Why this matters locally
In countries like the Philippines, where mobile-first internet use is widespread and young people make up a large share of social media users, these issues are not abstract. Platforms like TikTok and Facebook are deeply embedded in everyday life, used for entertainment, socialization, and even news consumption.
As global lawsuits question the ethics of attention-driven design, they also raise an uncomfortable possibility: that many users are navigating digital spaces optimized for profit rather than well-being, in regulatory environments that are still catching up.
These cases may not dismantle the attention economy overnight. But they do signal a growing recognition that what feels like a personal habit may, in fact, be a carefully engineered system, one that is finally being asked to explain itself in court.