She sits in a darkened room, face illuminated only by the constantly changing colors and flashing lights of the screen. She has lost all sense of time. The occasional “ding” gives her the momentary dopamine hit she’s been chasing for hours, but still she craves more. So she keeps going, hoping for the big payout she knows is coming if she just sticks with it a little bit longer.
She isn’t sitting on a casino stool, though. She’s on her couch, lost in an endless scroll engineered with the very same psychological tricks that keep gamblers hooked.
The vibrant colors and stimulating sounds designed to overwhelm and light up sensory receptors in the brain; the likes, comments, shares, and follows that act as emotional currency and generate constant micro‑rewards; the infinite scroll and frictionless interfaces that ensure there is never a natural stopping point; the quiet erasure of time awareness by feeding users an endless stream of algorithmic content. And the false sense of privacy assuring her that no one truly knows how much she’s investing (and, for most, losing) in the effort.
In casinos and social media alike, these design choices are not accidental. They are engineered, using behavioral science and psychology, to capture users and keep them online for hours, each engagement registering as money in the bank for the companies that designed them.
This is essentially the argument attorneys representing 19-year-old “KGM” are presenting to the jury in a bellwether case whose opening statements began Monday in Los Angeles.
KGM’s lawsuit against Meta (parent company of Facebook and Instagram) and Google (parent company of YouTube) argues that these tech giants intentionally designed their platforms to addict and harm children. Internal documents cited in opening statements allegedly show that Meta and YouTube saw young children as target audiences. TikTok and Snapchat both settled with the plaintiff before the case went to trial.
This is the first case of its kind to reach a jury; earlier attempts to sue social media platforms were dismissed under Section 230 of the Communications Decency Act. This case was allowed to proceed because the companies are being sued not over the content they curate or allow, but over the claim that they engineered engagement-driven features, such as infinite scroll and “like” buttons, specifically to exploit the psychological vulnerabilities of minors.
Both companies deny wrongdoing, pointing to safety tools and policy changes over the years.
The stakes extend far beyond one young woman’s story. This trial will help shape how thousands of similar lawsuits proceed nationwide, as states, school districts, and families seek to redefine what legal responsibility Big Tech bears for youth mental health and online safety.
Executives, including Mark Zuckerberg, are expected to testify, and the outcome could determine whether tech firms can continue relying on First Amendment and Section 230 shields, or whether courts will carve out new liability for product design itself. Meanwhile, other states and countries are already moving ahead with new restrictions on youth social media use. Whatever the jury decides, this case signals a turning point: the question is no longer whether social media harms kids, but who should be held accountable and what reforms should follow.