Meta: Guilty of designing and selling addiction


For more than five years in this academic space, I have been denouncing the unethical and questionable behavior of the Meta corporation.

On March 25, 2026, a jury in Los Angeles did what the U.S. Congress never could: it confirmed what academics and journalists have been warning about for a decade. Meta is guilty of “intentionally designing and selling platforms that destroyed mental health.” The case, K.G.M. v. Meta Platforms, Inc. & YouTube LLC, is comparable in severity to the litigation that brought down the tobacco industry.

By these companies’ standards, the award was not large. Six million dollars, the compensation granted, is what Meta earns in just over an hour. But the amount matters little.

What matters is the legal precedent the verdict sets: for the first time, a jury treated social media platforms as defective products, applying the same legal logic used for decades against the tobacco industry. Behind this case are more than ten thousand pending lawsuits, with potential exposure that some experts estimate at hundreds of billions of dollars.

The court that dared to change history

By: Gabriel E. Levy B.

From the notorious Cambridge Analytica case, through the Facebook Papers and the revelations of Frances Haugen, I have brought to readers’ attention all kinds of questionable actions by this corporation, summarized in the articles “Facebook and Global Polarization” (June 2020), “The Moral Bankruptcy of Facebook” (October 2021), and “Facebook and Instagram: When the Goal Is Espionage” (October 2025).

These complaints earned me censorship attempts by Meta. While I was moderating a panel at Andinalink 2025, the corporation’s algorithm definitively blocked my personal WhatsApp, Instagram, and Threads accounts, as has happened to many journalists and media outlets around the world who have dared to ask questions. Perhaps the most notorious case is that of journalist Dave Kendall of the Kansas Reflector, whose accounts were blocked by Meta after a publication questioning Facebook’s algorithms.

A corporation with highly questionable practices

From interfering in elections to damaging the mental health of children, adolescents, and young people, Meta is a corporation with questionable ethical behavior. These are no longer mere accusations: they are the ruling of an independent United States court, after a trial with full guarantees in which the corporation deployed the best and most expensive lawyers in California to try to win a case that reveals it to be as questionable as the world’s big tobacco companies.

The verdict was issued in Los Angeles County Superior Court, under the direction of Judge Carolyn B. Kuhl.

The case was part of the Judicial Council of California’s coordination proceedings, a mechanism that groups hundreds of similar lawsuits and that designated this one as the bellwether trial: the one that sets the legal framework for evaluating all the others.

The plaintiff, identified as K.G.M. and called “Kaley” during the process, is currently 20 years old.

She filed her lawsuit at 17, along with her mother. Her story, which the jury heard over seven weeks, is at once exceptional and sadly common: she started using YouTube at age six and Instagram at age nine. Before finishing primary school she had already published 284 videos.

She spent up to 16 hours straight on Instagram in a single day. She developed severe anxiety, depression, body dysmorphia, self-harming behaviors, and suicidal thoughts.

She hid in the school bathrooms to check notifications. She gave up her hobbies.

She drifted away from her family. Today she works as a personal shopper at Walmart and lives with her mother.

The jury deliberated for nine days, accumulating more than 43 hours of discussion.

In its verdict forms, the jury answered affirmatively to seven key questions for each company, including: that they were negligent in the design of their platforms, that this negligence was a substantial factor in the harm caused, that they failed to adequately warn of the dangers, that they knew or should have known that their products were dangerous to minors, and that they acted with malice, oppression, or fraud, which enabled punitive damages.

The compensation was split into $3 million in compensatory damages and $3 million in punitive damages. The jury assigned 70% of the responsibility to Meta and 30% to Google.

The Addiction Mechanisms the Jury Found Defective

The legal strategy of the plaintiff team, led by attorney Mark Lanier, was surgical. Instead of attacking user-generated content, protected by Section 230 of the Communications Decency Act, they focused on product design.

They argued that the platforms functioned as a digital casino, and that the companies had borrowed behavioral and neurobiological techniques from slot machines and from the marketing of the cigarette industry.

Judge Kuhl validated that approach in November 2025 by distinguishing between publishing features, which are protected by law, and design features such as notification timing, engagement loops, and the absence of effective parental controls.

The jury identified nine specific mechanisms of deliberate addiction:

  • Infinite scroll: an endless feed that encourages prolonged use without natural stopping points.
  • Video autoplay: continuation of content without user intervention.
  • Push notifications: designed to repeatedly lure the user back to the platform.
  • Recommendation algorithms: which create loops of progressively more extreme content.
  • Beauty filters: which manipulate the user’s appearance, contributing to body dysmorphia.
  • Like counts and social validation metrics: which generate dopamine-based feedback loops.
  • Variable reward systems: which exploit the same neurological dopamine pathways as gambling.
  • Absence of effective parental controls: which allows minors to evade supervision.
  • Lack of real age verification: which allows children under 13 to create accounts freely.

Dr. Anna Lembke, an expert in addiction medicine at Stanford University and author of Dopamine Nation, testified that social media reward mechanisms activate the same neurological dopamine pathways as gambling and substance addiction, with adolescents being particularly vulnerable for having the prefrontal cortex still developing.

The internal tests that sank technology companies

The companies’ internal documents were the backbone of the case, and they connect directly to what Frances Haugen leaked in 2021.

During the trial, evidence was presented that revealed that the companies not only knew about the damages, but had calculated them and decided to ignore them.

An internal Meta memo read: “If we want to win big with teenagers, we must recruit them as tweens.”

Internal data showed that 11-year-olds were four times more likely to keep coming back to Instagram compared to competing apps.

A study called Project Myst found that minors who had experienced adverse effects were the most likely to become addicted, and that parents were powerless to stop it.

Meta’s internal communications compared the platform’s effects to “selling drugs and gambling.”

One Instagram employee wrote that they were “basically dealers.”

Mark Zuckerberg testified in person on February 18, 2026. He said keeping users safe was “always a priority” and acknowledged, “I always wish we’d gotten there sooner” with security tools.

Adam Mosseri, head of Instagram, stated that social media use can be “problematic” but not “clinically addictive.”

The 2021 Facebook Papers: The Prophecy That Was Fulfilled in a Court

There is something that makes this verdict more resonant than any other: we already knew what it was going to say. We have known since 2021, when Frances Haugen, a former Facebook product manager, leaked thousands of pages of internal documents that the Wall Street Journal published as “The Facebook Files.”

Those documents contained exactly what the California jury ended up convicting, five years in advance.

The most pertinent revelations were devastating: Instagram’s internal research concluded that “we worsened the body image problems of one in three teenagers”.

32% of teenage girls who felt bad about their bodies said Instagram made them feel worse.

In the UK, 17% of teenage girls reported that Instagram aggravated their eating disorders and 13.5% said it exacerbated their suicidal thoughts. More than 40% of teens felt they were “unattractive” after they started using the app.

Haugen testified before the U.S. Senate in October 2021 with a phrase that was etched in collective memory: “I’m here because I believe that Facebook’s products harm children, foment division, and weaken our democracy.” She explained that Facebook was “buying its profits with our security.” Years later, during the 2026 trial, she stated that tech executives cared about teenagers only as potential new users: “They were concerned about public perception, not the actual health of children.”

What in 2021 was a whistleblower’s solitary complaint became, in 2026, the basis of a historic verdict. The trial transformed those leaks into formal judicial evidence and gave them the weight the political system had refused to give them.

“The Big Tobacco Moment of Technology”: Reactions That Shook Washington and Wall Street

The most repeated expression after the verdict was inevitable. Senator Ed Markey declared, “Big Tech’s Big Tobacco moment has arrived.”

Senator Marsha Blackburn demanded passage of the Kids Online Safety Act.

Sen. Josh Hawley called for reforming Section 230, noting that ordinary jurors had done what Congress dared not do.

Congressman Jimmy Patronis introduced a bill to completely repeal the rule that for decades has shielded platforms from liability lawsuits.

Meta and Google announced that they will appeal.

Meta stated that “adolescent mental health is deeply complex and cannot be linked to a single app.” Google claimed that the case “misinterprets YouTube.” Their statements did not convince the market: Meta’s shares fell almost 8% on March 27 and another 2.4% the next day. Shares of Alphabet, Google’s parent company, fell 3% and then another 1.3%.

Jonathan Haidt, author of The Anxious Generation, warned that total exposure could reach hundreds of billions of dollars, enough to force real changes in corporate behavior. George Washington University law professor Mary Anne Franks summed it up accurately: “The era of impunity is over.”

The massive legal ecosystem behind the pilot case

The K.G.M. case is just the visible tip of a colossal legal iceberg.

A parallel federal multidistrict litigation, MDL No. 3047, in the U.S. District Court for the Northern District of California groups nearly 600 additional federal cases, including lawsuits from more than 250 school districts and actions by 41 state attorneys general plus the District of Columbia against Meta.

The first federal pilot trials are scheduled for June 15, 2026. In total, more than 3,000 lawsuits are pending against Meta, YouTube, Snapchat, and TikTok.

The comparison with the tobacco industry is not empty rhetoric.

The 1998 Tobacco Master Settlement Agreement cost tobacco companies $206 billion over 25 years and transformed an entire industry. The parallels are structural: both industries possessed internal research showing that their products caused harm, publicly downplayed the risks, designed products to maximize addictive potential, and deliberately targeted their marketing at younger people.

With one significant difference: it took more than 800 cases before the tide turned against the tobacco industry. In the case of technology, the very first bellwether trial has already produced a verdict against the companies.

TikTok and Snapchat reached confidential out-of-court settlements in January 2026, before the start of the trial.

Snap, with a market capitalization twenty times smaller than Meta’s, is particularly vulnerable: its “streaks” feature was specifically cited as an addictive mechanism.

The global context: an unprecedented regulatory wave

The Californian ruling does not occur in a vacuum. It is part of a global regulatory wave that has been building for years and has gained definitive momentum in recent months.

Australia banned children under the age of 16 from social media in December 2025. France passed a ban on children under 15.

The European Parliament voted in favor of a minimum age of 16 for the entire European Union and a ban on addictive features such as infinite scrolling and autoplay.

The UK’s Online Safety Act came into force in March 2025 with fines of up to 10% of global turnover. At least 42 countries are considering or have already passed age restrictions for social media.

In the United States, more than 45 states have proposed relevant legislation. California, New York, Florida, Georgia, Connecticut, Tennessee, and Nebraska have laws in place that restrict addictive features or require parental consent.

In January 2025, the Federal Trade Commission finalized a comprehensive reform of the Children’s Online Privacy Protection Act (COPPA), requiring explicit parental consent for advertising directed at minors.

A pattern that many academics have been denouncing

The pattern of concealment that the trial exposed is neither new nor accidental. In 2021, researcher Kelley Cotter of Pennsylvania State University published a study in the academic journal Information, Communication & Society documenting how Instagram systematically denies the existence of “shadowbanning” (the silent suppression of a post’s reach) despite the fact that its own technical infrastructure includes this capability, even registered as a corporate patent.

Cotter coined the concept of “black box gaslighting” to describe how the platform exploits its algorithmic opacity as a discursive weapon: faced with complaints from users experiencing drastic, unexplained drops in visibility, Instagram responded by denying, offering alternative explanations, and shifting responsibility onto the content creator.

The study showed that this strategy disproportionately affected critical and minority groups.

What Cotter described as a “formidable threat to accountability” was confirmed five years later in the Los Angeles court: Meta was not only removing content without saying so, but also suppressing its own internal research into the harm it caused.

There is still a long way to go

The verdict of March 25 is historic, but it would be naïve to read it as the closing of a story. It is, rather, the opening of a file that has been accumulating for decades. Addiction by design, algorithmic gaslighting, suppression of internal research: everything the Los Angeles jury condemned represents just the visible tip of an iceberg whose submerged part is considerably darker.

Massive-scale data manipulation, the monetization of disinformation, the use of psychographic profiles to modify consumer behavior and, above all, what many researchers consider the most serious of all, systematic interference in democratic processes around the world, all remain unsanctioned.

That story began with Cambridge Analytica, the scandal that in 2018 revealed how the data of up to 87 million Facebook users was extracted without consent and used to build highly accurate electoral profiles for the Brexit campaign. What seemed an isolated case spread like an oil slick: the same methodology (emotional microtargeting, exploitation of cognitive biases, algorithmic amplification of divisive content) was later documented in elections in Brazil, Mexico, the Philippines, India, Kenya, and Colombia, among dozens of other countries.

Meta’s platforms not only knew that their systems could be weaponized for electoral purposes; in several cases, they actively facilitated such operations in exchange for advertising contracts.

That chapter, the heaviest of the file, has not yet reached any court with the weight it deserves.

In conclusion, it is not the six million dollars that changes Silicon Valley. What changes it is that the ruling establishes, for the first time in a courtroom, that designing a platform to create addiction is a legally actionable form of negligence.

What Frances Haugen single-handedly denounced in 2021, a twelve-person jury turned into law in 2026. With more than 10,000 pending cases and potential exposure in the hundreds of billions, the tech industry is facing its tobacco moment. For Latin America, the signal is clear: regulation is no longer a possibility, it is an obligation.

Main sources consulted

NPR — Jury finds Meta and Google negligent in social media harms trial (March 25, 2026)

CNN Business — Meta and YouTube found liable in social media addiction trial (March 25, 2026)

CNBC — Jury in Los Angeles finds Meta, YouTube negligent in social media addiction trial (March 25, 2026)

NBC News — Jury finds Meta and YouTube negligent in landmark lawsuit on social media safety (March 25, 2026)

Bloomberg — Meta, Google Risk Big Tobacco-Like Fallout After Addiction Trial (March 26, 2026)

The Conversation — Meta and Google just lost a landmark social media addiction case. A tech law expert explains the fallout (2026)

The Conversation — Jury finds Instagram and YouTube addictive in lawsuit poised to reshape social media (2026)

Infobae — Meta and YouTube were declared negligent in case of social media addiction (March 25, 2026)

La Nación (Argentina) — Historic verdict: a California jury found Meta and YouTube liable (March 25, 2026)

El País de Cali — Jury in the US finds Meta and YouTube guilty of addiction in minors (2026)

TuNota Colombia — Meta faces million-dollar sentence for child addiction to social networks (2026)

The Spencer Law Firm — Social Media Addiction Lawsuits (2026): KGM Trial, MDL 3047, and TikTok & Snapchat Settlements Explained

Tech Insider — Meta Google Social Media Addiction Verdict 2026: $6M Ruling

Fox Business — Jury finds Meta, Google liable in landmark social media addiction trial, awards more than $6M (2026)

Human Rights Watch — Brazil Passes Landmark Law to Protect Children Online (September 2025)

Democracy Now! — Social Media Addiction: Facebook Whistleblower Says Big Tech Has Known & Ignored Problem for Years (February 2026)

Sen. Ed Markey (official release) — Statement on Social Media Addiction Trial Verdict (2026)

MIT Technology Review — Frances Haugen says Facebook’s algorithms are dangerous. Here’s why (October 2021)