Adam Mosseri, the head of Meta's Instagram, testified Wednesday during a landmark social media trial in Los Angeles that he disagrees with the idea that people can be clinically addicted to social media platforms.
The question of addiction is a key pillar of the case, in which plaintiffs seek to hold social media companies responsible for harms to children who use their platforms. Meta Platforms and Google's YouTube are the two remaining defendants after TikTok and Snap settled.
At the core of the Los Angeles case is a 20-year-old identified only by the initials “KGM,” whose lawsuit could determine how thousands of similar lawsuits against social media companies would play out. She and two other plaintiffs have been selected for bellwether trials — essentially test cases that let both sides see how their arguments fare before a jury.
This case is seen by some as social media’s “Big Tobacco” moment, in which the product could be directly tied to addiction. Meta strongly pushes back against that connection. A spokesperson wrote in a statement to Scripps News, "The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff’s mental health struggles. The evidence will show she faced many significant, difficult challenges well before she ever used social media.”
Mosseri, who has headed Instagram since 2018, said it’s important to differentiate between clinical addiction and what he called problematic use. The plaintiff's lawyer, however, presented quotes from a podcast interview a few years ago in which Mosseri used the term addiction in relation to social media use. Mosseri responded that he was probably using the term "too casually,” as people tend to do.
When questioned about his qualifications to comment on the legitimacy of social media addiction, Mosseri said he was not claiming to be a medical expert, but added that someone “very close” to him has experienced serious clinical addiction, which is why he said he was “being careful with my words.”
He said he and his colleagues use the term “problematic use” to refer to “someone spending more time on Instagram than they feel good about, and that definitely happens.”
It’s “not good for the company, over the long run, to make decisions that profit for us but are poor for people’s well-being," Mosseri said.
Mosseri and the plaintiff's lawyer, Mark Lanier, engaged in a lengthy back-and-forth about cosmetic filters on Instagram that changed people’s appearance in a way that seemed to promote plastic surgery.
“We are trying to be as safe as possible but also censor as little as possible," Mosseri said.
In the courtroom, bereaved parents whose children struggled with social media appeared visibly upset during a discussion of body dysmorphia and cosmetic filters. Meta shut down all third-party augmented reality filters in January 2025. After the displays of emotion, the judge reminded members of the public on Wednesday not to indicate agreement or disagreement with testimony, saying it would be "improper to indicate some position.”
Several parents held a vigil this week outside the courthouse, explaining that while the case isn’t specifically about each of their children, they wanted to be present so the social media platforms could be held accountable. Some firmly believe their loved ones would still be alive if Meta had adjusted its design practices. In a joint statement, the parents referred to social media platforms as a “trap in a space far more dangerous than anything we would ever allow in the real world.”
During cross-examination, Mosseri and Meta lawyer Phyllis Jones sought to push back on the idea, suggested by Lanier in his questioning, that the company looks to profit off of teens specifically.
Mosseri said Instagram makes “less money from teens than from any other demographic on the app,” noting that teens don’t tend to click on ads and many lack disposable income to spend on advertised products. When he questioned Mosseri a second time, Lanier was quick to point to research showing that people who join social media platforms at a young age are more likely to stay on them longer, which he said makes teen users a prime source of meaningful long-term profit.
“Often people try to frame things as you either prioritize safety or you prioritize revenue,” Mosseri said. “It’s really hard to imagine any instance where prioritizing safety isn’t good for revenue.”
Meta CEO Mark Zuckerberg is expected to take the stand next week.
In recent years, Instagram has added a slew of features and tools it says have made the platform safer for young people. But those measures have not always worked as intended. A report last year, for instance, found that teen accounts created by researchers were recommended age-inappropriate sexual content, including “graphic sexual descriptions, the use of cartoons to describe demeaning sexual acts, and brief displays of nudity."
Instagram also recommended a “range of self-harm, self-injury, and body image content” to the teen accounts that the report says “would be reasonably likely to result in adverse impacts for young people, including teenagers experiencing poor mental health, or self-harm and suicidal ideation and behaviors.” Meta called the report “misleading, dangerously speculative” and said it misrepresents the company’s efforts on teen safety.
Meta is also facing a separate trial in New Mexico that began this week. That case differs from the Los Angeles lawsuit, centering on the allegation that Meta misrepresented the safety of its platforms and engineered algorithms to keep young people online while knowing children face risks, including sexual exploitation.
Prosecutors in New Mexico say they will present evidence that roughly 500,000 inappropriate interactions with children take place daily on Meta’s platforms and argue the company does not adequately track those interactions.
“The state cannot win this case by showing there is bad content on Facebook and Instagram,” a Meta attorney told jurors. “You must instead focus on whether Meta disclosed risks to users … and the evidence will show that Meta did disclose that.”
Meta has also taken issue with the state’s investigative methods, saying prosecutors created fake accounts with real images of minors as part of their probe. In a statement, a Meta spokesperson said the company is focused on demonstrating its “longstanding commitment to supporting young people,” pointing to the launch of teen accounts with built-in protections and expanded parental tools.
Together, the cases represent mounting legal scrutiny of social media companies and could set important precedent for thousands of similar cases as courts weigh where responsibility lies for the mental health impacts of digital platforms on young users.