
Instagram’s top executive rejects claims that the platform is “clinically addictive” during a historic court battle

by Justin

In a closely watched courtroom showdown, Adam Mosseri, the head of Instagram, took the stand to defend the social media giant against allegations that its platform is designed to hook young users. His testimony marks a pivotal moment in the first trial of its kind against Instagram’s parent company, Meta.

The case, brought by a woman identified as Kaley, now 20 years old, accuses Instagram and YouTube of deliberately creating addictive features that harmed her mental health. The lawsuit is the first of more than 1,500 similar claims to reach trial — potentially setting the tone for how courts evaluate the responsibility of social media platforms in cases involving young users.


“Clinically Addictive”? Mosseri Says No

At the heart of the courtroom debate was a simple yet powerful question: Can someone be “clinically addicted” to Instagram?

Mosseri’s answer was clear — he does not believe so.

While acknowledging that people can engage in what he described as “problematic use,” Mosseri drew a comparison to watching television longer than one intends. “It’s relative,” he told the court, adding that individuals may sometimes spend more time on the app than feels healthy — but that doesn’t make it a clinical addiction.

He also noted that he is not a medical professional.

This testimony forms a central piece of Meta’s defense in a historic court battle that could redefine accountability in the tech industry.


A Landmark Case With National Attention

The lawsuit is being closely followed not only because of its claims, but because of what it represents. If the jury sides with the plaintiff, it could open the door for social media companies to face broader liability over alleged mental health harms.

Kaley began using Instagram at just nine years old — despite the app’s minimum age requirement of 13. Her legal team argues that the company knowingly built features aimed at maximizing engagement among young users.

Among those features:

  • Infinite scroll
  • Autoplay videos
  • The “like” button
  • Beauty filters

According to the plaintiff’s attorney, Mark Lanier, these tools create a feedback loop that encourages compulsive behavior. He compared the “like” button to a “chemical hit” that teens seek repeatedly for validation.

Kaley’s lawsuit also alleges that Instagram’s beauty filters contributed to body image issues and that she experienced bullying and sextortion on the platform.


Beauty Filters Under Scrutiny


A significant portion of the testimony focused on Instagram’s facial filters — particularly those that alter a person’s appearance.

Internal company emails from 2019 revealed that executives had debated banning filters that distort facial features. One message reportedly noted that experts were “unanimous on the harm.”

Initially, Instagram decided to remove filters that dramatically altered faces. Later, the company reversed course on some of those restrictions. Filters simulating plastic surgery effects — such as adding surgical scars — were banned. However, filters that subtly enhanced features, like fuller lips or a slimmer nose, were allowed to remain, though no longer actively promoted.

At the time those policy decisions were being made, Kaley was 14 years old.

The plaintiff’s legal team argues these choices prioritized growth and competitiveness — particularly in international markets — over the well-being of teenage users.


Profits vs. Protection?

Another tense moment in court came when Lanier questioned Mosseri about his compensation.

Mosseri testified that his base salary is around $900,000 per year, though his total compensation — including bonuses and stock — can exceed $10 million or even $20 million in strong years.

Lanier suggested that product decisions, including maintaining certain filters, may have been influenced by business growth goals. Mosseri rejected that implication, stating he was not concerned with how such decisions might affect the company’s stock price.

He also addressed an internal research initiative known as “Project Myst,” which reportedly examined how young users experiencing adverse effects interacted with the platform. Mosseri said he supported research efforts but did not recall specific findings from the study.


The Broader Context: Past Controversies

The trial also revived memories of earlier controversies involving Instagram’s impact on teens.

In 2021, whistleblower Frances Haugen released internal documents suggesting the company was aware Instagram could negatively affect teenage girls, particularly around body image issues.

That same year, Mosseri testified before the U.S. Senate, stating he supported stronger online safety regulations and committed to improving protections for younger users.

Since then, Instagram has introduced “Teen Accounts” with enhanced privacy settings and default content restrictions. The company has also begun rolling out AI-based age verification tools aimed at identifying underage users who falsify their birthdates.

Still, critics argue these measures came too late.


Families Demand Accountability

Outside the courthouse, the emotional stakes of the trial were on display. Parents who say they lost children due to harms connected to social media gathered overnight hoping to observe the proceedings.

One parent, whose teenage daughter died after being connected through Instagram to someone who later provided her with a fentanyl-laced pill, told reporters they would continue fighting for change.

Their presence underscores why this case carries such weight: it represents not just a legal argument, but a deeply personal battle for many families.


A Legal Wall: Section 230

Much of the courtroom strategy is shaped by Section 230 — a federal law that protects technology companies from being held liable for user-generated content.

Because of that law, the jury may not hear extensive arguments about specific posts or messages Kaley encountered on Instagram. Instead, the focus remains on whether the platform’s design itself — its structure and features — played a substantial role in her mental health struggles.

Meta’s legal team has argued that Kaley faced significant personal challenges before ever using social media, and that Instagram was not the primary cause of her difficulties.


What Happens Next?

As testimony continues, the outcome of this trial could have far-reaching consequences for social media platforms across the globe.

If the jury determines that Instagram’s design substantially contributed to harm, it could reshape how tech companies approach engagement features, youth safety tools, and internal research transparency.

For now, one thing is clear: Instagram’s top executive rejects the claim that the platform is “clinically addictive.” Whether that defense persuades the jury, and what it means for the future of social media regulation, remains to be decided.
