33 U.S. states sue Facebook and Instagram owner for making social media addictive to kids

Lawsuit alleges Meta has features aimed at teens despite knowing harm they cause

Indian social media influencer Malini Agarwal smiles as she checks social media in the back seat of a car. A lawsuit launched in the U.S. on Tuesday alleges that the owner of Facebook and Instagram knowingly builds features into its products that are designed to keep young people using them, even though they are harmful to their mental health. (Vivek Prakash/Reuters)

Thirty-three states, including California and New York, are suing Meta Platforms Inc. for harming young people's mental health and contributing to the youth mental health crisis by knowingly designing features on Instagram and Facebook that cause children to become addicted to its platforms.

The lawsuit, filed in federal court in California, also claims that Meta routinely collects data on children under 13 without their parents' consent, in violation of federal law.

"Kids and teenagers are suffering from record levels of poor mental health, and social media companies like Meta are to blame," New York Attorney General Letitia James said. "Meta has profited from children's pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem."

The broad-ranging civil suit is the result of an investigation led by a bipartisan coalition of attorneys general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee and Vermont.

"Research has shown that young people's use of Meta's social media platforms is associated with depression, anxiety, insomnia, interference with education and daily life, and many other negative outcomes," the complaint said.

The lawsuit follows damning newspaper reports, first by the Wall Street Journal in the fall of 2021, based on Meta's own research, which found that the company knew about the harms Instagram can cause teenagers, especially teen girls, when it comes to mental health and body-image issues. One internal study cited 13.5 per cent of teen girls saying Instagram makes thoughts of suicide worse and 17 per cent of teen girls saying it makes eating disorders worse.

Following the first reports, a consortium of news organizations, including The Associated Press, published their own findings based on leaked documents from whistleblower Frances Haugen, who has testified before the U.S. Congress and a British parliamentary committee about what she found.

WATCH | Facebook whistleblower says governments need to rein in Big Tech: 

Former Facebook data scientist asks Congress to intervene in social media company’s actions

Former Facebook data scientist-turned-whistleblower Frances Haugen urged U.S. lawmakers to intervene in the social media giant's operations. Speaking before a Senate panel, Haugen outlined how Facebook knew its products and algorithms were steering users toward dangerous and toxic content, yet did nothing about it.

The use of social media among teens is nearly universal in the United States and many other parts of the world. Up to 95 per cent of youth aged 13 to 17 in the U.S. report using a social media platform, with more than a third saying they use social media "almost constantly," according to the Pew Research Center.

To comply with federal regulation, social media companies ban kids under 13 from signing up to their platforms — but children have been shown to easily get around the bans, both with and without their parents' consent, and many younger kids have social media accounts.

Loose enforcement

Other measures that social platforms have taken to address concerns about children's mental health are also easily circumvented. For instance, TikTok recently introduced a default 60-minute time limit for users under 18. But once the limit is reached, minors can simply enter a passcode to keep watching.

In May, U.S. Surgeon General Dr. Vivek Murthy called on tech companies, parents and caregivers to take "immediate action to protect kids now" from the harms of social media.

The lawsuit seeks a variety of remedies, including substantial civil penalties.

Meta said it has sought to keep young people safe online.

"We're disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path," the company said in a statement.