Massachusetts, Rhode Island, and New Hampshire are among dozens of states suing Meta over allegations that its popular Instagram app violated consumer protection laws or harmed the public in its marketing efforts towards kids.
The investigation of the company formerly known as Facebook began in November of 2021, and the lawsuits follow a long period of discussion and concern about how social media is affecting children and teens.
Here are some key details to know about the investigation.
How does social media affect kids?
Over a decade ago, studies started to uncover links between some adolescents' use of social media and mental health problems, including a phenomenon dubbed "Facebook depression."
The studies also found benefits to teens' use of social media, such as fostering inclusion for LGBTQ kids. In 2016, the American Academy of Pediatrics issued guidelines recommending parents impose limits on the use of social media, without completely banning the apps.
What prompted the Meta investigation?
The multistate investigation was sparked by the so-called Facebook Files, a massive leak of internal company documents in 2021 to the Wall Street Journal by whistleblower Frances Haugen.
The leak revealed that some of Instagram’s internal research linked excessive use of the service by teenage girls with increased risk of depression, eating disorders, and suicide.
“Teens blame Instagram for increases in the rate of anxiety and depression,” a 2019 internal presentation slide stated, according to the Wall Street Journal. “This reaction was unprompted and consistent across all groups.”
The internal reports also appeared to be at odds with public statements by Meta chief executive Mark Zuckerberg. "The research that we've seen is that using social apps to connect with other people can have positive mental-health benefits," Zuckerberg told House lawmakers in March 2021, at a hearing on social media.
What has Meta said about the issue?
The company has pushed back against the reporting from the leak, arguing that its research had been taken out of context. Instagram mostly made teen girls feel better about themselves, according to the company’s surveys, Meta said.
Despite the pushback, in July 2021, Meta announced that Instagram would add protections for teenagers, including prompting them to keep accounts private by default and imposing restrictions on the kind of advertising they were shown.
On Tuesday, Meta said it had taken numerous steps to support teen Instagram users, including adding parental supervision tools and automatic reminders to take a break from the app.
“We’ll continue working with parents, experts and many others who are invested in this important issue to develop more features like these,” the company said in a statement.
What might the investigation cover?
In November 2021, then-Massachusetts Attorney General Maura Healey announced a multistate investigation of Instagram's behavior towards children and young adults.
“Facebook, now Meta, has failed to protect young people on its platforms and instead chose to ignore or, in some cases, double down on known manipulations that pose a real threat to physical and mental health – exploiting children in the interest of profit,” Healey said at the time.
The probe planned to focus on techniques used by Instagram to increase the amount of time kids spent in the app, Healey said.
The whistleblower leak also spawned investigations into the company’s content enforcement practices, policing of misinformation, and other matters, Meta disclosed in a 2021 securities filing.
The Massachusetts attorney general’s office had been critical of Instagram’s efforts to engage with kids even before the whistleblower leak.
In March 2021, Meta said it was planning to release a version of Instagram specially tailored for children under the age of 13, who are barred from the main Instagram app. In May, Healey and 43 other attorneys general from US states and territories wrote a letter to Mark Zuckerberg asking him to cancel the kids' version.
“Use of social media can be detrimental to the health and well-being of children, who are not equipped to navigate the challenges of having a social media account,” the AGs said in the letter.
Meta ultimately abandoned the plan to release the under-13 app.