TECH LAB

Here’s why Maura Healey is taking on TikTok

Prompted by media reports that TikTok may cultivate harmful or life-threatening behaviors in teenagers, Massachusetts Attorney General Maura Healey has begun efforts to investigate the pervasive social media app. Steven Senne/Associated Press

The leaders of Chinese social-video platform TikTok probably aren’t losing too much sleep over the investigation launched by Massachusetts Attorney General Maura Healey and her counterparts in seven other states. After all, the company survived a near-death experience in 2020, when President Donald Trump attempted to ban TikTok from the US altogether.

But the Massachusetts probe and a separate investigation begun two weeks ago by Texas officials could force some major changes in how the hugely popular media company does business. And they have major implications for other social media titans as well, especially the global giant Meta Platforms, better known as Facebook.

How do these companies decide which videos and news stories are shown to millions of teenage users, and which are hidden away? And do these decisions make it too easy for bad online actors to spread misleading and even dangerous images and messages? It’s hard to answer these questions without knowing a lot more about the software algorithms that make these decisions. That’s the kind of information that the state attorneys general are demanding from TikTok.

For instance, Texas Attorney General Ken Paxton wrote to TikTok demanding to know what it’s doing to screen out videos that encourage prostitution, sex trafficking, and the sexual exploitation of children. The demand includes “any written policies, procedures, employee handbooks, training and awareness materials” governing how the company’s decisions are made.

A spokesperson for Healey’s office didn’t provide as much detail about what she’s looking for. But the spokesperson said the investigation was inspired by media reports of possible harms to teenagers caused by TikTok, such as promoting unhealthy eating habits, encouraging users to engage in risky behavior, and, of course, enticing teens to spend too much time watching TikTok.

In any case, Healey plans to see if TikTok’s business practices violate Chapter 93A, Massachusetts’ consumer protection law, which forbids unfair or deceptive trade practices. This law might not cover possible psychological harm to children, but it could open the door to other charges if there is evidence that the company tolerates illegal activity such as sex trafficking.

TikTok’s dealings with young users have gotten the company in trouble before. In 2019, it had to pay a $5.7 million fine to settle a Federal Trade Commission complaint that the company illegally collected sensitive information about users under age 13. Such data collection is forbidden under the federal Children’s Online Privacy Protection Act.

Since then, TikTok has added features aimed at giving parents greater control. They can link their personal TikTok accounts to those of their children, then set limits on how many hours per day a child can use the service. There’s a “restricted mode” that lets parents block videos that TikTok considers inappropriate for younger viewers. And parents can set limits on who can send TikTok messages to their children.

There is research suggesting that heavy TikTok viewing can have negative effects on teenagers. A recent Chinese research paper found that avid high school TikTok users tested higher for depression and anxiety, while posting lower scores on memory tests. And a study from the Baylor College of Medicine in Texas suggested that teenage girls who watch TikTok videos of people with an uncontrollable twitching disorder are more likely to develop the same disorder themselves.

Mered Parnes, a lead author of the study, said it’s long been known that people can develop a type of functional neurologic disorder by spending time with others who have it. Parnes said his research indicates that the disorder can be spread merely by watching videos of those who already have it.

But Stetson University psychologist Christopher Ferguson is skeptical about claims that social media can harm teenagers’ mental health.

Ferguson is no fan of social media companies. (“I think they’re dark hellholes of despair,” he said.) But when he and a team of colleagues reviewed 33 research papers on the Internet’s impact on mental health, they found almost no clear evidence of harm. Ferguson believes that the latest investigations are part of an ongoing “moral panic” about social media, comparable to unsupported claims that playing violent video games can make people more violent in real life.

Healey’s probe may find no evidence that TikTok has broken the law, or that its videos are bad for teenagers. But if she forces the company to come clean about its offerings for teens, it’ll be worth the effort.



Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him on Twitter @GlobeTechLab.