When Kate’s 13-year-old son took up Minecraft and Fortnite, she did not worry. He played in a room where she could keep an eye on him.
But about six weeks later, Kate saw something appalling pop up on the screen: a video of bestiality involving a young boy. Horrified, she scrolled through her son’s account on Discord, a platform where gamers can chat while playing. The conversations were filled with graphic language and imagery of sexual acts posted by others, she said.
Her son cried when she questioned him last month.
“I think it’s a huge weight off them for somebody to step in and say, ‘Actually this is child abuse, and you’re being abused, and you’re a victim here,’” said Kate, who asked not to be identified by her full name to protect her family’s privacy.
Sexual predators have found an easy access point into the lives of young people: They are meeting them online through multiplayer video games and chat apps.
Criminals strike up a conversation and gradually build trust. Often they pose as children. Their goal, typically, is to dupe children into sharing sexually explicit photos and videos of themselves — which they use as blackmail for more imagery.
Reports of abuse are emerging with unprecedented frequency around the country, with some perpetrators grooming hundreds and even thousands of victims, according to a review of prosecutions, court records, law enforcement reports, and academic studies.
The New York Times reported earlier this year that the tech industry had made only tepid efforts to combat an explosion of child sexual abuse imagery on the Internet. The Times has also found that the troubled response extends to the online gaming and chat worlds.
There are tools to detect previously identified abuse content, but scanning for new images is more difficult. While a handful of products have detection systems in place, there is little incentive under the law to tackle the problem, as companies are largely not held responsible for illegal content posted on their websites.
Six years ago, a little more than 50 reports of the crimes, known as “sextortion,” were referred to the federally designated clearinghouse in suburban Washington that tracks online child sexual abuse. Last year, the center received more than 1,500. And authorities believe that the vast majority of cases are never reported.
There has been some success in catching perpetrators. In May, a California man was sentenced to 14 years in prison for coercing an 11-year-old girl “into producing child pornography” after meeting her through the online game Clash of Clans. An Illinois man received a 15-year sentence in 2017 after threatening to rape two boys in Massachusetts whom he had met over Xbox Live.
“The first threat is, ‘If you don’t do it, I’m going to post on social media, and by the way, I’ve got a list of your family members, and I’m going to send it all to them,’” said Matt Wright, a special agent with the Department of Homeland Security. “If they don’t send another picture, they’ll say: ‘Here’s your address — I know where you live. I’m going to come kill your family.’”
The trauma can be overwhelming for the young victims. An FBI study reviewing a sample of sextortion cases found that more than one-quarter of them led to suicide or attempted suicide.
It makes sense that many predators would gravitate to the gaming world. Almost every teenage boy in America — 97 percent — plays video games, while about 83 percent of girls do, according to the Pew Research Center.
In many states, gaming counts as a team sport. Colleges offer scholarships to elite gamers, and cities are racing to establish professional teams. The industry is enormously profitable, generating over $43 billion in revenue last year in the United States.
There are many ways for gamers to meet online. They can use chat features on consoles like Xbox and services like Steam or connect on sites like Discord and Twitch. The games have become extremely social, and developing relationships with strangers is normal.
“These virtual spaces are essentially hunting grounds,” said Mary Anne Franks, a professor at the University of Miami School of Law and president of the Cyber Civil Rights Initiative, a nonprofit group dedicated to combating online abuse.
This fall, the FBI rolled out an awareness campaign in middle and high schools to encourage children to seek help when caught in an exploitive sexual situation. “Even if you accepted money or a game credit or something else, you are not the one who is in trouble,” material from the campaign explains.
New Jersey police departments were flooded with phone calls from parents and teachers alarmed about pedophiles lurking on game sites and in chat rooms. So last year, law enforcement officials from across the state began chatting online under assumed identities, posing as children.
In less than a week, they arrested 24 people.
Authorities did it again, this time in Bergen County. They made 17 arrests. And they did it once more, in Somerset County, arresting 19. One defendant was sentenced to prison, while the other cases are still being prosecuted.
After the stings, officials hoped to uncover a pattern that could help in future investigations. But they found none; those arrested came from all walks of life.
When announcing the arrests, authorities highlighted Fortnite, Minecraft, and Roblox as platforms where offenders began conversations before moving to chat apps. Nearly all those arrested had made arrangements to meet in person.
In a separate case in Ohio, the digital abuse of a young boy led to his physical abuse. The offender, Jason Gmoser, would encourage boys to show their genitals while on PlayStation, according to court records. Gmoser told detectives in 2014 that he spent years interacting with an 8-year-old, traveling to Missouri to visit the boy and his family, showering them with gifts and paying some of their bills. On at least one trip, he said, he sexually abused the child. He is now serving a life sentence in a separate case, for running a child sexual abuse site on the dark web.
There are a few seemingly simple protections against online predators, but logistics, gaming culture and financial concerns present obstacles.
Microsoft, which owns Xbox and Minecraft, said it planned to release free software early next year that could recognize some forms of grooming and sextortion.
Other companies have taken a more hands-off approach, citing privacy concerns.
Some of the biggest gaming companies provided few, if any, details about their practices. Epic Games, creator of Fortnite, which has roughly 250 million users, did not respond to multiple messages seeking comment.