When Sarah Cooper was a teenager, Facebook was her best friend.
Dashing off messages to lots of people without actually having to see them was an ideal way for an insecure girl to feel popular. She had multiple Facebook accounts besides the one her protective mother logged on to from time to time, to keep tabs on her. By the time she was 16, she already had more than 1,000 Facebook friends. “It was like a status thing,” says Sarah, now 23. “I would literally go through Facebook and just add random people.”
So when she was 15 and got a Facebook request from a male stranger, she breezily accepted it, guessing he was a friend of a friend or that he’d just found her randomly. She didn’t know his actual name — like many of her online connections, his username wasn’t actually a name. And his profile picture wasn’t actually a picture: He used images of cars or dogs or cartoon characters. He seemed to be around her age, though, judging from everything they had in common.
They connected over music, like pop artists Kesha and Nicki Minaj. They chatted about friends, what they ate, how she loved to read, especially the Twilight books. She told him about her frustrations with her adoptive parents, who tried to know everything going on in her life, which only made her more secretive. He told her he could relate. Things were “rocky” at the time with her mother, Sarah says, but so easy with him that she’d sometimes skip her karate class or family meals to keep chatting.
“I’d describe it as, he adored me,” Sarah recalls. “He put me on a pedestal.” She was so flattered by his attention that when he asked her to send him sexually explicit photos of herself, she complied. They chatted on and off for about two years, mostly on Facebook’s Messenger app, but after she turned 18 in 2015, he asked to meet her in person at an intersection near her apartment, where she’d moved after graduating from high school. As soon as she saw him, she knew something didn’t add up: He looked a lot older than he’d let on — closer to 40 than 18.
This man she now calls “J” (she never learned his real name) was no friend, Sarah soon discovered. He was a sexual predator — an enterprising one — grooming Sarah by patiently playing a long game on Facebook until she was no longer a minor and he’d gained her trust. She later learned he was doing the same thing with other girls.
Voice trembling, Sarah alludes to the acts of violence and degradation over 10 days beginning with their first in-person encounter. “You can’t realize how disgusting and vile people are until they are doing something awful to you,” she says.
She recalls that J drove her to a house in a Boston neighborhood where some of his friends were waiting. They forced her to drink shots of alcohol and ingest a few lines of cocaine. As a rebellious teenager prone to risk-taking, neither substance was new to Sarah, but what came next was: She was made to have sex with J and another woman while someone else took a video on a cellphone. They told her it was an “audition.” What they didn’t say was that she was about to be abducted and forced into sex slavery.
The next day J drove her to a motel in New York State and locked her into a dingy room guarded day and night by armed men. There were other girls in the motel too, all there to service a steady procession of paying customers, with freebies for J’s friends. They took away her phone for a few days and gave her a different one for keeping in touch with customers. When she begged to leave, they instructed her to use alcohol and drugs, “literally staying in the room until I was done.” Sarah hoped that maybe, if she fell asleep, she wouldn’t wake up.
With each new man she’d think: Will this be the person who’ll help me get out? One time, a man who told her he was a paramedic asked her why she was doing this. “It felt like a punch to the gut, literally,” she says. “I was just overwhelmed by guilt and shame.”
After a week and a half of being held hostage, she seized on a brief moment when the guards were inattentive and called a male friend who lived near Boston, pleading to be rescued. When he drove up around 4 a.m., “I literally dove into his car,” Sarah says. She and the friend say that when J’s guards tried to box them in with several cars, they backed over two median strips in the motel parking lot to escape, and then rocketed down the road, back to Boston.
Associate Deputy Attorney General Stacie Harris has the unenviable title of national coordinator for child exploitation and human trafficking at the US Department of Justice.
She’s often called on to give talks about the digital underworld of child abuse and exploitation. But the scale and nature of the problem are difficult to convey, she says, and not just because people can’t stomach hearing about it. It’s because it defies all concepts of normal human behavior.
Contrary to what many people think, here is what the problem is not: The domain of a small fringe group of dysfunctional loners browsing illicit images in the dark of night. “It is prevalent and common,” Harris says. “And things we would not let happen in the physical world we allow to happen on the Internet.”
A group of survivors from across North America calling themselves the Phoenix 11 have described in a statement some of the abuse they endured as children, which was recorded and distributed online: “sexual torture, child rape, erotic photo shoots, pedophile sleepovers, elementary school sex shows” — the list goes on.
In the real world, sexual exploitation of children online is a highly evolved and diversified underground industry that even victimizes infants. It offers “unfettered access to children and no guidelines,” says Signy Arnason of the Canadian Centre for Child Protection, based in Winnipeg, Manitoba. “Find me another environment in the offline world where we would permit that.”
The National Center for Missing & Exploited Children — a nonprofit clearinghouse and centralized reporting center headquartered in Virginia — has compiled a glossary of exploitation terms to explain these unimaginable crimes. “Online enticement” occurs on everyday Internet platforms such as those for social media, online gaming, or e-mail. It often involves engaging in sexual conversation or role-playing; enticing a child to share sexually explicit images; selling or trading those images to others; and even meeting in person for sexual purposes. With “sextortion,” a predator coerces, deceives, or blackmails a child into engaging in sex, sending sexual images, or sending money.
And there are other Internet-enabled horrors, including live-streamed abuse where the viewers direct the action, and everything in between, including what happened to Sarah Cooper. She still doesn’t know whether the video shot of her “audition” is circulating online. Was it meant to be used to blackmail her? Or to sell online, for profit? Many other victims like Sarah “live with the debilitating fear the photos and videos memorializing their sexual abuse as a child and shared on the internet will forever remain online for anyone to see,” according to “Captured on Film,” a 2019 white paper produced by the National Center for Missing & Exploited Children.
Child trafficking and pornography aren’t new. But in pre-Internet times, those who abused children and took photos would either have to develop the film themselves or take it for processing, Harris says. “And someone would spot it.” But now all it takes is a device or two, a bit of technological know-how — and a lot of bored, lonely, and vulnerable kids glued to social media sites.
A cellphone is “a perfect tool for child sex exploitation,” says Harris. “You can take high-res photos and videos and save it or transfer it to the cloud and no one will ever know. And on the dark corners of the Web they find communities of other people like themselves that validate what you’re doing, and you can share ideas and stories.”
As much damage as the COVID-19 pandemic has wrought to people and businesses, it has benefited the exploitation industry. As more kids engage with devices for longer periods of time, so-called cappers — creeps who record nude or sexual audio and video via webcam of children they target on live-streaming platforms or apps — are ready for them.
“Now is the time for cappers to do their part to assist the quarantine efforts,” reads a post captured on a dark Web forum by the Canadian Centre for Child Protection. Since the pandemic began, the National Center for Missing & Exploited Children CyberTipline has seen a disturbing spike in reports.
For a long time, Sarah was quiet about what had happened, not even telling her mother, Lisette Cooper, though Sarah occasionally alluded to it in vague terms. Lisette inferred only that something “not good” had happened to her daughter in New York.
Yet had it not been for Lisette, Sarah’s life might have spun into a debilitating, endless cycle of trauma. Many survivors of online child sexual abuse struggle with multiple problems, including anxiety, nightmares, insomnia, shame, panic attacks, and ongoing fear of their abuser, research shows. Adding another layer of angst is knowing that explicit photos and video of themselves can forever remain online and they may be recognized.
But serendipitously, Sarah’s mother had independently been developing an interest in preventing online child sexual exploitation, oblivious to her own daughter’s victimization under their roof. A pioneer in impact and socially responsible investing, Lisette was running her own wealth management firm, Athena Capital Advisors, ranked among the Top 100 Women-Led Businesses in Massachusetts by The Commonwealth Institute in 2019. She had long worked with female clients who wanted to use their wealth to empower women. This urge escalated “to a roar,” she says, after Hillary Clinton lost the 2016 presidential election, and grew even more with the 2017 women’s marches and the #MeToo movement. “The dam broke,” she says. “From Hollywood to Wall Street to shareholders like my clients, women just had to have courage, step up, and do whatever they could to make sure sexual aggression and oppression [wouldn’t] happen to other women.”
Lisette accepted a position as vice chair of Fiduciary Trust International, and began leading events across the country for women impact investors, bringing in speakers to talk about issues affecting women and children where, as shareholders, they could influence corporate policy. One presenter described an epidemic of children being sexually exploited via the Internet, a problem growing exponentially. “We were horrified,” Lisette says.
She started working with stakeholders and child protection experts to try to force technology companies to take aggressive action on their own platforms to deter exploitation of children. Looming largest: Facebook with its Messenger, WhatsApp, and Instagram platforms, which last year were the source of nearly 16 million reports of child pornography, abuse, or exploitation, according to the National Center for Missing & Exploited Children.
Last year, founder and CEO Mark Zuckerberg announced Facebook’s plans to expand its use of end-to-end encryption for user privacy. Government and law enforcement agencies and child protection groups say end-to-end encryption would also afford privacy to predators, the worst scenario for victims and potential victims.
Take away the self-reporting by Facebook in 2019, and the number of incidents reported by online platforms nose-dives from a total of 16.9 million to about 1 million, including the 150,000 coming from the general public, according to the National Center for Missing & Exploited Children. Facebook will consult with experts, law enforcement, and governments on safety measures, Zuckerberg wrote last spring. “But we face an inherent tradeoff because we will never find all of the potential harm we do today when our security systems can see the messages themselves.” A company spokesperson told the Globe Magazine, “These are horrific crimes. We will absolutely not tolerate any behavior or content that exploits children on our platform.”
With funding from her Wisdom Lotus Foundation and women impact investors, Lisette Cooper hired a firm to organize a news conference representing Facebook shareholders, including herself, who filed a resolution to pressure the company to reconsider its encryption plan. In May as she prepared for the conference, Lisette casually asked Sarah if she’d ever had a sexual experience on Facebook. Lisette expected she might hear about “the usual teeny-bopper encounters.” Instead, the New York story spilled out.
In that instant, the trajectory of both their lives changed. Now, as a team, mother and daughter are speaking out on a global stage to bring awareness to Internet sexual exploitation and violence against children “so people will know how this problem is right under our noses,” Lisette says.
Anguishing as it is, Sarah decided to tell her story as a cautionary tale for other young people. It’s about “taking back my life,” she says. This work has already begun: Sarah is now close to completing an undergraduate degree in psychology and works as a residential counselor for kids in state custody. She sees a therapist. “But it’s not like I just woke up and everything is better,” she says. “I am still struggling with this.”
Sarah’s kept her support circle small, limited to those she can turn to when nightmares or triggering moments overwhelm her. And of course, her beloved dogs: Brodi, a lovable, lumbering Saint Bernard, and her “soul mate” Accallia, a Shetland sheepdog taken from a puppy mill where she was to be sold. “I saw the hurt in her eyes,” Sarah says. “I couldn’t leave her.”
Last year, researchers from Google and the National Center for Missing & Exploited Children teamed up to analyze online child sexual abuse imagery, or CSAI, from 1998 to 2017. The report acknowledged efforts by platform operators Microsoft, Google, Facebook, and Twitter to block attempts to access or distribute these images, but noted: “CSAI has grown exponentially — to nearly 1 million detected events per month — exceeding the capabilities of independent clearinghouses and law enforcement to take action.” The researchers concluded that “online sharing platforms have accelerated the pace of CSAI content creation and distribution to a breaking point.”
Law enforcement agencies worldwide assiduously pursue criminal communities and Web forums that foster such abuse. Still, you can hear frustration in the voices of those working in the trenches when they talk about what they’re up against: parents not educating their kids (or themselves) about the dangers; sparse Internet safety education in schools or in the form of public service announcements; and a failure or reluctance of companies to do more to contain the problem, and of legislators to enact stringent regulation.
“Every time someone on [Capitol] Hill tries to do something, Silicon Valley stymies it,” says Stacie Harris from the Department of Justice.
Brian Neil Levine, director of the Cybersecurity Institute at the University of Massachusetts Amherst, puts the problem in historical context: “It’s been reported that Facebook was responsible for almost 16 million reports of child sexual abuse material last year,” he says. “It does not appear to be a responsible decision to turn off their ability to observe and report child sexual abuse. I wonder if in hindsight we’ll view some tech platforms as the cigarette companies of our time.”
These failures put much of the onus of breaking the cycle of abuse on advocacy organizations such as the National Center for Missing & Exploited Children, the Canadian Centre for Child Protection, and the WePROTECT Global Alliance. Smaller organizations such as Massachusetts-based My Life My Choice, which offers survivor-led programs to end sexual exploitation of children, amplify the voices and advocacy of young people on a local level.
Whether advocates’ efforts can ramp up to the size and effectiveness of a #MeToo movement is an open question. Sarah Cooper says the women behind #MeToo paved the way for her to be able to speak out publicly. “It was just so powerful to see the unity among women, the sense of being seen,” she says. “The movement takes the stigma off of trauma.”
To grow a movement that can effect social change takes time. Cass R. Sunstein, a Harvard law professor and author of How Change Happens, says the #MeToo movement stands on the shoulders of earlier women who spoke out. He points to Catharine A. MacKinnon, author of the landmark 1979 book Sexual Harassment of Working Women, as one of the catalysts. “Other people will then act or speak out just because someone else came first,” Sunstein says. “Another group will do it if two sets of people act or speak, and others need three or four,” and it cascades from there.
Sarah Cooper may be standing on the shoulders of Alicia “Kozak” Kozakiewicz, now 32, who claims the unwanted distinction of being, as her website says, “the first widely reported Internet-related child abduction victim.” In 2002, after being groomed online by a predator, she was taken outside her home in Pittsburgh and chained by the neck to her kidnapper’s basement floor, where he sexually assaulted her and live-streamed the attack. Her highly publicized rescue came after the FBI got a tip from someone who saw the video.
Kozakiewicz went public at 14 because she felt she had no choice. “My story was already out there and people were telling it their own way, and it wasn’t very helpful,” she says. “It was my story. My whole goal was to save one child, one person.” She founded The Alicia Project to promote awareness of child safety issues online, advocate for missing persons, and fight child exploitation and human trafficking. She also works to secure the passage of her namesake Alicia’s Law, which provides state-specific funding to the Internet Crimes Against Children task forces, to help them rescue endangered children. So far Alicia’s Law has been passed in 13 states, though Massachusetts is not among them.
Since Kozakiewicz came forward, other online abuse victims have told their stories, but most have chosen anonymity, fearing retaliation from predators. The Phoenix 11 work with the Canadian Centre for Child Protection to advocate, educate, and fight for change in the technology industry. “We are redefining what it means to be victims who were powerless to stop the relentless onslaught of the technology of abuse,” they say on video, their faces unidentifiable. “We will not be stopped.”
Nor will Sarah Cooper. On May 20, she told her story at the online news conference Lisette had spearheaded. And on June 9, she spoke at a webinar on the European Union’s response to combating child sexual abuse and exploitation, joining the executive director of Europol, the minister of Justice and Security of the Netherlands, and other heavy hitters. (She used her married name during these public speaking events; she’s divorcing and goes by Cooper again.)
Sarah recounted her personal experience and advocated specific approaches to prevention, detection, and support for victims and families. Justice becomes problematic without evidence, the kind end-to-end encryption makes impossible to get, she told her listeners. “We need social media to monitor child sexual images and allow police to prosecute online child predators. This could have made a difference in my story. If someone [had] flagged those grooming messages and photographs, I may not be talking today,” she said.
John F. Clark, president and CEO of the National Center for Missing & Exploited Children, was the next speaker after Sarah. “In many respects, what Sarah just said can summarize practically everything I would have to say about child safety,” he said. “We take very seriously the thousands of Sarahs that we represent at the National Center for Missing & Exploited Children.”
In the 12 years UMass’s Brian Levine has been working to thwart child exploitation, listening to Sarah was the first time he’d seen and heard a victim speak publicly. “I was deeply moved by her story and by her bravery in telling it,” he says. “She’s a true hero, fighting back eloquently and with heart. Her words were simply inspiring.”
Lisette and Phil Cooper adopted Sarah after seeing her featured in a 2008 Wednesday’s Child adoption segment on WBZ-TV. She seemed to fit right into their blended family with five other kids. She joined the school track team. “She was spunky, resourceful, and sparkly,” says Lisette, now divorced from Phil.
But Sarah carried the sorrow and trauma of being shuffled between foster homes, some of which she associates with deeply painful experiences, since age 2. “After she was adopted, she fell apart,” says Lisette. “All of the pain and trauma . . . tumbled out.” Sarah’s parents sent her to a treatment program in Missouri for traumatized foster and adopted children; the program’s therapy dogs helped the kids build relationships and experience love. “The dogs saved me,” Sarah says.
Her teenage years were hard on everyone, and Sarah changed her phone number after she moved out, making it hard for Lisette to reach her. Lisette only heard from her once during the time she was in New York — a quick ghost of a call, with a small voice saying, “I miss my mom.”
Mother and daughter hold hands when they tell this story, their story. Sarah wants to develop teaching material about online sexual abuse, to work with law enforcement, to write a book, to make sure kids are educated to know that predators are tricky, using cute emojis in their texts and acting like your friend. J and his armed guards may still be out there, victimizing other girls. Afraid he’d find her and retaliate, Sarah didn’t report her abduction to the police, and Lisette didn’t find out about it until years later.
There’s a gentleness to Sarah but also a fierce determination that she attributes to her mother’s influence, despite their differences during Sarah’s teen years. “Over the years the essence of Lisette Cooper made me into a strong and powerful woman,” she says. “She’s a strong and confident woman who takes these big issues and attacks them with her soul.”
In his book on change-making, Harvard’s Sunstein wrote, “What was once unsayable is said, and what was once unthinkable is done.” Sarah hopes her face will be but one among a movement of victims who are able to say the unsayable.
“We cannot harm our children anymore,” she says tearfully. “We have to step up, as adults. I don’t want anyone to go through what I went through, ever again.”
Report it: If you believe a child is being exploited online, contact the National Center for Missing & Exploited Children CyberTipline at cybertip.org. Or, call 1-800-843-5678, 24 hours a day, seven days a week.
Resources for victims and families: NCMEC educational materials are at tinyurl.com/NCMEC-tips.
Linda Matchan is a journalist and documentary filmmaker in Boston. She can be reached at firstname.lastname@example.org. Send comments to email@example.com.