
YouTube may be teaching someone to spy on you

Heather Hopp-Bruce/Shutterstock/Globe staff illustration

A young woman lying in bed leans into the camera shot, her face half cut off by the screen. She’s on her laptop watching videos while a shirtless man in glasses — presumably her boyfriend — sleeps next to her. They seem unaware that people are watching.

So does the French woman in the denim jacket who types at her computer, and the woman in Belgium surfing the Web in her underwear at night, her face and figure illuminated by the blue light of her screen. Then there are the three women chatting in the kitchen and the woman biting her nails in the United Kingdom.

Each person described here is a “webcam victim” — also known as a “slave” — on a YouTube channel that boasts more than 100,000 page views and 250 subscribers. Some of the videos look like intentional pornography, but a hauntingly large proportion don’t. Many appear to be people going about their days without any idea that, at any minute, hundreds of strangers may be peering in.


Even scarier? Viewers don’t have to satiate themselves with what’s merely searchable. In fact, YouTube goes so far as to automatically suggest tutorials that teach novices how to deploy such attacks — called remote access trojans, or RATs — on victims of their own, random or not. Garnering thousands of hits, some of these instructional videos are marked for “educational purposes only,” while others hint at how to get away with spying on unconsenting victims.

“These are the tools of modern warfare, and it’s troubling to see a platform like YouTube, where millions of people can go and teach themselves how to do this,” says Adam Benson, the deputy executive director of an Internet safety group called the Digital Citizens Alliance.

Benson recalls schmoozing at a party in a suite of Las Vegas’s Mandalay Bay hotel in 2014, where techies and law enforcement gathered for Black Hat, an annual security conference. There, he ran into three grad students who told him where to find popular RATs like “darkcomet” and “poison ivy.” To them, it was common knowledge.


“They said, ‘Well, if you want to get a hold of them, go to YouTube. YouTube’s one of the places where you can find them right away,’” Benson recalls.

Of course, YouTube is a community of mostly innocent how-tos — lessons on everything from poaching an egg to house-training a goat. But there’s also a strain of videos that teaches users basic, unsophisticated hacking. RAT victims are sometimes touted as trophies that legitimize and distinguish hackers from the masses.

“I’m interested in this girl, and I want to get in her computer and watch her on webcam because I’m a crazy, criminal nut, right? I can learn that,” says Hemanshu Nigam.

Nigam used to work for the Department of Justice, where he specialized in computer crime, but now he runs a cybersecurity agency called SSP Blue, which consults with government bodies and businesses worldwide. He says that crime on YouTube hardly limits itself to RATs. Videos also show how to steal batches of credit card information and use it to purchase goods — how to produce counterfeit documents, synthesize drugs, and solicit prostitutes without being noticed by police. “There’s no question that YouTube kind of owns the market in the how-to space,” Nigam says.

Some of its videos speak in insider slang, such as “FFN” — code for Social Security numbers — or “fullz,” meaning the “full” identification of a victim: a street address, name, date of birth, and Social Security number. Fullz data packages in the United States sold for as little as $15 apiece in 2014, according to a recent report by Dell, and as many as 47 percent of American adults had their personal information stolen by hackers in a single year, according to a 2014 report by the Ponemon Institute, a security research firm based in Michigan.


Yet even though YouTube can serve as a university of crime for aspiring hackers, it’s not the only place they go to learn. Online forums are also commonly used, since users are able to help one another work through mistakes. But for wannabe hackers, the most obvious choice may be what they already know.

“YouTube is named no more than anything else [by my colleagues] because it’s almost cliche,” says Nick Selby, a Dallas-area cybercrime detective. “It’s just something you do.”

One of the very few well-known slaving cases targeted Cassidy Wolf, the 2013 Miss Teen USA winner who was photographed for nearly a year by Jared James Abrahams, then a 19-year-old computer science student from California who took control of her computer’s webcam. Abrahams showed the photos he took to his victims, then threatened to release the images if his victims didn’t perform explicit sexual acts over Skype. He posted nude images of those who refused on their own hacked Instagram accounts, later stating that he had slaved as many as 40 computers. He was sentenced to 18 months in federal prison.

Abrahams learned how to slave computers on hacking forums, according to the FBI, not YouTube. “YouTube is a good, basic kind of learning ground [for hacking], but it’s not going to give you the degree of knowledge to become a very skilled hacker,” says Tom Holt, a professor of criminal justice at Michigan State University. “But it’s going to get you started on the road, and getting that basic foundational understanding is really the most important thing.”


“It lowers the barrier of who can get into cybercrime,” Selby agrees. “You rarely see a YouTube video about how to successfully rob a bank with a gun, because that’s hard. It’s not hard to conduct a ready-made exploit on the computer.”

Even though the tier of cybercrime taught on YouTube is full of “script kiddies” — newbies who download and deploy others’ automated attacks — its widespread harm can go largely unpunished. Part of the reason is jurisdictional confusion, since so many perpetrators come from far-flung places such as Russia, Turkey, and China. But law enforcement officers are also quick to point the finger at prosecutors and at their own ranks, who often don’t receive the training necessary to pursue these cases.

“Cybercrime goes mainly uninvestigated in this country, and that really comes down to training. I’ve met prosecutors who believe that an IP address is an ‘Internet phone number,’ ” Selby says, with just a slight chuckle. “If you look at the past 10 years, the only cybercrime cases that ever get to trial are those that hit CNN or embarrass the FBI.”

Georgia Weidman, founder of Bulb Security LLC, is part of a growing community of tech-savvy people who use their skills to thwart crimes and security breaches. She calls herself a “white hat hacker,” as many of them do.

“It’s a ‘I’m not a bad guy, but I play one on TV’ sort of situation,” she jokes. Weidman spends plenty of time advising law enforcement agencies, which, she claims, are generally behind in recognizing cybercrime and how easily people can fall victim to it.


“I just don’t think the understanding is there. It’s a fairly new thing,” she says. “You can’t brush for fingerprints here.”

When crime on the Internet does become a priority for smaller agencies, it usually involves child pornography, which is certainly important but not all-encompassing of America’s online problem. “Child porn [is] a crime that we can wrap our heads around, whereas a lot of these computer crimes we don’t really understand,” Weidman says. “Law enforcement does want to deal with these issues, but they’re just not sure how.”

Google has owned YouTube since 2006, and in reporting this story, the Globe reached out to Google’s public relations team for comment several times. Eventually the company did provide a two-sentence statement.

“YouTube is a powerful platform for sharing educational videos, and we have clear policies that outline what content is acceptable to post. . . . We allow videos posted with an educational intent to remain on YouTube,” it said.

YouTube prohibits videos that instruct webcam hacking “with malicious intent,” according to Google, but it relies on users to flag the content.

“You could go a really long time with something malicious before it gets caught,” Weidman says. “But . . . that’s what we created when we made it such that everyone can make a video and make it available to anyone.” To be fair, alongside YouTube’s tutorials on how to create RATs are videos on how to detect and eliminate them.

But Google’s public relations team also pointed to information implying that Benson’s group, the Digital Citizens Alliance, has a conflict of interest, including e-mails released by The New York Times in 2014 that suggest the former Mississippi attorney general, Michael Moore, was hired by the DCA to persuade the state’s current attorney general, Jim Hood, to investigate Google. The e-mails show the depth of Moore and Hood’s friendship, but also that the Digital Citizens Alliance was financially backed by members of the Motion Picture Association of America, which has lobbied against Google in the past. The DCA’s executive director, Tom Galvin, invoiced a Sony Pictures executive for more than $300,000 in 2014, according to a WikiLeaks release.

“Google has used our research to identify dozens of dangerous videos and pull them down,” Benson said in an e-mail.

It is true that, as early as 2013, the DCA began calling out YouTube’s practices in a number of reports, arguing that the site benefits itself and popular channels by making ad revenue on crime-related tutorials with high page view counts. It’s unclear how much a video could make, but one YouTube channel featuring webcam hacking tutorials could yield up to $1,500 a year, according to SocialBlade, a popular revenue estimator. On the other hand, SocialBlade’s low-end estimate for the same channel is just $8 a month — and there’s no way to say for certain that the channel is monetized in the first place.

Screenshots from the DCA’s 2015 report on RATs include a video showing a “slave” alongside ads from American Express, a Batman video game, and encrypted USB drives. Thirty-eight percent of the RAT tutorials examined had similar advertisements from “well-known car companies, cosmetics, and even tickets to New York Yankees’ baseball games,” the report says.

“It’s certainly more than a little side money. People could make a good chunk of change,” Benson says. “There’s no reason why YouTube should be making money off of bad actors.”

Benson credits YouTube for eliminating child porn and videos related to human trafficking but argues that it should be assigning a human team to other videos — especially ones that show RAT victims. “If there’s anybody who can step up and solve problems like this, it would seem to be Google and YouTube. When they make things a cause and they make it an issue, they’re very good about it,” he says.

Many of the “slaved” victim videos shown in the report have since been removed, yet some RAT tutorial videos are still up and running. One 2012 tutorial lists an e-mail address in its description and says that it’s “selling slave/victims” while banner ads appear on the screen. Another 2012 video, with more than 11,000 views, shows a group of hackers harassing a teenage victim until he agrees to pay them $50 through PayPal or an online game.

But even Benson is sensitive to the value of learning from some tutorials. “It [should be] a case-by-case basis, but they should take it upon themselves to really take on a process where criminal activity is not encouraged on their platforms,” he says.

The tutorials can also be useful to law enforcement officers like Selby or white hat hackers such as Weidman, who make a living by staying on criminals’ heels.

“There’s a legitimate need to understand the fundamental weakness of the things that we bet our personal security on every day,” Selby says. “We’d lose an important ability to be watchdogs — plus, there’s that First Amendment thing.”

Kelly Kasulis is a journalist living in Boston and the deputy digital editor of The GroundTruth Project. Follow her on Twitter @KasulisK.

Correction: A previous version of this article failed to specify that it is members of the Motion Picture Association of America that financially back the Digital Citizens Alliance, rather than the organization itself.