UN human rights investigators increase tracking of Internet hate speech
HONG KONG — UN human rights investigators are increasingly combing cyberspace to track how websites can stoke hatred and possible violence, part of an expanding forensic effort to understand the digital world’s role in modern conflict.
The influence of online anger and propaganda has been studied for nearly a generation and is now part of routine casework by security forces around the world.
But the United Nations — whose reports are often crucial for possible international prosecutions — is now trying to catch up after years of relying mostly on firsthand reports from the field.
Rights investigators and monitors have used information from open-source Internet sites — including videos, satellite imagery, and inflammatory posts — to strengthen traditional fact-finding in flash points such as the Himalayan region of Kashmir, Gaza, and Myanmar’s Rakhine state.
The world body also has dispatched a veteran human rights official to Silicon Valley to build relationships with technology companies.
Félim McMahon, who directs the technology and human rights program at the University of California at Berkeley law school’s Human Rights Center, described the United Nations’ pace of reform as ‘‘turning several battleships tied together.’’
The United Nations’ human rights office, however, has now realized, ‘‘We need to have our small teams, not just in the field, but on the Internet,’’ McMahon said.
‘‘This is essentially putting the UN at the cutting edge of this investigative opportunity. In terms of arriving at the scene of a crime, they are going to be the first ones there,’’ he added.
A telling moment came in March.
Members of a UN fact-finding mission formed to look into years of alleged abuses by Myanmar’s military took part in a teleconference from Geneva with a group of officials from Facebook, which is hugely popular in the Southeast Asian country.
The topic that day was the role Facebook played as the platform for hateful rhetoric by Myanmar’s military commanders against the minority Muslim Rohingya community. A UN report in August concluded that Myanmar’s campaign of violence and forced expulsions against the Rohingya bore the hallmarks of genocide and called for military generals to be investigated and prosecuted.
‘‘Initially, of course, [Facebook representatives] were very defensive and reluctant to recognize that Facebook was, in fact, if not the instigator, then the facilitator of hate speech in Myanmar,’’ Marzuki Darusman, head of the fact-finding mission, said in an interview.
Darusman later told the United Nations in New York that ‘‘genocidal intent’’ was apparent in the Facebook posts by military officers.
Scott Campbell, a longtime official at the Office of the UN High Commissioner for Human Rights, now spends much of his time on the tech campuses of Silicon Valley.
The former UN high commissioner for human rights, Zeid Ra’ad al-Hussein, assigned Campbell to the role and ramped up efforts at dialogue with tech companies. Zeid feared that the United Nations risked becoming irrelevant if it didn’t make inroads with global tech giants such as Facebook and Microsoft.
The Internet is a ‘‘fantastically powerful’’ tool for ‘‘empowering people and enhancing their human rights on the one hand,’’ said Campbell, who spent most of his UN career in central Africa.
‘‘On the other hand, the Internet has been used as a medium through which hate speech can be propagated with previously unthinkable speed and scale. And in reference to Myanmar, sometimes with absolutely catastrophic effects,’’ he added.
Campbell is working to build relationships with technology firms, as well as universities and organizations that can assist the United Nations with investigating human rights issues. He also recently brought UN experts on tours of technology companies to explain their work.
The United Nations’ investigation into the Myanmar military was a test case.
It was bolstered greatly by open-source data pulled from social media platforms, as well as satellite imagery. Posts on Facebook by Myanmar military chiefs and others are cited extensively in the UN report on the Rohingya crisis.
The fact-finding mission obtained two caches of Facebook posts from an independent researcher contacted in April. These included around 200 posts that had been deleted but that the researcher had previously saved, according to a person familiar with the group’s work who was not authorized to speak to the media. Investigators also received multiple submissions of data from McMahon, the Berkeley researcher, and his students.
This information was particularly useful because UN investigators were barred from entering Myanmar. Officials there have dismissed the report as one-sided.
UN investigators approach Internet-based evidence with ‘‘cautious optimism,’’ said Yvonne McDermott Rees, an associate professor of legal studies at Swansea University in Wales, who researches the use of open-source material by fact-finding missions.
‘‘The investigators are aware that they face massive access issues and that open-source evidence provides a way to overcome some of those issues, but there is still a good degree of caution.’’
Among the red flags is the risk of being duped by fake or doctored postings.
‘‘Every type of evidence that has ever been used has its pros and cons,’’ she said.
In August, Facebook removed the accounts of a number of top Myanmar military officials. The company also took down pages linked to ‘‘coordinated inauthentic behavior’’ by the military, and did so again in October. The content of the pages has been preserved, the company said.
Facebook’s August decision came after it learned the company would be singled out in the UN report.
Facebook has said repeatedly that it was too slow to act on problems in Myanmar.
‘‘We should have taken action sooner and in a more comprehensive way,’’ said a statement from a company spokeswoman.
Earlier this month, Facebook took another step in developing possible partnerships with regulators and others. The company agreed to a six-month arrangement with French officials seeking to study Facebook’s efforts to combat hate speech.
‘‘As Mark Zuckerberg has said, with the Internet growing in importance in people’s lives we believe that there will be need for regulation,’’ Nick Clegg, Facebook’s vice president for global affairs, said after the French deal was announced.
‘‘The best way to ensure that any regulation is smart and works for people is by governments, regulators, and businesses working together to learn from each other and explore ideas,’’ he added.