Editorial

No more snuff videos on Facebook

A demo booth at Facebook's annual F8 developer conference, April 18, 2017, in San Jose, Calif. (AP Photo/Noah Berger)

On Easter Sunday, Facebook users watched the last harrowing moments of Robert Godwin Sr.’s life. This was not Godwin’s decision, but that of Steve Stephens, who, police say, shot Godwin to death on a Cleveland street before posting a video of the crime to the social media site.

That video remained on Facebook for two hours before it was finally removed. Godwin’s murder was not, as some early reports claimed, shown on Facebook Live, but that should not deflect legitimate scrutiny of the social media giant’s efforts to regulate violent or objectionable content.

In a statement Monday, Facebook called Godwin’s murder “a horrific crime” and said it does not “allow this kind of content on Facebook.”


Lately, the opposite has appeared to be true. In January, two men and two women were arrested after broadcasting themselves torturing a young man with special needs. Last month, five or six assailants sexually assaulted a teenage girl as at least 40 people watched the crime on Facebook Live. (No one called the police.) Last October, a man who murdered two relatives and injured two police officers used the popular service to share parts of his crime spree.


Stephens streamed two videos on Facebook Live in which he boasted of committing other, unconfirmed murders. He killed himself Tuesday when police attempted to pull his car over in Pennsylvania.

In a Facebook post, Justin Osofsky, the company’s vice president of global operations, said, “We know we need to do better.”

Facebook prohibits content that encourages or celebrates violence, yet the company is less than rigorous in monitoring what gets posted on its site. At present, Facebook relies on its 1.86 billion monthly active users to police the site’s content. That means users are the first line of defense; when a video is flagged as offensive, Facebook’s team then reviews it to decide whether it should be removed. By that point, countless people have probably already watched and saved it. Facebook is reviewing how users flag videos, Osofsky says.

Facebook founder Mark Zuckerberg has often rejected calls to use algorithms to catch and remove offensive content. He wants users to feel unhindered in posting photos and videos, and filters would disrupt what he views as an egalitarian exchange of ideas and images.


Zuckerberg must reconsider. Yes, there are millions of innocuous videos posted and streamed daily, and Facebook is not responsible for crimes its users commit. Still, like other social media sites, but with the greatest reach, Facebook gives criminals an instant platform to craft a personal folklore to explain motives, show off weapons, or say goodbye to family and friends. Anonymity becomes notoriety with a real-time manifesto packaged for public consumption.

Facebook may have removed the video of Godwin’s killing, but it is still easily found elsewhere, especially on TV. For the sake of his family, who endure fresh horror each time it is replayed, Facebook can’t simply say it needs to do better. The company must move quickly and decisively toward a result that protects users and prevents a heinous crime from becoming just another viral fascination.