It took the US Supreme Court nearly six hours over two days to hear arguments on a pair of cases with massive consequences for the Internet. That’s far longer than usual, but then, it’s always wise to move slowly when crossing a minefield.
With a single step, the high court could potentially demolish the business models that prop up the giants of social media. That’s what may happen if the justices rule that YouTube and Twitter bear responsibility for acts of terrorism, because they didn’t do enough to prevent murderous fanatics from using their services.
On Tuesday, the court heard from the family of Nohemi Gonzalez, a US citizen who was one of 130 people killed in Paris by the Islamic State terrorist group in 2015. The family blames YouTube for allowing the publication of radical Islamic recruiting videos, and for using software algorithms to deliver a steady supply of these videos to extremists, helping to further radicalize them.
On Wednesday, it took up the case of Mehier Taamneh, who lost a relative in a 2017 terrorist attack in Istanbul. The plaintiff alleges that social networks Twitter, Facebook, and YouTube were aware that terrorists used their services to plan the deed, but didn’t do everything in their power to kick them out.
The plaintiffs’ arguments have a visceral appeal. Why should giant corporations tolerate their users’ criminal behavior? But it’s not so simple.
The Gonzalez case runs headlong into Section 230 of the Communications Decency Act, the 1996 law that shields online companies from responsibility for the messages posted by users. And the Taamneh matter raises the question of whether social media companies can be charged with aiding and abetting terrorism simply because they didn’t expel every bad actor from their networks.
After two days of probing questions, it seems unlikely that the justices will make any radical moves.
“They don’t know what to do” about Section 230, said Jeff Kosseff, associate professor of cybersecurity law at the US Naval Academy, after the first day’s hearing. “I don’t think we came out of the argument with any clear consensus.”
Even Justice Clarence Thomas, who has openly called for reevaluating Section 230, seemed to be having second thoughts. In an exchange with Gonzalez family attorney Eric Schnapper, Thomas argued that YouTube’s recommendation algorithms were a neutral technology that simply suggested that users check out videos that might interest them.
“I don’t understand how a neutral suggestion about something you’ve expressed an interest in is aiding and abetting,” Thomas said.
And on Wednesday, Justice Samuel Alito noted that Twitter and other social media networks, as a matter of policy, try to weed out postings that encourage criminal activity. Alito cast doubt on the idea that the companies could be blamed because they weren’t always successful.
“The failure to do more or even a lot more to enforce those policies,” said Alito, “does not amount to the knowing provision of substantial assistance.”
After Wednesday’s hearing in the Taamneh case, Sarah Sobieraj, a sociology professor at Tufts University, predicted that Twitter and the other defendants have little cause for concern.
“It doesn’t sound as if they’re inclined to rule that Twitter is liable,” said Sobieraj, who’s also a faculty associate at Harvard University’s Berkman Klein Center for Internet & Society.
Chris Marchese, counsel for NetChoice, a trade association that represents the companies being sued in the two cases, seemed confident of victory. “It is clear that the court is very much concerned about unintended consequences,” he said.
For instance, Justice Amy Coney Barrett on Tuesday wondered whether, if Section 230 protections were revoked, a casual Twitter user could be held liable for aiding and abetting terrorism merely by retweeting an Islamic State video. Marchese predicted that the tech titans will come out on top in both cases.
However, James Grimmelmann, professor of law at Cornell Law School, suggested that the court might come up with a surprise in the Gonzalez case and rule that Section 230 does not shield YouTube from lawsuits. But Grimmelmann said the plaintiffs would still stand no chance of winning an anti-terrorism suit against the company, because merely distributing the videos is not an act of terrorism.
“They could get past Section 230,” said Grimmelmann, “but on such a narrow basis that they’d never win.”
The problem, said Grimmelmann, is that such a ruling could open the floodgates for all manner of lawsuits against social media alleging all sorts of harms. He predicted that if the court takes this option, it will write that any such suits should be viewed skeptically by lower courts. But he admitted there’s no guarantee that lower courts will heed this advice, and some might rule against the tech companies.
Grimmelmann also said it might make sense for Congress, which enacted Section 230 nearly three decades ago, to come up with its own modifications to the law. But he held out little hope for such an outcome, given today’s bitter partisan divisions.
“Congress today is more focused on the issue,” said Grimmelmann, and yet it’s “less capable of addressing it well.”
Hiawatha Bray can be reached at email@example.com. Follow him on Twitter @GlobeTechLab.