This week, the Supreme Court is considering a short but powerful law that could change the internet.
In recent years, Section 230 of the Communications Decency Act has become increasingly controversial because it shields internet companies from liability for user-generated content.
Gonzalez v. Google was argued before the Supreme Court on Tuesday. The family of Nohemi Gonzalez, a victim of the 2015 Islamic State attacks in Paris, sued Google over YouTube's promotion of terrorist content in the lead-up to the attacks.
On Wednesday, the court will hear a parallel case that seeks to hold Twitter liable for another deadly terrorist attack, this time the 2017 Istanbul nightclub shooting that killed Nawras Alassaf.
Both cases argue that tech platforms should be held liable for hosting or promoting Islamic State content ahead of attacks that killed more than 150 people.
The petitioners argued that when YouTube's recommendation algorithm serves content to users, that activity falls outside Section 230's protection.
“We’re focusing on the recommendation function, that they are affirmatively recommending or suggesting ISIS content, and it’s not mere inaction,” Eric Schnapper, the lawyer representing the Gonzalez family, said during Tuesday’s oral arguments.
Carve-outs from Section 230 are controversial but not new. FOSTA, a 2018 law intended to curb sex trafficking, created one such exception and has since been criticized for making sex work more dangerous.
The Supreme Court isn’t the only government body examining Section 230, though Congress has so far largely failed to repeal or limit its protections.
On Tuesday, some justices questioned whether the Supreme Court was the right body to be reshaping the internet law.
“We’re a court, we really don’t know about these things,” Justice Elena Kagan said. “These are not the nine greatest internet experts.”
As Schnapper continued, the justices seemed confused, and both sides kept trying to clarify the argument. Schnapper’s main point was to distinguish between failing to remove dangerous content (a statistical inevitability, given how much material online platforms host) and actively promoting and spreading it:
“Our view is, if the only wrong alleged is the failure to block or remove, that would be protected by 230(c)(1). But — but that’s — the 230(c)(1) protection doesn’t go beyond that. And the theory of protecting the — the website from that was that the wrong is essentially done by the person who makes the post, the website at most allows the harm to continue. And what we’re talking about when we’re talking about the — the website’s own choices are affirmative acts by the website, not simply allowing third-party material to stay on the platform.”
The justices explored two hypothetical extremes to work out what Section 230 should reasonably protect: at one end, platforms whose algorithms deliberately promote illegal content would keep their immunity; at the other, any algorithmic recommendation at all would lose it.
“Let’s assume we’re looking for a line,” Justice Sonia Sotomayor said.
To add to the confusion, Schnapper repeatedly referred to the platform’s algorithmic recommendations as “thumbnails,” the term more commonly used for YouTube’s video preview images.
Some justices cautioned that stripping Section 230 protection from algorithmic recommendations would sweep in search engines, which also rank and surface results algorithmically.
“So even the straight search engine could be liable for their prioritization system?” Kagan asked.
The justices also worried about the second-order effects of narrowing Section 230.
“You’re asking us right now to make a very precise predictive judgment that, don’t worry about it, it’s really not that bad,” Justice Brett Kavanaugh said. “I don’t think that’s true, and I don’t know how to evaluate that.”
Those reservations were nearly universal among the justices, who did not appear eager to shake up the status quo—a perspective we can expect to see again during Wednesday’s oral arguments, which will stream live.
“We’re talking about significant liability in litigation and up to this point, people have focused on the [Anti-terrorism Act] because that’s the one point that’s at issue here,” Chief Justice John Roberts said.
“But I suspect there would be many, many more defamation and discrimination suits… Terrorism support seems like a small part of everything else. Why not?”