The Justice Department says social media companies such as YouTube and Twitter shouldn’t be held liable for hosting other people’s content, but also says the tech giants may cross lines when they insert themselves into the situation by deciding what to promote.
The department laid out its position in briefs to the Supreme Court in two cases to be heard next week involving claims that the companies should have done more to prevent the growth of the Islamic State, or ISIS.
The cases were brought by relatives of people who died in terrorist attacks for which ISIS has claimed responsibility.
Tech companies argue they are protected by Section 230 of the Communications Decency Act, which in general immunizes them from liability for content someone else has posted.
But the Biden administration says that law doesn’t offer an absolute shield.
“Section 230(c)(1) protects an online platform from claims premised on its dissemination of third-party speech, but the statute does not immunize a platform’s other conduct, even if that conduct involves the solicitation or presentation of third-party content,” Acting Solicitor General Brian Fletcher wrote.
The two cases present different legal questions for the justices, though both involve the Antiterrorism Act of 1990, which gives Americans injured by acts of international terrorism a right to sue for damages.
In the case against Google, the parent of video-sharing service YouTube, relatives of Nohemi Gonzalez, an American who was killed in the November 2015 terrorist attacks in Paris, say ISIS posted content on YouTube, that the company wasn’t diligent enough in removing it, and that in some instances YouTube would actually “recommend” ISIS videos to users.
That meant YouTube “assists ISIS in spreading its message,” the Gonzalez family argued in their brief.
The other case was brought by U.S. relatives of Nawras Alassaf, a Jordanian who was killed when a gunman carried out an ISIS-inspired mass shooting at a nightclub in Istanbul, killing 39 people and wounding 69 others. The family said Twitter, Facebook and Google aided and abetted ISIS by hosting its content and, in some cases, deriving ad revenue from it.
An appeals court sided with Google in the YouTube case, but ruled against Twitter and allowed the second case to proceed.
The Justice Department said both rulings got it wrong in some respects and suggested the high court keep Section 230 protections in place but make clear that a technology company becomes more than a host when it actively seeks to promote content.
That promotion is generally done through algorithms that suggest content to users.
“When YouTube presents a user with a video she did not ask to see, it implicitly tells the user that she ‘will be interested in’ that content ‘based on the video and account information and characteristics,’ ” the solicitor general told the high court.
The meaning of Section 230
Court watchers wondered whether the cases would give the justices a chance to upend nearly three decades of protections major tech companies have enjoyed.
Curt Levey, president of the Committee for Justice, a conservative advocacy nonprofit, said the Justice Department’s approach offered a different path.
“Clearly, whoever was in charge of deciding positions here had an agenda of splitting the baby,” Mr. Levey said. “A middle ground also kind of makes sense if you want to give the justices a chance to do something about Big Tech without going all the way.”
The case turns on what Section 230 means.
Bob Nelon, a partner at the law firm Hall Estill, said the original sponsors of the legislation — Sen. Ron Wyden, Oregon Democrat, and former Rep. Christopher Cox, California Republican — filed a brief with the justices noting they considered algorithms when writing the law.
“As Google and others pointed out in briefs, Congress has amended the statute several times since its original adoption in 1996, knowing full well how platforms use algorithms and not expressly removing their immunity,” Mr. Nelon said.
“The platforms argue, with some credence, that if they can’t use algorithms, then either the internet becomes an unmanageable mess of an incomprehensible mass of information, in which users have no ability to find the information they seek, or the platforms have to limit what is posted to only that which is the mildest content that would not risk liability because it could not be argued to offend anyone,” he said.
At least four justices had to vote to take up the cases. Although the vote was not revealed, Justice Clarence Thomas was likely among them. He had signaled in a 2020 statement that he was looking for “an appropriate case” to delve into Section 230.
He said that section of law was written in 1996, at the “dawn of the dot-com era,” when the big legal question was whether chat rooms could be held liable for what users posted. Section 230 was meant to protect companies by making clear they weren’t the actual publishers of third-party content; as long as they didn’t knowingly allow illegal material, they were shielded from liability.
But Justice Thomas said in the years since, lower courts have expanded that into “sweeping immunity” for tech companies and it was time the justices took a look.
The cases come to the justices at a time when the tech giants are under intense scrutiny for their handling of deeply divisive political debates, including the COVID-19 pandemic and the 2020 election.
Both Democrats and Republicans on Capitol Hill have called for updating Section 230, but there’s little agreement on how to do it.
States are moving ahead, however. Texas and Florida have enacted laws that would allow individuals or the states’ attorneys general to sue large social media platforms for squelching a viewpoint. Litigants have already petitioned the justices to take up those cases.
The high court has asked the federal government to weigh in on those cases as well, while the justices decide whether to grant review.
Mr. Levey said those state cases could get swept up in the question of Section 230 liability the justices will decide this term.
“The 230 case will give us some hint about what the court will later do in the Texas and Florida cases,” he said. “Are the justices as unhappy with the status quo as the vast majority of the public is?”
A ruling from the high court in both the Google and Twitter cases is expected by the end of June.