Supreme Court to Hear Case That Targets a Legal Shield of Tech Giants



Nohemi Gonzalez, a 23-year-old California college student, was studying abroad in Paris in November 2015 when she was among the 130 people killed in a coordinated series of terrorist attacks throughout the city.

The next year, her father sued Google and other tech companies. He accused the firms of spreading content that radicalized users into becoming terrorists, and said they were therefore legally responsible for the harm inflicted on Ms. Gonzalez’s family. Her mother, stepfather and brothers eventually joined the lawsuit, too.

Their claims will be heard in the U.S. Supreme Court on Tuesday. And their lawsuit, with Google now the sole defendant, could have seismic ramifications for the social media platforms that have become conduits of communication, commerce and culture for billions of people.

Their suit takes aim at a federal law, Section 230 of the Communications Decency Act, which shields online platforms like Facebook, Instagram and Google’s YouTube from lawsuits over content posted by their users or their decisions to take content down. The case gives the Supreme Court’s justices the opportunity to narrow how that legal shield is applied or to gut it entirely, potentially opening up the companies to liability for what users post and to lawsuits over libel, discriminatory advertising and extremist propaganda.

A day after hearing the Gonzalez v. Google case, the court is scheduled to hear a second tech lawsuit, Twitter v. Taamneh, over whether Twitter has contributed to terrorism.

What the Supreme Court ultimately decides in the two cases will add to a pitched battle around the world over how to regulate online speech. Many governments say that social networks have become fertile ground for hate speech and misinformation. Some have required the platforms to take down those posts. But in the United States, the First Amendment makes it difficult for Congress to do the same.

Critics of Section 230 say that it lets the tech companies avoid responsibility for harms facilitated on their watch. But supporters counter that without the legal shield, the companies will take down more content than ever to avoid lawsuits, stifling free expression.


The Supreme Court case “can have an impact on how those companies do business and how we interact with the internet, too,” said Hany Farid, a professor at the School of Information at the University of California, Berkeley. He filed a brief with the Supreme Court supporting the Gonzalez family members who are suing Google.

Ms. Gonzalez, a first-generation college student who was studying design at California State University, Long Beach, was killed while out with friends during the Paris attacks in 2015. The Islamic State later claimed responsibility. She was the only American killed.

Her father, Reynaldo Gonzalez, sued Google, Facebook and Twitter in 2016, arguing the platforms were spreading extremist content. That included propaganda, messages from the Islamic State’s leaders and videos of graphic violence, he said. Citing media reports, the lawsuit mentioned specific videos that showed footage of Islamic State fighters in the field and updates from a media outlet affiliated with the group. The online platforms didn’t do enough to keep the terrorist group off their sites, the lawsuit said.


YouTube and other platforms say they screen for such videos and take down many of them. But in 2018, research that was based on a tool developed by Mr. Farid found that some Islamic State videos stayed up for hours, including one that encouraged violent attacks in Western nations.

Facebook and Twitter were removed as defendants in the lawsuit in 2017, the same year Ms. Gonzalez’s mother, stepfather and siblings joined as plaintiffs. Last year, a federal appeals court ruled that Google did not have to face the Gonzalez family members’ claims because the company was protected by Section 230.

In May, lawyers for Ms. Gonzalez’s family asked the Supreme Court to step in. By using algorithms to recommend content to users, the lawyers argued, YouTube was essentially engaging in its own form of speech, which was not protected by Section 230.

Ms. Gonzalez’s father and the plaintiffs in the Twitter case declined to comment through their lawyer, Keith Altman. Mr. Altman said that courts had “pushed the limits” of the Section 230 legal shield to the point that it was “unrecognizable.” A lawyer for Ms. Gonzalez’s other family members did not respond to a request for comment. The lawyer who will argue both cases before the Supreme Court, Eric Schnapper, also declined to comment.

Google has denied the Gonzalez family’s arguments about Section 230. It has said that the family’s claims that Google supported terrorism are based on “threadbare assertions” and “speculative” arguments.

In Congress, efforts to reform Section 230 have stalled. Republicans, spurred by accusations that internet companies are more likely to take down conservative posts, have proposed changes to the law. Democrats have said the platforms should take down more posts that spread misinformation or hate speech.

Instead, courts have begun to test the limits of how the law should be applied.

In one case in 2021, a federal appeals court in California ruled that Snap, the parent of Snapchat, could not use Section 230 to dodge a lawsuit involving three people who died in a car crash after using a Snapchat filter that displayed a user’s speed.

Last year, a federal judge in California said that Apple, Google and Meta, Facebook’s parent, could not use the legal shield to avoid some claims from consumers who said they were harmed by casino apps. A federal judge in Oregon also ruled that the statute didn’t shield Omegle, the chat site that connects users at random, from a lawsuit that said an 11-year-old girl met a predator through its service.

Tech companies say it will be devastating if the Supreme Court undercuts Section 230. Halimah DeLaine Prado, Google’s general counsel, said in an interview in December that the protections had been “crucial to allowing not just Google but the internet to flourish in its infancy, to actually become a major part of the broader U.S. economy.”

“It’s critically important that it stands as it is,” she said.

A spokesman for Meta pointed to a blog post where the company’s top lawyer said the case “could make it much harder for millions of online companies like Meta to provide the type of services that people enjoy using every day.”

Twitter did not respond to a request for comment.

Activists have raised concerns that changes to the law could cause the platforms to crack down on content posted by vulnerable people. In 2018, a new law ended the protections of Section 230 when platforms knowingly facilitated sex trafficking. The activists say that caused sites to take down content from adult sex workers and posts about L.G.B.T.Q. people.

The Gonzalez case has also attracted interest from the Justice Department. In a December brief, the department told the Supreme Court it believed that Section 230 “does not bar claims based on YouTube’s alleged targeted recommendations of ISIS content.” The White House has said the legal shield should be removed.

Mr. Farid acknowledged it was possible the court could gut the Section 230 protections, leading to unintended consequences. But he noted that the social networks already comply with laws governing how they treat certain types of content, like German restrictions on digital hate speech. He said they could navigate narrow changes to the legal shield, too.

“The companies figured it out,” he said.