    Supreme Court rules for Google, Twitter on terror-related content

    The Supreme Court building is seen on Capitol Hill in Washington, Jan. 10, 2023. (AP Photo/Patrick Semansky, File)
    Reynaldo Gonzalez breaks down while remembering his daughter Nohemi Gonzalez, who was killed in the November 2015 Paris attacks, at her funeral at Calvary Chapel in Downey, Calif., Dec. 4, 2015. The Supreme Court on Thursday, May 18, 2023, sidestepped a case against Google that might have allowed more lawsuits against social media companies, returning to a lower court the Gonzalez family's suit over YouTube videos they said helped attract and radicalize ISIS recruits. Google owns YouTube. (Genaro Molina/Los Angeles Times via AP, Pool, File)

    WASHINGTON - The Supreme Court ruled Thursday that the families of terrorism victims had not proved Google, Twitter and Facebook helped foster attacks on their loved ones, and handed the tech industry a broader victory by declining to weigh in on a protective internet law at the center of the debate over social media regulation.

    The families "never allege that, after defendants established their platforms, they gave ISIS any special treatment or words of encouragement," Justice Clarence Thomas wrote for a unanimous court. "Nor is there reason to think that defendants selected or took any action at all with respect to ISIS' content (except, perhaps, blocking some of it)."

    The case involved allegations against Twitter, Facebook and Google, which owns YouTube. The court adopted similar reasoning in a separate lawsuit against Google filed by a different family.

    The narrowly focused rulings sidestepped requests to limit Section 230, a legal provision that protects social media platforms from lawsuits over offensive, harmful or violent content posted by their users, regardless of whether companies incentivize or promote those posts. The statute has emerged as a lightning rod in the politically polarized debate over the future of online speech.

    Tech companies and their surrogates celebrated the ruling, which followed extensive lobbying and advocacy campaigns to defend Section 230 in Washington. Changes to the law, they said, could open the floodgates to litigation that would quash innovation and have wide-ranging effects on the technology underlying almost every interaction people have online, from innocuous song suggestions on Spotify to prompts to watch videos about conspiracy theories on YouTube.

    The claim against Google specifically focused on whether Section 230 protects recommendation algorithms. The family of Nohemi Gonzalez argued that Google-owned YouTube was responsible for the 23-year-old exchange student's death in Paris at the hands of ISIS gunmen because the platform's recommendations served as a recruiting tool for the terrorist group.

    "Countless companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result," Google general counsel Halimah DeLaine Prado said in a statement. "We'll continue our work to safeguard free expression online, combat harmful content, and support businesses and creators who benefit from the internet."

    In the Twitter v. Taamneh case, American relatives of Nawras Alassaf said the company failed to properly police its platform for Islamic State-related accounts in advance of a Jan. 1, 2017, attack at the Reina nightclub in Turkey that killed Alassaf and 38 others.

    The relatives in both the Taamneh and Gonzalez cases based their lawsuits on the Anti-Terrorism Act, which imposes civil liability for assisting a terrorist attack. At issue was whether the companies provided substantial assistance to the terrorist group.

    But Thomas, writing in the Twitter case, said the link was too attenuated.

    "As alleged by plaintiffs, defendants designed virtual platforms and knowingly failed to do 'enough' to remove ISIS-affiliated users and ISIS related content - out of hundreds of millions of users worldwide and an immense ocean of content - from their platforms," he wrote. "Yet, plaintiffs have failed to allege that defendants intentionally provided any substantial aid to the Reina attack or otherwise consciously participated in the Reina attack - much less that defendants so pervasively and systemically assisted ISIS as to render them liable for every ISIS attack."

    Thomas also made clear that recommendation algorithms that steer users, including those seeking ISIS content, toward related material are not evidence of complicity by the media companies.

    "As presented here, the algorithms appear agnostic as to the nature of the content, matching any content (including ISIS' content) with any user who is more likely to view that content," Thomas wrote. "The fact that these algorithms matched some ISIS content with some users thus does not convert defendants' passive assistance into active abetting."

    The Supreme Court's action reversed a federal appeals court decision that had let the Taamneh suit go forward. A lawyer for the Gonzalez family said they would consider a suggestion in the opinion that the lawsuit could be amended to try to comply with the ruling.

    Section 230 has been denounced by politicians from both parties. Lawmakers in Congress have spent years debating whether the 1996 law needs to be updated to address their fears about social media. But most bills that would make comprehensive changes have languished amid partisan divisions.

    Democrats, wary of the ways social media has been weaponized to spread falsehoods about elections and public health, want to change the provision to ensure that tech companies have more responsibility for harmful and offensive content on their websites. Republicans are concerned that Section 230 protects companies from lawsuits over decisions to remove content or suspend accounts, especially since the companies took the historic step of suspending President Donald Trump and individuals involved in the Jan. 6, 2021, attack on the U.S. Capitol. (Meta, YouTube and Twitter have reinstated the former president's accounts in recent months.)

    It was clear at oral arguments that the justices were reluctant to make significant changes to the law. "We're a court," Justice Elena Kagan said at the time, adding that she and her colleagues "are not like the nine greatest experts on the internet."

    Free speech advocates argued that if the court ruled in favor of the plaintiffs, social media companies would have to suppress constitutionally protected speech, adopting blunt content moderation tools that would restrict discussion about critical topics. They pointed to mistakes tech companies already make in enforcing existing rules, citing a 2021 incident in which Instagram mistakenly removed content about a mosque because its systems conflated that content with a designation the company uses for terrorist organizations.

    "With this decision, free speech online lives to fight another day," Patrick Toomey, deputy director of ACLU's National Security Project, said in a statement.

    The court's decision not to confront Section 230 may increase pressure on elected officials to update the law.

    "The battle will now pass to the Congress . . . which can no longer hide on the sidelines," said Nitsana Darshan-Leitner, a lawyer for the Gonzalez family. She called on lawmakers to "amend this antiquated statute."

    Sen. John Cornyn, R-Texas, agreed that the rulings put the onus back on legislators.

    "One reason [justices declined to take it up] might be that they want the Congress to do our job," he said. "It's a complex issue, and I hope we take them up on it."

    Both Trump and President Joe Biden have criticized Section 230, at times calling for it to be revoked. Momentum to change the law increased after the Jan. 6 attack, when newly empowered Democrats in the Biden administration and Congress promised to revise the legislation. But despite a flurry of congressional hearings and bills, Section 230 has remained unchanged since 2018, when Trump signed a law that allowed victims and state attorneys general to sue websites that host sex-trafficking ads.

    Sen. Ron Wyden, D-Ore., who co-wrote Section 230 as a member of the House nearly three decades ago and filed a brief in its defense with the Supreme Court, said he appreciated that the court issued "thoughtful rulings that even without Section 230, the plaintiffs would not have won their lawsuits."

    "Despite being unfairly maligned by political and corporate interests that have turned it into a punching bag for everything wrong with the internet, the law ... remains vitally important to allowing users to speak online," Wyden said in a statement.

    Tech industry-funded groups also praised the decisions. Chamber of Progress, which receives funding from Meta, Google and other companies and filed a brief supporting Google in the Gonzalez case, called the ruling in that case an "unambiguous victory for online speech and content moderation."

    "While the Court might once have had an appetite for reinterpreting decades of Internet law, it was clear from oral arguments that changing Section 230's interpretation would create more issues than it would solve," said Jess Miers, a lawyer for the group.

    Even some legal experts who submitted briefs in support of the Gonzalez family said they were pleased with the Supreme Court's opinion. Mary Anne Franks, the president of the Cyber Civil Rights Initiative, had called for the court to more narrowly interpret Section 230, saying lower courts had wrongly concluded it should provide "unconditional immunity from liability no matter how passive [tech companies] remain in the face of even easily preventable and clearly foreseeable harm."

    Thursday's Twitter opinion, she said, shows that such litigation can be decided in the regular course of the law, taking "the wind out of the sails" of industry arguments that companies need a dedicated shield to protect them from "bad" lawsuits.

    Coalition for a Safer Web, a nonprofit group that advocates for policies to remove extreme content from social media and supported the Gonzalez family's lawyers in the case, said the decision "rewards Big Tech for bad conduct." The group expressed skepticism that Congress would change Section 230, noting that major tech companies spend millions of dollars a year on lobbying.

    "Champagne corks are popping off today in Silicon Valley," the nonprofit said in a news release.

    - - -

    The Washington Post's Cristiano Lima in Washington and Gerrit De Vynck in San Francisco contributed to this report.
