    Tech's liability shield under fire: 26 words and what's at stake

    Facebook headquarters in Menlo Park, Calif. (Jeff Chiu/AP Photo)

    Democrats and Republicans in Congress are taking aim at a controversial law that shields internet platforms including Facebook and Twitter from lawsuits over content posted by users.

    The measure -- just 26 words known as Section 230 -- now faces its biggest reckoning since it was included in the Communications Decency Act of 1996. Calls to revise it grew in the months before the November election and intensified after the deadly attack on Congress by then-President Donald Trump's loyalists.

    Trump and his GOP allies claim Section 230 gives companies leeway to censor conservative speech, an assertion he repeated on Sunday at a right-wing gathering in Florida. Democrats accuse the same internet platforms of failing to curb misinformation and hate speech, arguing that Trump's posts about election fraud fueled the Jan. 6 Capitol insurrection.

    Even some on Wall Street are pointing fingers at the shield after a horde of retail investors using online chat forums targeted stocks like GameStop Corp., causing market turbulence.

    While industry lobbyists have urged a cautious approach, a House panel has already summoned the chief executive officers of Facebook, Google and Twitter to testify at a March 25 virtual hearing on misinformation and disinformation on their platforms. Facebook CEO Mark Zuckerberg has called for more regulation of the internet and said he's open to reforming Section 230.

    Still, even with Democrats controlling Congress, any bill would need bipartisan support in the Senate to clear the 60-vote threshold for advancing legislation. That means lawmakers will have to negotiate and compromise at a time when they remain deeply divided.

    New measures to redraw Section 230 are expected in the coming weeks. Here's a guide to the proposals on the table:

    - Hate speech and civil rights:

    SAFE TECH ACT: The Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Health (SAFE TECH) Act was the first Section 230 bill introduced in the Senate this year. Released on Feb. 5 by Democratic Sens. Mark Warner of Virginia, Mazie Hirono of Hawaii, and Amy Klobuchar of Minnesota, the bill has no Republican support to date.

    HIGHLIGHTS: The legislation would hold tech companies liable for content falling within four categories: civil rights, international human rights, antitrust and stalking, harassment or intimidation. It would clarify that companies can be held liable for wrongful death actions, meaning families could sue platforms that may have contributed to a person's death.

    The measure would dramatically change the underlying law to limit companies' liability protections by treating them as a publisher of any paid content on their platforms. That includes advertising that generates rich profits for Google, Twitter and Facebook. It narrows the liability provision to cover only third-party "speech," instead of the catchall term "information" in the original law. It would also allow victims to seek court orders when a company fails to address material that's "likely to cause irreparable harm."

    SUPPORT AND OPPOSITION: The NAACP Legal Defense and Educational Fund and the Anti-Defamation League have backed the bill.

    NetChoice, which represents large tech companies like Facebook and Google, opposes the bill, saying it "guts" Section 230.

    "Not only would the bill chill free speech on the internet, it would also revoke Section 230 protections for all e-commerce marketplaces" such as Etsy Inc., said Carl Szabo, the group's vice president and general counsel, in a statement. "Small sellers across the country would lose access to customers all over the world at a time when entrepreneurs need that access most."

    WHAT'S NEXT: Rep. Yvette Clarke, a Democrat from New York, is working on a more narrowly focused bill known as the Civil Rights Modernization Act. That would amend Section 230 to ensure federal civil rights laws apply to tech companies' targeted advertisements in an effort to stop the spread of hate speech online. Clarke said in an interview she wants to uncover how the platforms promote civil rights violations and ensure they curb hate speech "so it doesn't get to the point of harm to the American people or American institutions." She plans to introduce the measure in several weeks.

    Democratic Representatives Anna Eshoo of California and Tom Malinowski of New Jersey are planning to reintroduce the Protecting Americans from Dangerous Algorithms Act. The bill would remove a platform's liability shield if its algorithm is used to amplify or recommend content that incites hate speech, violence or acts of terrorism. "These companies have shown they won't do the right thing on their own," Eshoo told Bloomberg.

    - Content moderation and consumer rights:

    PACT ACT: The bipartisan Platform Accountability and Consumer Transparency (PACT) Act was introduced in the Senate in June 2020. Sen. Brian Schatz, a Democrat from Hawaii, and Sen. John Thune, a Republican from South Dakota, cosponsored the bill.

    HIGHLIGHTS: It would require "large online platforms" to remove content within 24 hours if notified of a court determination that the content is illegal. Companies would be required to issue quarterly reports, including data on content that's been removed, demonetized, or deprioritized. It would also allow consumers to appeal content-moderation decisions. The legislation would allow the U.S. Justice Department, Federal Trade Commission and state attorneys general to pursue civil lawsuits for online activity.

    SUPPORT AND OPPOSITION: The measure is supported by the Alliance for Safe Online Pharmacies, which works to combat illegal online pharmacies. NetChoice and digital rights group Electronic Frontier Foundation oppose the bill.

    The Internet Association, which represents companies including Amazon.com, Google and Facebook, said it appreciates the bill's effort to promote transparency and accountability in content moderation, but raised concerns about the broad reporting requirements and said the measure should be narrowed to exclude smaller internet companies. The group said the highly detailed requirements would be "extremely burdensome."

    WHAT'S NEXT: The PACT Act is expected to be reintroduced this month, according to a person familiar with the matter.

    In the House, Rep. Jan Schakowsky, D-Ill., is expected to introduce the Online Consumer Protection Act within weeks. As chair of a House Energy and Commerce subcommittee overseeing consumer protection issues, Schakowsky would lead any effort to reform how Section 230 impacts consumer safety. Her measure, which was circulated in draft form last year, would remove liability protections if platforms violate their terms of service and allow for FTC enforcement and consumer lawsuits.

    The bill would require social media companies and online marketplaces to create consumer protection policies that define whether content can be blocked, removed, or modified. The policy would also need to describe how a user will be notified if content is being removed and how to appeal a removal. Schakowsky said her bill would ensure "that consumer rights in the physical world extend to the virtual world."

    - Child exploitation:

    EARN IT ACT: The bipartisan Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, introduced in the Senate in March 2020 by Sens. Richard Blumenthal, D-Conn., and Lindsey Graham, R-S.C., was advanced by the Senate Judiciary Committee last year.

    HIGHLIGHTS: It would allow for state civil and criminal lawsuits as well as federal civil lawsuits if companies advertise, promote, present, distribute or solicit child sexual abuse material. The legislation would also establish a National Commission on Online Child Sexual Exploitation Prevention that would create voluntary best practices for the industry. An amendment last Congress removed the original language that conditioned the liability protection on companies enacting the best practices.

    SUPPORT AND OPPOSITION: The bill is supported by sex trafficking survivor groups, as well as the National Center for Missing & Exploited Children and the National Center on Sexual Exploitation.

    The Internet Association said it supports the goal of ending child exploitation online, but said the bill would "create a harmful lack of coherence" with state laws and said it plans to work with lawmakers on improvements to the bill.

    WHAT'S NEXT: The bill is expected to be reintroduced this Congress, according to a Blumenthal spokesperson. Senator Dick Durbin, from Illinois, has backed the bill. As the new chairman of the Judiciary Committee, Durbin could shepherd it this Congress.

    The Internet Association said that Section 230 strikes a "careful balance" between protecting companies from lawsuits and encouraging them to proactively remove hate and extremist speech online. Removing the protections would create a disincentive for companies to moderate any content for fear of being sued, the group says.

    The group also says legislation often can't keep up with the evolving nature of the internet, and that onerous legal requirements could run startups out of business.

    "Imposing overly prescriptive and burdensome requirements through legislation or regulations will negatively impact the internet ecosystem," the trade group told Congress in testimony last year.

    Still, many of the companies recognize that some change to the measure is inevitable and are prepared to work with lawmakers to help hammer out proposals -- also in the interest of avoiding more draconian measures.
