Online Safety Groups on CDA 230

 
8 Sept 2021

TO: Congressman Frank Pallone
Chairman, House Energy and Commerce Committee

Congresswoman Cathy McMorris Rodgers
Ranking Member, House Energy and Commerce Committee

Congressman Mike Doyle
Chairman, Subcommittee on Communications and Technology

Congressman Robert E. Latta
Ranking Member, Subcommittee on Communications and Technology

Congresswoman Jan Schakowsky
Chair, Subcommittee on Consumer Protection and Commerce

Congressman Gus Bilirakis
Ranking Member, Subcommittee on Consumer Protection and Commerce

Dear Chairman Pallone, Ranking Member McMorris Rodgers, Chairman Doyle, Ranking Member Latta, Chair Schakowsky, and Ranking Member Bilirakis,

As an alliance of organizations committed to public safety, consumer protection, and reducing serious crime, we share your Committee's view that the lack of platform accountability under Section 230 of the Communications Decency Act is putting the public in jeopardy. The time has come for Congress to amend Section 230 and to adopt transparency provisions regarding content moderation. Toward that end, we describe below the principles we hope to see reflected in any such legislation.

Judicial & Public Support. A large part of the problem, as courts increasingly acknowledge, is that "Section 230(c)(1) shelters more activity than Congress envisioned it would." That is why courts are calling on Congress to reconsider "[w]hether social media companies should continue to enjoy immunity for the third-party content they publish."

The public is similarly concerned. Almost two-thirds of Americans polled say the law should be changed to make platforms responsible for content on their sites, and about three-quarters say platforms should collaborate to identify and ban criminals and bad actors. Even technology employees agree: among those surveyed who knew what Section 230 is, 71 percent support reform.

Congress passed Section 230 and overturned Stratton Oakmont v. Prodigy to limit platforms' liability when they do the right thing, not to immunize them when they ignore illegal behavior or even profit from it. Despite this, courts have ruled that the broad language of Section 230 prevents holding platforms liable even when they negligently, recklessly, or knowingly facilitate unlawful activity by their users. That removes the common law duty of reasonable care that ordinarily applies to all businesses, along with the legal incentive platforms would otherwise have to curb harmful behavior. In the process, it denies victims their day in court.

Reform, Not Repeal. Despite the scare-tactic claims of many tech firms and their allies, reform necessitates neither the repeal of Section 230 nor the end of free speech on the Internet. In fact, we agree with Section 230 advocates that the content moderation safe harbor in subsection (c)(2) helps platforms serve as avenues of free expression while combatting toxic and illegal behavior on their services. The problem with Section 230 is not subsection (c)(2), but subsection (c)(1).

We therefore recommend that Congress preserve the subsection (c)(2) safe harbor but restore for platforms the common law duty of care. Congress could do so by requiring platforms to take reasonable steps to combat unlawful behavior as a condition of receiving Section 230’s liability limitations. Such an approach would continue to fix Stratton Oakmont’s disincentive to moderate unlawful behavior while addressing the blank check that Section 230 currently grants platforms to act irresponsibly.

The reasonableness standard does not require platforms to be perfect; it ensures they can continue to host a variety of expression so long as they take meaningful steps to curb unlawful activity.

Restoring the reasonableness standard would also ameliorate competition concerns by ensuring that platforms are once again subject to the same duty of care that applies to their brick-and-mortar rivals.

Forward Looking. We recognize the desire to address specific harms, and many of us support issue-specific bills aimed at today's most pressing problems. But Congress cannot possibly pass a separate Section 230 bill for every conceivable problem or category of crime. Restoring the general duty of reasonable care would help address the innumerable other harms occurring online today, as well as those that may arise in the future.

Flexible, Not One-Size-Fits-All. Not all platforms have the enormous resources of Facebook, YouTube, and Twitter, nor do all platforms host as much unlawful behavior as these tech giants currently do. At the same time, the mere fact that a platform is smaller should not give it license to act irresponsibly. To the victim, platform size is irrelevant.

Conditioning Section 230's liability limitations on taking reasonable steps to combat unlawful activity will ensure courts consider the resources available to a specific platform and the risk of a given harm. Such an approach will also allow platforms to experiment with a variety of solutions, including in combination, and tailor them to the specific needs of their services and users. It also avoids the need for Congress to bless or require specific best practices, which will likely change as technology and the problems of the day evolve.

Promoting Transparency. Platforms should have an obligation to tell people what content and behavior the platforms will allow and prohibit, so users and potential users know what they can do on the service and what they will be exposed to. Congress should therefore adopt transparency provisions requiring platforms to make public:

  • what content the platforms will take down and leave up;

  • how users can file complaints about deviations from those policies;

  • how people can appeal the platforms’ decisions to leave up or take down content under those policies; and

  • information regarding the amount and types of harmful behavior occurring on their services, as well as the number and types of complaints, takedowns, denials of takedown requests, appeals, and appeal results.

A platform that fails to abide by its own terms of service or the transparency requirements in a particular circumstance would lose the Section 230 shield in that circumstance, and could also face liability for breach of contract or for an unfair or deceptive trade practice.

Importantly, the information regarding harmful behavior, complaints, takedowns, denials of takedown requests, and appeals should be made available, subject to appropriate privacy protections, in a manner that governmental and non-governmental organizations can access and analyze. That would enable those organizations to apply their particular expertise to track trends and to help the platforms, law enforcement, and Congress address concerns over time.

The Internet Is Part of the Real World. The mere fact that activity occurs online does not change its capacity to cause real-world harm. All businesses, whether "virtual" or "brick and mortar," should have an obligation to act responsibly, and be held accountable when they don't. Because of Section 230, online platforms escape that obligation. We look forward to working with you to preserve the benefits of Section 230 while fixing the flaws that are causing widespread harm.

Sincerely,

Gretchen Peters
Executive Director, Alliance to Counter Crime Online

On behalf of:

Advocating for You

Alexander Neville Foundation

Athar Project

Center on Illicit Networks & Transnational Organized Crime

Counter Extremism Project

Humane Society Legislative Fund

Lady Freethinker

Liberty Shared

OceansAsia

World Parrot Trust

Victims of Illicit Drugs (VOID)