Principles to Protect Free Expression on the Internet


Section 230 of the Communications Act has been dubbed the "twenty-six words that created the internet," the foundation of interactive free expression online: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This simple piece of legislation provides immunity from liability as a speaker or publisher for providers and users of an "interactive computer service" who host and moderate information provided by third-party users. It applies to major platforms like Twitter and YouTube, newspapers with comment sections, business review sites like Yelp, and every other online service or website that accepts material from users. Public Knowledge hopes to advance a dialogue by introducing a set of principles — guardrails, if you will — for lawmakers and others interested in developing or evaluating proposals to alter Section 230.

  1. Clear Due Process and Transparency: Users should have a clear idea about what content is or is not allowed on the platform, why their content was taken down, and how to avail themselves of a transparent and equitable appeals process with the platform.
  2. Protecting the Voices of Marginalized Communities: Members of marginalized communities are often subjected to harassment online, which in many cases makes them less likely to engage in the kind of speech that Section 230 was meant to protect in the first place. Any Section 230 reform must consider the effect it could have on these voices.
  3. Streamlined Content Moderation Process: Content moderation processes should be clear and concise and should not involve an overly legal process for content moderation decisions.
  4. One Size Does Not Fit All: Outright repeal of Section 230 would undermine the very thing we need most to challenge the dominance of the largest platforms: new market entrants. Policymakers can encourage market entry and promote platform competition by limiting reforms to Section 230 to larger platforms or by providing some accommodation for smaller platforms.
  5. Section 230 and Business Activity: Section 230 does not shield business activities from sensible business regulation, including business activities that stem in some way from user-generated content. Most courts have already reached this conclusion, but this is an area that may require legislative clarification.
  6. Pay to Play: Section 230 was designed to protect user speech, not advertising-based business models. Platforms do not need to be shielded by Section 230 for content they have accepted payment to publish.
  7. Conduct, Not Content: Section 230 has allowed platforms to give voice to many different political issues and movements, including Black Lives Matter, the Christian Coalition, the Arab Spring, and #MeToo. Focusing reforms on platform conduct rather than content allows future speech to flourish while ensuring that platforms adhere to certain guidelines.
  8. Promote User Choices: Policymakers can empower users to move to other platforms, or to create new ones, by requiring interoperability between platforms. This would reduce barriers to data flows, promoting user choice online as well as a user's ability to engage in lawful speech on alternative platforms.
  9. Establish That Any Section 230 Reforms Meant To Address Alleged Harms Actually Have the Ability To Do So: Some reform proposals seek to revoke Section 230 liability protections for platforms without adequately establishing that doing so addresses the very harm lawmakers are trying to prevent. Lawmakers should address the root of the problem and not merely view every problem as a Section 230 problem.
