When Limiting Online Speech to Curb Violence, We Should Be Careful

Two key strategies have emerged to hold online forums responsible for violence: deplatforming and increasing the liability imposed on internet intermediaries by changing Section 230 of the Communications Decency Act (CDA). Both strategies are notable because they are not directly aimed at the perpetrators of violence, or even at others who are participating in the hateful forums. They are instead aimed at the chain of companies or nonprofits that host the speech of others. For either approach, there is reason to tread carefully. Both strategies have surface appeal as responses to hateful speech. But once the power to silence people has been unleashed, whether through public pressure or threats of lawsuits, it doesn't flow in only one direction. In our 30 years of helping people make their voices heard online at the Electronic Frontier Foundation, we have seen how censorship reinforces power far more than it protects the powerless.

In response to ongoing issues with the major hosts of user-generated content, EFF helped write and promulgate the Santa Clara Principles in May 2018 to try to outline some basic transparency and due process standards that those companies should implement when they directly host user-generated content. Deplatforming and eliminating Section 230 both satisfy a craving to do something, to hold someone or something responsible. But make no mistake: Both carry great risks if we want the internet to remain a place where powerful entities cannot easily silence their less powerful critics.

[Cindy Cohn is the executive director of the Electronic Frontier Foundation]
