It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge
This report, the first in a two-part series, articulates the connection between surveillance-based business models and the health of democracy. Drawing on Ranking Digital Rights’s extensive research on corporate policies and digital rights, we examine two overarching types of algorithms, give examples of how these technologies are used both to propagate and to prohibit different forms of online speech (including targeted ads), and show how they can cause or catalyze social harm, particularly in the context of the 2020 U.S. election. The report also highlights what we don’t know about these systems and calls on companies to be far more transparent about how they work.
Key recommendations for corporate transparency
- Companies’ rules governing content shared by users must be clearly explained and consistent with established human rights standards for freedom of expression.
- Companies should explain the purpose of their content-shaping algorithms and the variables that influence them so that users and the public can understand the forces that cause certain kinds of content to proliferate, and other kinds to disappear.
- Companies should enable users to decide whether and how algorithms can shape their online experience.
- Companies should be transparent about the ad targeting systems that determine who can pay to influence users.
- Companies should publish their rules governing advertising content (what can be advertised, how it can be displayed, what language and images are not permitted).
- All company rules governing paid and organic user-generated content must be enforced fairly according to a transparent process.
- People whose speech is restricted must have an opportunity to appeal.
- Companies must regularly publish transparency reports with detailed information about the outcomes of their rule-enforcement actions.