Benton Institute for Broadband & Society

Friday, October 8, 2021

Weekly Digest

The Facebook Files and the Future of Social Media

 You’re reading the Benton Institute for Broadband & Society’s Weekly Digest, a recap of the biggest (or most overlooked) broadband stories of the week. The digest is delivered via e-mail each Friday.

Round-Up for the Week of October 4-8, 2021

Kevin Taglang

We might be tempted to remember this as Mark and the terrible, horrible, no good, very bad week: a series of damaging articles in the Wall Street Journal, a whistleblower testifying before Congress, and a massive outage of the platform. But Facebook's problems date back much further than this week. The ramifications could last long into the future—and impact much more than the social media giant.

The Facebook Files

On September 13, the Wall Street Journal began publishing a series of articles it dubbed The Facebook Files, exposing flaws in the company's platforms, flaws the company knows about but has done little to address.

XCheck

The first article looked at a Facebook program known as “cross check” or “XCheck,” which exempts high-profile users from some or all of its rules. Initially intended as a quality-control measure for actions taken against high-profile accounts (including celebrities, politicians, and journalists), today XCheck shields millions of VIP users from the company’s normal enforcement process. XCheck has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users. The program covers pretty much anyone regularly in the media or with a substantial online following: film stars, cable talk-show hosts, academics, and online personalities.

In practice, most of the content flagged by the XCheck system faces no subsequent review. Even when the company does review the material, enforcement delays mean content that should have been prohibited can spread to large audiences. In 2020 alone, XCheck allowed posts that violated Facebook's rules to be viewed at least 16.4 billion times before they were removed.

A Toxic Environment for Teen Girls

On September 14, Georgia Wells, Jeff Horwitz, and Deepa Seetharaman reported that Facebook has been conducting studies into how Instagram, its photo-sharing app, affects its millions of young users. Repeatedly, the company’s researchers found that Instagram is harmful for a sizable percentage of them, most notably teenage girls:

  • Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse. Comparisons on Instagram can change how young women view and describe themselves.
  • “We make body image issues worse for one in three teen girls,” one internal slide stated.
  • Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.
  • Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram.

When told of Facebook’s internal research, Jean Twenge, a professor of psychology at San Diego State University who has published research finding that social media is harmful for some kids, said it was a potential turning point in the discussion about how social media affects teens. “If you believe that R.J. Reynolds should have been more truthful about the link between smoking and lung cancer, then you should probably believe that Facebook should be more upfront about links to depression among teen girls,” she said.

Facebook Targets Preteens

“Why do we care about tweens? They are a valuable but untapped audience.”

Facebook has made what it called “big bets” on designing products that would appeal to preteens across its services. Spurred by fear that Facebook could lose a new generation of users critical to its future, teams of Facebook employees have for years been laying plans to attract preteens that go beyond what is publicly known, Wells and Horwitz reported on September 28. Facebook has tried to understand which products might resonate with children and “tweens” (ages 10 through 12), how these young people view competitors’ apps, and what concerns their parents.

Federal privacy law forbids collecting data on children under 13 without parental consent, and lawmakers have criticized tech companies for not doing more to protect kids online from predators and harmful content.

Rewarding Outrage

In 2018, Facebook changed the algorithm behind its central feature, the News Feed, a constantly updated, customized scroll of friends’ posts and links to news stories. The News Feed accounts for the majority of the time Facebook’s nearly three billion users spend on the platform. The company sells that user attention to advertisers, both on Facebook and its sister platform Instagram, accounting for nearly all of its $86 billion in revenue in 2020.

But Facebook researchers discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism. That tactic produced high levels of comments and reactions that translated into success on Facebook. The researchers concluded that the new algorithm’s heavy weighting of reshared material in Facebook's News Feed made the angry voices louder. “Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” researchers noted.
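To see why such a weighting can amplify outrage, consider a purely illustrative sketch in Python. The weights below are invented for this example; the reporting describes heavy weighting of comments, reactions, and reshares, but Facebook's actual ranking formula has not been published in full:

    # Toy illustration of engagement-weighted feed ranking.
    # All weights are hypothetical, chosen only to mimic the reported
    # pattern: comments and reshares count far more than likes.
    def rank_score(likes: int, comments: int, reshares: int) -> float:
        return 1.0 * likes + 15.0 * comments + 30.0 * reshares

    # A calm post: widely liked, lightly discussed.
    calm = rank_score(likes=1000, comments=20, reshares=10)     # 1600.0
    # An outrage-bait post: fewer likes, but heated comments and reshares.
    outrage = rank_score(likes=300, comments=150, reshares=80)  # 4950.0

    # The angrier post outranks the calmer one despite a smaller audience
    # of approving users, which is the dynamic Facebook's researchers flagged.
    assert outrage > calm

Under any weighting like this, publishers quickly learn that provoking comments and reshares is the surest path to distribution.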

"Our approach has had unhealthy side effects on important slices of public content, such as politics and news."

Data scientists on an integrity team—whose job is to improve the quality and trustworthiness of content on the platform—worked on a number of potential changes to curb the tendency of the overhauled algorithm to reward outrage and lies. Facebook founder Mark Zuckerberg resisted some of the proposed fixes because he worried they might hurt the company’s main objective: making users engage more with Facebook.

Inadequate or No Response to Illegal Activities

Facebook employees have raised alarms about how its platforms are used in some developing countries: recruiting, training, and paying hit men in Mexico; human trafficking in the Middle East; incitement of violence against ethnic minorities in Ethiopia; organ selling; pornography; the quelling of political dissent.

In many instances, the company’s response has been inadequate or nonexistent.

When problems have surfaced publicly, Facebook has said it addressed them by taking down offending posts. But it hasn’t fixed the systems that allowed offenders to repeat the bad behavior. Instead, priority is given to retaining users, helping business partners, and at times placating authoritarian governments, whose support Facebook sometimes needs to operate within their borders.

Facebook treats harm in developing countries as “simply the cost of doing business” in those places, said Brian Boland, a former Facebook vice president who oversaw partnerships with internet providers in Africa and Asia before resigning at the end of 2020. Facebook has focused its safety efforts on wealthier markets with powerful governments and media institutions, he said, even as it has turned to poorer countries for user growth.

Sowing Doubt in Covid-19 Vaccines

Although Mark Zuckerberg set an ambitious goal of using Facebook's formidable resources to push 50 million people toward Covid-19 vaccines, comments on the platform were filled with anti-vaccine rhetoric, ranging from personal objections to debunked falsehoods and conspiracy theories. Even authoritative sources of vaccine information were becoming "cesspools of anti-vaccine comments." Anti-vaccine activists flooded the network with what Facebook calls “barrier to vaccination” content.

The wave of negative comments worried global health institutions, including the World Health Organization and UNICEF. President Biden said the falsehoods were “killing people.”

In August 2020, a report by the advocacy group Avaaz concluded that the top 10 producers of what the group called “health misinformation” were garnering almost four times as many estimated views on Facebook as the top 10 sources of authoritative information. Facebook needed to take harsher measures to beat back “prolific” networks of Covid misinformation purveyors, Avaaz warned. However, Mark Zuckerberg wasn’t ready to embrace a more interventionist approach. While he disagreed with anti-vaccine activists, his company was committed to removing only content that health officials said posed an imminent threat.

Whistleblower Visits the Senate

This week, the Senate Commerce Committee's Subcommittee on Consumer Protection, Product Safety, and Data Security convened a hearing to hear from former Facebook employee Frances Haugen, who provided the Wall Street Journal with the internal documents used in The Facebook Files series.

In prepared testimony, Haugen said:

During my time at Facebook, first working as the lead product manager for Civic Misinformation and later on Counter-Espionage, I saw that Facebook repeatedly encountered conflicts between its own profits and our safety. Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism, and polarization—and undermining societies around the world. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. In other cases, their profit optimizing machine is generating self-harm and self-hate—especially for vulnerable groups, like teenage girls. These problems have been confirmed repeatedly by Facebook’s own internal research. This is not simply a matter of some social media users being angry or unstable. Facebook became a $1 trillion company by paying for its profits with our safety, including the safety of our children. And that is unacceptable.

Haugen went on to say that as long as Facebook is operating in the dark, it is accountable to no one and will continue to make choices that go against the common good. She suggested reforming Section 230 of the Communications Decency Act to strip social media companies of their immunity from lawsuits over the decisions they make about how their algorithms promote certain content. “[Platforms] have 100 percent control over their algorithms,” she said. “Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety.”

SEC Complaints

Haugen has also filed complaints against Facebook with the Securities and Exchange Commission. Her lawyers state that Haugen is "disclosing original evidence showing that Facebook ... has, for years past and ongoing, violated U.S. securities laws by making material misrepresentations and omissions in statements to investors and prospective investors, including, inter alia, through filings with the SEC, testimony to Congress, online statements and media stories."

Among the allegations in the SEC filings are claims that Facebook and Instagram were aware in 2019 that the platforms were being used to "promote human trafficking and domestic servitude." The filings also allege Facebook "failed to deploy internally-recommended or lasting counter-measures" to combat misinformation and violent extremism related to the 2020 election and January 6 insurrection.

What's Next?

The coverage we found this week noted that Haugen’s appearance stood out not only for the inside look it offered into Facebook but also for the way she united Republican and Democratic lawmakers around tackling the issue of the platform’s harm to teenagers. Some senators called her testimony a “Big Tobacco” moment for the technology industry. The lawmakers said Haugen’s testimony, and the thousands of pages of documents she had gathered from the company and then leaked, showed that Facebook’s top executives had misled the public and could not be trusted.

Republican and Democratic lawmakers at the hearing renewed their calls for regulation, including stronger privacy and competition laws, special online protections for children, and tougher accountability for the platforms. One idea that got a particular boost was requiring more visibility into social media data, as well as into the algorithms that shape users’ experiences.

“I think the time has come for action, and I think you are the catalyst for that action,” Sen. Amy Klobuchar (D-MN) told Haugen during the hearing. 

“I would simply say, let’s get to work,” said Sen. John Thune (R-SD), who has sponsored several measures on algorithm transparency. “We’ve got some things we can do here.”

There seems to be consensus that something needs to be done. But is there consensus on what needs to be done?

Sen. Klobuchar cosponsored the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act to reform Section 230 and allow social media companies to be held accountable for enabling cyber-stalking, targeted harassment, and discrimination on their platforms. The SAFE TECH Act would make clear that Section 230:

  • Doesn’t apply to ads or other paid content—ensuring that platforms cannot continue to profit as their services are used to target vulnerable consumers with ads enabling frauds and scams;
  • Doesn’t bar injunctive relief—allowing victims to seek court orders where misuse of a provider’s services is likely to cause irreparable harm; 
  • Doesn’t impair enforcement of civil rights laws—maintaining the vital and hard-fought protections from discrimination even when activities or services are mediated by internet platforms; 
  • Doesn’t interfere with laws that address stalking/cyberstalking or harassment and intimidation on the basis of protected classes—ensuring that victims of abuse and targeted harassment can hold platforms accountable when they directly enable harmful activity;
  • Doesn’t bar wrongful death actions—allowing the family of a decedent to bring suit against platforms where they may have directly contributed to a loss of life;
  • Doesn’t bar suits under the Alien Tort Claims Act—potentially allowing victims of platform-enabled human rights violations abroad (like the survivors of the Rohingya genocide) to seek redress in U.S. courts against U.S.-based platforms.

Sen. Klobuchar also introduced the Health Misinformation Act, which would create an exception to Section 230’s liability shield for platforms whose algorithms promote health misinformation during an existing public health emergency.

Thune's approach has been through the introduction of the Platform Accountability and Consumer Transparency Act, or PACT Act. It would require internet platforms like Facebook to make quarterly reports to the public outlining material they’ve removed from their sites or chosen to deemphasize. Sites would also be required to provide an easily digestible disclosure of their content moderation practices for users. And they would be required to explain their decisions to remove material to consumers.

In the House, the Judiciary Committee in 2019 launched an investigation of digital markets seeking to: (1) document competition problems in digital markets; (2) examine whether dominant firms are engaging in anticompetitive conduct; and (3) assess whether existing antitrust laws, competition policies, and current enforcement levels are adequate to address these issues.

The investigation concluded that Facebook has monopoly power in the market for social networking and that monopoly power is firmly entrenched and unlikely to be eroded by competitive pressure from new entrants or existing firms. In the absence of competition, the committee found, Facebook’s quality has deteriorated over time, resulting in worse privacy protections for its users and a dramatic rise in misinformation on its platform.

The $3.5 trillion economic package now before Congress includes $1 billion for the Federal Trade Commission to create a new digital-focused division that would police privacy violations, cybersecurity incidents, and other online abuses.

House Judiciary Republicans have their own agenda. Their plan is to make it easier to sue Big Tech companies, hold them accountable for censorship, and make platform moderation decisions more transparent.

Outside of Congress, there have been calls to create a new federal agency designed to deal with digital issues. Last summer, Tom Wheeler, Phil Verveer, and Gene Kimmelman suggested a digital platform agency be built around three concepts:

  1. Oversight of digital platform market activity on the basis of risk management rather than micromanagement; this means targeted remedies focused on market outcomes and thereby avoids rigid utility-style regulation,
  2. Restoration of common law principles of a duty of care and a duty to deal as the underpinning of the agency's authority, and
  3. Delivery of these results via an agency that works with the platform companies to develop enforceable behavioral codes while retaining the authority to act independently should that become necessary. 

Public Knowledge Senior Vice President Harold Feld wrote The Case for the Digital Platform Act: Market Structure and Regulation of Digital Platforms, a book that provides a framework for the ongoing debate on the regulation of digital platforms.

In addition to the U.S., there have been intensified calls in Europe for new regulations aimed at Facebook and other Silicon Valley giants, proposals considered by many to be among the most stringent and far-reaching in the world.

  • One of the proposals, the Digital Services Act, includes transparency requirements that Haugen called for during her testimony, requiring Facebook and other large tech platforms to disclose details to regulators and outside researchers about their services, algorithms, and content moderation practices. The draft law could also force Facebook and other tech giants to conduct annual risk assessments in areas such as the spread of misinformation and hateful content.
  • Another E.U. proposal, the Digital Markets Act, would put new competition regulation in place for the biggest tech platforms, including restricting their ability to use their dominance with one product to gain an edge on rivals in another product category.

A Continuing Problem

The harms of Facebook have been well documented. In fact, my colleague Robbie McBeath wrote a three-part series looking at Facebook's influence on our democratic discourse and elections. The series is now three years old, but many of the problems remain or have gotten worse, and solutions do not seem any closer to being realized.

The (digital) beat goes on.

Upcoming Events

Oct 12—Building on Broadband: Inspiring Progress (Blandin Foundation)

Oct 14—Task Force for Reviewing the Connectivity and Technology Needs of Precision Agriculture in the United States (FCC)

Oct 18—Advancing Digital Inclusion Services in Native Communities (Association of Tribal Archives, Libraries and Museums)

Oct 19—39th Annual Parker Lecture & Awards Ceremony (United Church of Christ)

Oct 20—Connecting Minority Communities Pilot Program Webinar (NTIA)

Oct 21—Connecting Minority Communities Pilot Program Webinar (NTIA)

Oct 21—Internet Policy Wars 3.0 / Is The Past A Prologue To The Fight For Web3? (Congressional Internet Caucus Academy)

Oct 22—The Decade of Digital Inclusion (Marconi Society)

Oct 26—October 2021 Open FCC Meeting

 

The Benton Institute for Broadband & Society is a non-profit organization dedicated to ensuring that all people in the U.S. have access to competitive, High-Performance Broadband regardless of where they live or who they are. We believe communication policy - rooted in the values of access, equity, and diversity - has the power to deliver new opportunities and strengthen communities.


© Benton Institute for Broadband & Society 2021. Redistribution of this email publication - both internally and externally - is encouraged if it includes this copyright statement.


For subscribe/unsubscribe info, please email headlinesATbentonDOTorg

Kevin Taglang
Executive Editor, Communications-related Headlines
Benton Institute
for Broadband & Society
1041 Ridge Rd, Unit 214
Wilmette, IL 60091
847-328-3040
headlines AT benton DOT org


Broadband Delivers Opportunities and Strengthens Communities

