How Facebook Avoids Accountability

Benton Foundation

Wednesday, November 21, 2018

Weekly Digest


 You’re reading the Benton Foundation’s Weekly Round-up, a recap of the biggest (or most overlooked) telecommunications stories of the week. The round-up is delivered via e-mail each Friday.

Round-Up for the Week of November 19-23, 2018 

Robbie McBeath

On November 14, 2018, the New York Times detailed Facebook’s multi-pronged campaign to “delay, deny and deflect” efforts to hold the company accountable. This is far from the first time we’ve read disturbing accounts of Facebook’s unethical behavior [See: Is Facebook a 'Bug' in Our Democracy? Part I; Part II; Part III], but this week the Times peeled back the curtain on the company’s crisis management techniques, public relations tactics, efforts to influence lawmakers, and aggressive lobbying. The peek at these practices helps explain why the social media giant has been so successful at avoiding meaningful regulation.

Delay, Deny and Deflect

Facebook has been experiencing a series of crises for years, including Russian manipulation during the 2016 elections, the Cambridge Analytica scandal, massive privacy violations, and spreading propaganda that has incited genocide. The Times’ story notes that, when faced with a crisis, Facebook executives including CEO Mark Zuckerberg and COO Sheryl Sandberg delay attempts at seeking more information or implementing effective solutions, deny harm or wrongdoing, and ultimately deflect criticism and regulation and move forward with minimal concrete changes.


Sandberg Testifying in September 2018

The pattern has been noted before. Professor Nicholas Proferes tracked the many apologies Zuckerberg has given. “The pattern I’ve observed goes like this: Acknowledge, diffuse blame, make the problem manageable, empower users, invoke personal care.”

Minimizing Russia’s Role in the 2016 Elections 

The New York Times’ story indicates Facebook ignored and downplayed the extent of Russian skulduggery. Examining Facebook's crisis management as the Russian interference came to light reveals a lot about the company.

In the spring of 2016, a Facebook expert on Russian cyberwarfare looked into Russian activity on the platform and discovered that Russian hackers appeared to be probing the Facebook accounts of people connected to the presidential campaigns. He alerted Facebook's general counsel. Months later, as Donald Trump battled Hillary Clinton in the general election, Facebook's security team also found accounts linked to Russian hackers who were messaging journalists to share information from emails stolen from Clinton and other key Democrats.

After the election, in December 2016, Zuckerberg and Sandberg internally discussed Russian activity on the platform as Zuckerberg publicly scoffed at the idea that fake news on Facebook had helped elect Donald Trump. Sandberg was angry at the employee who brought the issue to light, believing he had left the company exposed legally. 

Nonetheless, Sandberg and Zuckerberg decided to expand the inquiry into Russian influence. But other Facebook executives -- including Joel Kaplan, the vice president for global public policy -- objected, fearing that Republicans would accuse the company of siding with Democrats.

Kaplan was hired in 2011 as part of Facebook's effort to strengthen the company's ties to Republican lawmakers on Capitol Hill. He previously served as Deputy Chief of Staff for Policy under President George W. Bush. Kaplan prevailed on Sandberg to also promote Kevin Martin, a former Federal Communications Commission chairman and fellow Bush administration veteran, to lead the company’s American lobbying efforts. 

A New York Times editorial notes, “Facebook could have approached its civic duty head-on, but instead busied itself with damage control. Joel Kaplan...objected to the public dissemination of internal findings on the grounds that it would offend conservatives.”

Sandberg sided with Kaplan, though eventually Facebook produced a paper in April 2017 about “information operations” on the platform. The word “Russia” never appeared. Facebook spent the next months downplaying reports, while inside the company, employees were tracing more ads, pages, and groups back to Russia. 

In September 2017, after pressure from the company’s board, Facebook issued an “update” to its April white paper, disclosing information on Russian agents spending ad money on the platform, but ultimately did not disclose more damaging details. The Times notes:

The combined revelations infuriated Democrats, finally fracturing the political consensus that had protected Facebook and other big tech companies from Beltway interference. Republicans, already concerned that the platform was censoring conservative views, accused Facebook of fueling what they claimed were meritless conspiracy charges against Mr. Trump and Russia. Democrats, long allied with Silicon Valley on issues including immigration and gay rights, now blamed Mr. Trump’s win partly on Facebook’s tolerance for fraud and disinformation.

Facebook’s minimization of the role of Russian meddling in the 2016 election had run its course -- now, both sides of the political aisle had concerns about the company’s practices. 

“Facebook girded for battle,” the Times wrote.

Facebook's Political Influence

The next month, in October 2017, Senators Amy Klobuchar (D-MN) and Mark Warner (D-VA) unveiled the Honest Ads Act (S. 1989) to help prevent foreign interference in future elections and improve the transparency of online political advertisements. “It’s time for Facebook to let all of us see the ads bought by Russians *and paid for in Rubles* during the last election,” Sen. Klobuchar wrote on her own Facebook page.

 

But days after the bill was unveiled, Facebook hired Sen. Warner’s former chief of staff, Luke Albee, to lobby on the bill. Kaplan’s team also took a larger role in managing the company’s Washington response, routinely reviewing Facebook news releases for words or phrases that might rile conservatives.

At the same time, Sandberg reached out to Sen. Klobuchar. The two had a friendly relationship: the senator is featured on the website for Lean In, Sandberg’s empowerment initiative, and Sandberg had contributed a blurb to Sen. Klobuchar’s 2015 memoir. Sen. Klobuchar’s chief of staff had previously worked at Sandberg’s charitable foundation.

Facebook also sought help from Senate Minority Leader Chuck Schumer (D-NY), who has long worked to advance Silicon Valley’s interests on issues such as commercial drone regulations and patent reform. During the 2016 election cycle, he raised more money from Facebook employees than any other member of Congress, according to the Center for Responsive Politics. And Sen. Schumer has a personal connection to Facebook: His daughter Alison joined the firm out of college and is now a marketing manager in Facebook’s New York office. Sen. Schumer confronted Sen. Warner, by then Facebook’s most insistent inquisitor in Congress:

Back off, he told Mr. Warner, according to a Facebook employee briefed on Mr. Schumer’s intervention. Mr. Warner should be looking for ways to work with Facebook, Mr. Schumer advised, not harm it. Facebook lobbyists were kept abreast of Mr. Schumer’s efforts to protect the company, according to the employee. A Senate aide briefed on the exchange said that Mr. Schumer had not wanted Mr. Warner to lose sight of the need for Facebook to tackle problems with right-wing disinformation and election interference, as well as consumer privacy and other issues.

The effects of this kind of lobbying can be hard to trace, but there are signs that Facebook was very successful. For example, Sandberg testified on Capitol Hill in September 2018 [See: A Platform for Political Theater]. The Times wrote:

Facebook lobbyists had already worked the Intelligence Committee hard, asking that lawmakers refrain from questioning Ms. Sandberg about privacy issues, Cambridge Analytica and censorship. The argument was persuasive with [Senator] Burr, who was determined to avoid a circuslike atmosphere. A day before the hearing, he issued a stern warning to all committee members to stick to the topic of election interference....Facebook had lobbied for the hearing to include a Google emissary of similar rank to Ms. Sandberg. The company won a partial victory when Mr. Burr announced that Larry Page, a Google co-founder, had been invited, along with Jack Dorsey, Twitter’s chief executive....As the hearing unfolded, senators excoriated Google for its absence, earning a wave of negative news coverage for Facebook’s rival. 

Facebook’s Own Disinformation Campaign

In October 2017, the same month the Honest Ads Act was introduced, Facebook expanded its work with a Washington-based consultant, Definers Public Affairs, that had originally been hired to monitor press coverage of the company. 

Founded by veterans of Republican presidential politics, Definers specialized in applying political campaign tactics to corporate public relations — an approach long employed in Washington by big telecommunications firms and activist hedge fund managers.

Facebook used Definers to distribute to reporters research documents that cast the liberal donor George Soros as an unacknowledged force behind activists protesting Facebook. Definers also helped publish articles criticizing Facebook’s rivals on what was designed to look like a typical conservative news site.

“Individuals that promote anti-Semitic bile, like Definers, and the people at Facebook who hired them, threaten not just our safety, but our democracy,” Sen. Ron Wyden (D-OR) tweeted. “Facebook has not only refused to effectively crack down on hate-spewing Nazis,...it actually encouraged anti-Semitism by hiring degenerate right-wing propagandists to concoct conspiracies that tap into anti-Semitic biases.” 

Color of Change, a progressive nonprofit civil rights advocacy organization that counts George Soros among its donors, received death threats. “No, we didn’t know about Definers prior to the New York Times report,” said Color of Change's president, Rashad Robinson. Discussing the increase in threats, Robinson said, “It makes so much sense now, and it also shows me how effective and how much the pushing we have been doing and challenging and calling out Facebook and running campaigns and sitting at the table with them had gotten under Facebook’s skin.”

Laura Silber, a spokeswoman for Soros’ Open Society Foundations, said the philanthropy’s president spoke with Sandberg and requested that Facebook commission an independent review of the company’s relationship with Definers, with results made public within three months.

Definers undertook more covert efforts to spread the blame for the rise of Russian disinformation, pointing fingers at other companies like Google. A key part of Definers’ strategy was NTK Network, a website that appeared to be a run-of-the-mill news aggregator with a right-wing slant. In fact, many of NTK Network’s stories were written by employees at Definers and America Rising, a sister firm, to criticize rivals of their clients. While NTK Network does not have a large audience of its own, its content is frequently picked up by popular conservative outlets, including Breitbart.

“In other words, Facebook employed a political P.R. firm that circulated exactly the kind of pseudo-news that Facebook has, in its announcements, sought to prevent from eroding Americans’ confidence in fact versus fiction,” wrote Evan Osnos. 

On the evening of November 14, the same day the Times’ article dropped, Facebook said it had cut ties with Definers.

“I understand that a lot of D.C.-type firms might do this kind of work. When I learned about it I decided that we don’t want to be doing it,” Zuckerberg said the next day. “In general, we need to go through all of our relationships and evaluate what might be more typical D.C. relationships and decide if we want to continue with them.”

On November 15, Sandberg wrote, “I did not know we hired them or about the work they were doing, but I should have. I have great respect for George Soros — and the anti-Semitic conspiracy theories against him are abhorrent.”

Reactions from Congress

The Times’ article sparked a wave of criticism from Congress. 

“It is increasingly clear that Facebook is more concerned about covering their own hide than being transparent with Congress and its users,” tweeted Rep. Frank Pallone (D-NJ), who is set to chair the House Commerce Committee.

Senator Richard Blumenthal (D-CT) pledged that forthcoming legislation would “hold all these companies accountable” and said he wants “more transparency, more forthrightness, less manipulating behind the scenes” from Facebook in the meantime. 

“[W]e learned that when Mark Zuckerberg told the American people that Russian interference was a ‘pretty crazy idea,’ he knew this was flatly untrue,” he said. “Rather than take responsibility for a profound breach of trust, Facebook executives for months sought to withhold significant information and deflect criticism. Worse, in its evasion, Facebook hired toxic political operators that sought to mislead the public and disparage critics of the company.”

Sen. Klobuchar led three other Democratic Senators in sending a letter on Nov. 15 to the Department of Justice urging Deputy Attorney General Rod Rosenstein to expand any investigation into Facebook and Cambridge Analytica to include whether Facebook—or any other entity affiliated with or hired by Facebook—hid information and retaliated against critics or public officials seeking to regulate the platform. The letter reads, in part:

Since the 2016 election, both the government and Facebook’s own internal investigations have revealed that the company failed to adequately protect the data and trust of its 2.2 billion users. Facebook also failed to implement basic protocols to prevent manipulation by foreign adversaries working to undermine America’s political system.

Given the staggering amount of data that Facebook has collected on its users – even people who have not consented to use of the platform – these allegations raise profound concerns about the company’s willingness to protect the public and our democracy.

“Facebook cannot be trusted to regulate itself,” said Rep. David Cicilline (D-RI), likely to chair the House Judiciary Subcommittee on Antitrust. “This staggering report makes clear that Facebook executives will always put their massive profits ahead of the interests of their customers.”

Oversight and...Accountability?

Facebook’s problems are baked into the company’s business model. This alone makes self-regulation...problematic at best.

The New York Times’ editorial board noted:

Russian influence operations and viral false reports should have been anticipated byproducts of Facebook’s business model, which is based on selling advertising on the back of user engagement. In short, Facebook capitalizes on personal information to influence the behavior of its users, and then sells that influence to advertisers for a profit. It is an ecosystem ripe for manipulation.

Will Facebook, as a company, actually recognize this? Probably not. Will Zuckerberg and Sandberg continue to be in leadership positions? Probably so. 

Osnos writes:

[T]he Times story will revive questions about whether Zuckerberg and Sandberg should remain at the head of the company. But nobody involved with Facebook thinks they are at obvious risk of losing their jobs, because they maintain the support of a board of directors that some observers believe has been far too passive in the face of Facebook’s stumbles.

And, you know, Zuckerberg is the chairman. Facebook has a “dual class” share structure, essentially designed as a bulwark against management change. Zuckerberg alone is estimated to control about 60% of all voting shares.
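For a sense of how a dual-class structure turns a minority economic stake into majority voting control, here is a minimal sketch with purely hypothetical share counts (not Facebook’s actual figures); the only assumption drawn from Facebook’s reported structure is that Class A shares carry one vote per share and Class B shares carry ten.

```python
# Illustrative dual-class voting math. The share counts below are made up
# solely to show the mechanics; only the 1-vote/10-vote class split reflects
# Facebook's reported structure.

def voting_share(holdings, votes_per_class):
    """Return each holder's fraction of total votes."""
    def votes(h):
        return sum(shares * votes_per_class[cls] for cls, shares in h.items())
    total = sum(votes(h) for h in holdings.values())
    return {holder: votes(h) / total for holder, h in holdings.items()}

votes_per_class = {"A": 1, "B": 10}              # votes per share, by class
holdings = {                                     # hypothetical share counts
    "founder":       {"A": 0,             "B": 400_000_000},
    "everyone_else": {"A": 2_400_000_000, "B": 60_000_000},
}

for holder, frac in voting_share(holdings, votes_per_class).items():
    print(f"{holder}: {frac:.0%} of votes")
```

With these made-up numbers, the founder holds roughly 14 percent of the shares outstanding but about 57 percent of the votes, enough to outvote every other shareholder combined.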

“We don’t think it’s a question of whether regulation. We think it’s a question of the right regulation.” -- Sheryl Sandberg, testifying before Congress in September 2018 

But more and more members of Congress are starting to publicly recognize that maybe Facebook cannot be left to self-regulate. Senator Warner said, “It’s important for Facebook to recognize that this isn’t a public relations problem – it’s a fundamental challenge for the platform and their business model. I think it took them too long to realize that. It’s clear that Congress can’t simply trust [Facebook] to address these issues on their own.”

Yes, perhaps it is time for Congress to step in. But will they do enough? The public is skeptical. Axios reported on a recent poll that found, "In the past year, there has been a 15-point spike in the number of people who fear the federal government won’t do enough to regulate big tech companies — with 55% now sharing this concern." There was also a 14-point increase in those who feel technology has hurt democracy and free speech. 

But at least there may be more hearings. The Times' editorial continued, “the incoming House, newly in Democratic hands, should make serious oversight a priority. If the House is looking to set the agenda for the next two years, Facebook should be near the top. What ambiguities remain about what Facebook knew and when are prime subjects for hearings.”

There may even be some bipartisan support for such oversight. Senator Ben Sasse (R-NE) expressed the seriousness of the issue:

Instead of turning this into another lazy debate about the left, the right, and the 2016 election, Silicon Valley and Washington should be working to combat the very real threat that information operations can pour gasoline on nearly every culture war that divides the American people. Facebook needs to stop treating this like a PR crisis and Washington needs to stop treating this like a partisan opportunity — this is a real national security threat.

Conclusion

Facebook has not been forthcoming about the scope and impact of Russian meddling in the 2016 election. The company uses political influence to block legislation and uses disinformation campaigns to attack critics and deflect blame.

The pattern of crisis management indicates Facebook will not address its issues in full on its own. 

Congressional oversight is a possibility, though advocates representing the public are certainly outmatched by Facebook’s lobbying army. 

Reining in the power of a company that has been polluting our democratic discourse does not look easy.

“While Facebook continues to try to minimize the fallout from the latest revelations, we need to begin a deeper examination of an intrinsic bug burrowed into our social media platforms,” I wrote back in April. If anything, this is only more true today.

Facebook’s business model is inherently at odds with the public interest and a healthy democratic discourse. And now we know some of the ugly details of how the company has avoided accountability. Short-term policy solutions can help curb some of Facebook’s harmful effects, but the larger task before policymakers -- and all of us -- is to critically examine the long-term health of our democratic discourse.

Happy Thanksgiving, though! The Weekly will return next week. 
 


Quick Bits

Weekend Reads (resist tl;dr)

ICYMI from Benton

November 2018 Events 

Nov 27 -- New Debates and Tensions in Antitrust: What Does the Future Hold?, Georgetown Center for Business and Public Policy

Nov 27 -- How Encryption Saves Lives and Fuels our Economy, New America

Nov 27 -- Oversight of the Federal Trade Commission, Senate Commerce Committee

Nov 30 -- Forum on AI and Machine Learning, FCC

Benton, a non-profit, operating foundation, believes that communication policy - rooted in the values of access, equity, and diversity - has the power to deliver new opportunities and strengthen communities to bridge our divides. Our goal is to bring open, affordable, high-capacity broadband to all people in the U.S. to ensure a thriving democracy.


© Benton Foundation 2018. Redistribution of this email publication - both internally and externally - is encouraged if it includes this copyright statement.


For subscribe/unsubscribe info, please email headlinesATbentonDOTorg


Kevin Taglang
Executive Editor, Communications-related Headlines
Benton Foundation
727 Chicago Avenue
Evanston, IL 60202
847-328-3049
headlines AT benton DOT org


PUBLIC INTEREST VOICES FOR THE DIGITAL AGE


By Robbie McBeath.