Deepa Seetharaman
Facebook to Start Taking Down Posts That Could Lead to Violence
Facebook will start removing misinformation that could spark violence, a response to mounting criticism that the flow of rumors on its platform has led to physical harm to people in countries around the world. Facebook will rely on local organizations of its choosing to decide whether specific posts contain false information and could lead to physical violence, company officials said. If both criteria are met, the posts will be taken down.
Facebook’s Latest Problem: It Can’t Track Where Much of the Data Went
Facebook’s internal probe into potential misuse of user data is hitting fundamental roadblocks: The company can’t track where much of the data went after it left the platform or figure out where it is now.
Facebook Gave Some Companies Special Access to Additional Data About Users’ Friends
Facebook struck customized data-sharing deals with a select group of companies, some of which had special access to user records well after the point in 2015 when the social-media giant has said it cut off all developers from that information, according to court documents. The previously unreported agreements, known internally as “whitelists,” also allowed certain companies to access additional information about a user’s Facebook friends.
Release of Thousands of Russia-Linked Facebook Ads Shows How Propaganda Sharpened
Democrats on the House Intelligence Committee made public for the first time the full cache of more than 3,000 ads that Facebook said were purchased by a pro-Kremlin group, the Internet Research Agency. The ads, fewer than 50 of which had previously been revealed, offer the clearest window yet into the evolving tactics used by the group as it sought to amplify social and political tensions in the US. The Russian-backed pages initially deployed relatively simple techniques, buying ads targeted to reach large segments, such as all Facebook users living in the US.
Facebook Limiting Information Shared With Data Brokers
Facebook is curbing the information that it exchanges with companies that collect and sell consumer data for advertisers. The measures affect so-called data brokers such as Acxiom Corp. and Oracle Data Cloud, formerly known as DataLogix, which gather shopping and other information on consumers that Facebook has for years incorporated into the ad-targeting system at the core of its business.
Facebook to Rank News Sources by Quality to Battle Misinformation
Facebook plans to start ranking news sources in its feed based on user evaluations of credibility, a major step in its effort to fight false and sensationalist information that will also push the company further into a role it has long sought to avoid—content referee. The social-media giant will begin testing the effort next week by prioritizing news reports in its news feed from publications that users have rated in Facebook surveys as trustworthy, executives said Jan 19.
Tech Giants Disclose Russian Activity on Eve of Congressional Appearance
Facebook, Google and Twitter are set to divulge new details showing that the scope of Russian-backed manipulation on their platforms before and after the US presidential election was far greater than previously disclosed, reaching an estimated 126 million people on Facebook alone, according to people familiar with the matter, prepared copies of their testimony and a company statement. Facebook estimates that 470 Russian-backed accounts connected to a single pro-Kremlin firm, the Internet Research Agency, churned out 80,000 posts on Facebook between January 2015 and August 2017.
After Posting of Violent Videos, Facebook Will Add 3,000 Content Monitors
Facebook said it would hire 3,000 more staffers to review content in an attempt to curb violent or sensitive videos on its site without scaling back its live-streaming tools. The planned hires, announced by Chief Executive Mark Zuckerberg, are a response to the posting on Facebook of violent videos, such as one in April showing a Cleveland (OH) man fatally shooting another man.
Zuckerberg’s proposed fix, which would increase Facebook’s roster of 4,500 reviewers by two-thirds over the next year, addresses the amount of time it takes Facebook to remove graphic content, as opposed to preventing its site from being used to display such content. The Cleveland video was up for roughly two hours; a video from Thailand, in which a man broadcast the killing of his young daughter, stayed up for 24 hours.
Facebook, Rushing Into Live Video, Wasn’t Ready for Its Dark Side
On orders from Mark Zuckerberg, more than 100 employees at Facebook were put into what the company calls “lockdown” when they showed up for work one Thursday early in 2016. They had been plucked from other projects to focus on the chief executive’s top priority—making it possible for more than a billion Facebook users to stream video live. Zuckerberg had made a snap decision near the end of a product meeting in his glass-walled office in Menlo Park (CA): the team would work around the clock to roll out Facebook Live, a push that took just two months. “This is a big shift in how we communicate, and it’s going to create new opportunities for people to come together,” he wrote in a Facebook post during the world-wide launch in April 2016.
At traditional companies, major product launches often take years. Technology firms, and Facebook in particular, emphasize speed even though they know it means there will be problems to iron out later. And there were problems. The live-video rush left unanswered many questions with which Facebook is still wrestling, especially how to decide when violence on camera needs to be censored. According to a tally by The Wall Street Journal, people have used Facebook Live to broadcast at least 50 acts of violence, including murders, suicides and the beating in January of a mentally disabled teenager in Chicago.
Facebook Moves to Curtail Fake News on ‘Trending’ Feature
Facebook is overhauling its “trending topics” box, part of its effort to curb fake news and expose users to a broader range of information. Starting Jan 25, Facebook’s software will surface only topics that have been covered by a significant number of credible publishers, a move designed to cut back on hoaxes by giving more weight to information sources that have been around longer. What’s more, the topics will no longer be personalized for each Facebook user, a change that could puncture users’ so-called filter bubbles and expose them to a wider variety of news sources and events.
Facebook has changed its trending feature several times since last spring, after allegations that the contract workers who selected its headlines had altered what appeared for political reasons. In August, Facebook fired the contract workers and opted for a largely software-driven approach. That change, which did away with headlines in favor of hashtags and keywords, quickly led to the appearance of false stories in the box. After the November US presidential election, employees and outsiders criticized Facebook’s laissez-faire attitude toward fake news and its role in creating and reinforcing echo chambers, in which like-minded users share and read articles that confirm their beliefs.