Facebook Use Polarizing? Site Begs to Differ
For years, political scientists and social theorists have fretted about the Internet's potential to flatten and polarize democratic discourse. Because so much information now comes through digital engines shaped by our preferences (Facebook, Google and others suggest content based on what consumers previously enjoyed), scholars have theorized that people are building an online echo chamber of their own views.

But in a peer-reviewed study published in the journal Science, data scientists at Facebook report that the echo chamber is not as insular as many might fear, at least not on the social network. While independent researchers said the study was important for its scope and size, they noted several significant limitations.
After analyzing how 10.1 million of the most partisan American users on Facebook navigated the site over a six-month period in 2014, researchers found that people's networks of friends and the stories they see are in fact skewed toward their ideological preferences. But that effect is more limited than the worst case some theorists had predicted, in which people would see almost no information from the other side.

Facebook's findings run counter to a longstanding worry about the potential of digital filtering systems to shape our world. For Facebook, the focus is the algorithm the company uses to decide which posts people see, and which they do not, in its News Feed.

The study also raised, but did not answer, the question of what happens after people click on an article with an opposing view: Are they persuaded by its arguments, or do they dismiss it out of hand? A click, in other words, is not necessarily an endorsement, or even a sign of an open mind.