Confirmed: Echo chambers exist on social media. So what do we do about them?
Three scholars have confirmed what we already knew about social media — or at least had long suspected. In a draft paper called "Echo Chambers on Facebook," social scientists Walter Quattrociocchi, Antonio Scala and Cass Sunstein found quantitative evidence of how users tend to promote their favorite narratives, form polarized groups and resist information that doesn't conform to their beliefs. The study focused on how Facebook users interacted with two kinds of narratives: conspiracy theories and science. Users belonging to different communities tended not to interact and tended to be connected only with "like-minded" friends, creating closed, non-interacting communities centered around different narratives — what the researchers called "echo chambers." Confirmation bias accounted for users' decisions to share certain content, creating informational cascades within their communities. Users tended to seek out information that strengthened their preferred narratives and to reject information that undermined them.
Alarmingly, when deliberately false information was introduced into these echo chambers, it was absorbed and viewed as credible as long as it conformed to the primary narrative. And even when more truthful information was introduced to correct or "debunk" falsehoods, it was either ignored or it reinforced the users' false beliefs. While the findings are cause for concern, they don't come as much of a surprise — confirmation bias is nothing new, and conspiracy theories have become an increasingly visible part of our political discussion. The question is whether there is anything a responsible media can or should do differently to make it easier for facts to penetrate these echo chambers, and whether news organizations are willing to make the necessary changes.