Much has been written about how ideologically slanted news outlets like Fox News and MSNBC have contributed to the polarization of American politics. There is indeed some evidence to suggest that newscasts and talk shows that continually reinforce one’s pre-existing opinions will produce citizens who are more closed-minded and less willing to compromise.

Now, newly published research suggests an unexpected countervailing source has emerged: Facebook.

Increasingly, people are getting news and information from the social network’s news feed, which includes stories that a friend has deemed important enough to share, ranked by a sophisticated algorithm that places those you are most likely to find interesting at the top of your queue.

One would think this could result in the creation of little echo chambers, in which users only click on ideologically friendly stories, in the process bumping similar pieces up the chain where they’re more likely to be noticed. But at least according to Facebook’s own researchers, this is not happening.

Writing in the journal Science, Facebook’s Eytan Bakshy and the University of Michigan’s Solomon Messing and Lada Adamic describe a large-scale study that looks at how people use the popular social network. More specifically, they analyzed what political content these users were exposed to, and what they chose to click on and actually read.

“Rather than people browsing only ideologically aligned news sources, or opting out of hard news altogether, our work shows that social media exposes individuals to at least some ideologically cross-cutting viewpoints,” the researchers write.

They note that, even among strong political partisans, many online friendships “cut across ideological affiliations,” and the links they share offer an opportunity to check out alternative ways of thinking.

Whether people take advantage of that opportunity is, of course, a totally different issue.

The study examined Facebook usage over a six-month period, from July 7, 2014, to January 7, 2015, focusing on 10.1 million Americans age 18 or older, all of whom logged in at least four days per week and reported their political ideology.

The researchers examined three factors that influence exposure to political information: the news and opinion items shared by users’ friends; how those stories were ranked by Facebook’s algorithm; and whether users clicked on pieces that challenged their ideological preconceptions.

Not surprisingly, they found “substantial polarization among hard (news) content shared by users, with the most frequently shared links clearly aligned with largely liberal or conservative populations.”

“Liberals tend to be connected to fewer friends who share information from the other side, compared to their conservative counterparts,” the researchers write. “Twenty-four percent of the hard content shared by liberals’ friends (reflects the opposing viewpoint), compared to 35 percent for conservatives.”

That said, the researchers also found that “on average, more than 20 percent of an individual’s friends who report an ideological affiliation are from the opposing party, leaving substantial room for exposure to opposing viewpoints.”

This finding was confirmed by another new study, which looked at social media discussions of climate change. “Most users interact only with like-minded others, in communities dominated by a single view,” writes a research team led by Hywel Williams of the University of Exeter. “However, we also find mixed-attitude communities in which skeptics and activists frequently interact.”

Bakshy and his colleagues found that Facebook’s algorithmic ranking system (which favors pieces similar to those a user has previously clicked on) produces “somewhat less cross-cutting content”: about eight percent less than liberals would otherwise see, and five percent less for conservatives.

But they found that individual choice “has a larger role in limiting exposure” to alternative viewpoints than algorithmic ranking does. On average, the ranking algorithm produced a one percentage point decrease in the proportion of stories that challenged users’ beliefs, while individuals’ own choices about what to click on resulted in a four percent drop.

We are, in effect, our own filters.

Nevertheless, as the researchers note, glancing at a summary of a piece written from a different point of view—even if you choose not to actually read it—provides evidence that not everyone thinks the way you do. There is surely some benefit to getting such regular reminders.

“Our work,” the researchers conclude, “suggests that the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.”

To update an old proverb: You can lead a partisan out of the echo chamber, but you can’t make them click.

Originally published in Findings, a daily column by Pacific Standard staff writer Tom Jacobs, who scours the psychological-research journals to discover new insights into human behavior, ranging from the origins of our political beliefs to the cultivation of creativity.
