Is Your Facebook Feed an Echo Chamber?



Facebook recently published research examining how much its users could be, and actually are, exposed to ideologically diverse news and information on social media. People are increasingly turning to social media for news, which raises a question: is it your selection of friends, or Facebook's ranking algorithm, that has the most influence on what you see in your News Feed?

The Facebook researchers studied users who self-identify as either liberal or conservative, a group that turns out to be just 9% of Facebook users in the United States.

The researchers wanted to measure how much people were being exposed to “hard news” (articles about politics, world affairs, and the economy) rather than “soft news” (stories about entertainment, celebrities, and sports). They also wanted to know whether each article was aligned primarily with a liberal or a conservative audience.

The researchers found that, on average, 23% of a person's friends claim an opposing political ideology. Of the hard news content those friends share, 29% cuts across ideological lines; 28.9% of the hard news that users actually saw in their News Feed was cross-cutting; and 24.9% of the hard news they clicked on cut across ideological lines.
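
To make that funnel concrete, here is a minimal Python sketch that computes the percentage-point drop in cross-cutting content at each stage, from what friends share, to what the News Feed ranking displays, to what users click. The stage names and percentages are taken from the figures quoted above; everything else is illustrative, not from the study itself:

```python
# Share of "hard news" that cuts across ideological lines at each
# stage of the pipeline, using the percentages quoted above.
stages = [
    ("shared by friends", 29.0),   # what friends post
    ("shown in News Feed", 28.9),  # after algorithmic ranking
    ("actually clicked", 24.9),    # after the user's own choices
]

# Compare each stage to the one before it and report the drop.
for (prev_name, prev_pct), (name, pct) in zip(stages, stages[1:]):
    drop = prev_pct - pct
    print(f"{prev_name} -> {name}: {prev_pct}% -> {pct}% "
          f"(drop of {drop:.1f} points)")
```

Run as written, the sketch shows the ranking step shaving roughly 0.1 percentage points off cross-cutting exposure, while users' own clicking behavior accounts for a drop of roughly 4 points, which is the arithmetic behind the study's claim that individual choice matters more than the algorithm.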

What does all this mean? Facebook says that the composition of a person's social network is the most important factor shaping the mix of content they encounter on social media, and that individual choice (what people click on) also plays a large role. By comparison, Facebook says the News Feed ranking has a smaller impact on how much content from the opposing ideological viewpoint a person sees than does who they have selected as friends.

In other words, Facebook says that the friends you choose influence what you see on Facebook more than the News Feed algorithm does. You could be creating an echo chamber, intentionally or unwittingly, by friending only people who match your ideological viewpoint.

On the other hand, there's an interesting article on Medium that takes a critical look at Facebook's study. Eli Pariser points out that the research covered only the 9% of Facebook users who self-identify as liberal or conservative (a small slice of the overall user base), and that those users may behave differently on Facebook than people who don't label themselves politically. He also notes that because the study was conducted by Facebook's own scientists on Facebook's own data, it is not reproducible by outside researchers, at least not without Facebook's permission.