Facebook study disputes the 'echo chamber' effect

May 8, 2015
On Facebook, you might see a fair number of news articles shared by friends supporting Republican presidential candidates as the race heats up. Even if you're a Democrat.

The site's News Feed ranking, which can control the posts users see based on their personal information and activity on the site, has produced what some have called a "filter bubble" or "echo chamber": a homogeneous stream composed primarily of like-minded posts, leaving little room for other points of view. Questions about the diversity of content in News Feed are not going away, given that many people now get their news on Facebook.

But it may not be as bad as it seems. A new study by data scientists at Facebook, published Thursday in the journal Science, finds that while much of the content people see on Facebook is aligned with their own ideology, a fair amount of it represents opposing viewpoints.

The results, Facebook says, quantify for the first time the extent to which people are exposed to ideologically diverse news and information in social media.

Nearly 30 percent of all the news content that people see in the News Feed cuts across ideological lines, Facebook said, meaning it was shared by users who identified themselves on Facebook as conservative but seen by users who identified themselves as liberal, or vice versa.

Counting just the content shared by people's friends reveals about the same percentage that cuts across ideological lines, the study said.

Facebook's algorithm that ranks posts in the News Feed is designed to surface content aligned with what each user is interested in, based on their activity. The study's results help show how much Facebook users actually engage with content that reflects different points of view.

Nearly 25 percent of the news articles that people click on cut across ideological lines, Facebook said.

For the study, Facebook researchers developed a system that identified more than 226,000 news articles shared at least 100 times during the second half of 2014. The company wanted to see how often liberal and conservative audiences were exposed to stories about politics, world affairs and the economy that Facebook had designated as either conservative or liberal content.
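By this definition, the cross-cutting measure reduces to simple arithmetic: count the exposures where an article's ideological label differs from the viewer's self-reported affiliation. The Python sketch below illustrates that calculation; the record layout and field names are hypothetical illustrations, not Facebook's actual data model or the paper's code.

```python
# Illustrative sketch of the cross-cutting exposure measure described above.
# The records and labels are hypothetical; Facebook's pipeline is not public
# beyond what the Science paper reports.

# Each exposure pairs an article's ideological label (inferred from who
# shared it) with the self-reported affiliation of the user who saw it.
exposures = [
    {"article_alignment": "conservative", "viewer_affiliation": "liberal"},
    {"article_alignment": "liberal", "viewer_affiliation": "liberal"},
    {"article_alignment": "liberal", "viewer_affiliation": "conservative"},
    {"article_alignment": "conservative", "viewer_affiliation": "conservative"},
]

def cross_cutting_share(exposures):
    """Fraction of exposures where the article's alignment differs
    from the viewer's self-identified affiliation."""
    cross = sum(
        1 for e in exposures
        if e["article_alignment"] != e["viewer_affiliation"]
    )
    return cross / len(exposures)

print(f"{cross_cutting_share(exposures):.0%} of exposures cut across ideological lines")
```

On this toy data the share is 50 percent; the study reported a figure of nearly 30 percent across the News Feed content it examined.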

The study has its limitations. It looked only at Facebook users who identified themselves as conservative or liberal in some way, a group that makes up less than 1 percent of the company's total user base. It also did not look at whether the shared articles changed people's political views or habits.

The effect of algorithms like Facebook's on what people see online has generated controversy lately.

The Federal Trade Commission is looking at the issue of "algorithmic transparency," to assess the deeper incentives behind algorithms on sites like Google and Facebook, and how they affect people.

Zach Miners covers social networking, search and general technology news for IDG News Service. Follow Zach on Twitter at @zachminers. Zach's e-mail address is zach_miners@idg.com
