Facebook denies bias in Trending Topics, but promises changes nevertheless

May 23, 2016
Facebook on Monday said its investigation had found no evidence of ‘systematic political bias’ in the selection or prominence of stories included in its Trending Topics feature, but promised to change its processes anyway to minimize the risk of errors of judgment by the individual reviewers who play an important role in selecting the stories.

The move by the social networking company was outlined in a response to a letter from Senator John Thune, chairman of the U.S. Senate Commerce Committee, sent in the wake of a report in the tech blog Gizmodo that cited anonymous sources as saying Facebook was holding back stories with conservative views from Trending Topics and instead injecting stories that weren’t as popular. The company had decided to investigate the matter internally.

"At the same time, as you would expect with an inquiry of this nature, our investigation could not exclude the possibility of isolated improper actions or unintentional bias in the implementation of our guidelines or policies," Facebook’s General Counsel Colin Stretch said in a letter to Thune that explains the prominent role human intervention plays in the selection of topics and stories.

"We have rigorous guidelines that do not permit the prioritization of one viewpoint over another or the suppression of political perspectives," CEO Mark Zuckerberg had said earlier this month after the controversy broke.

The company plans to put more controls and oversight in place around the review team, including “robust escalation procedures,” Facebook said in the letter to Thune. The social network also said it would eliminate its reliance on external websites and news outlets to identify, validate, or assess the importance of trending topics.

Launched in January 2014, Trending Topics appears in the upper right corner of the Facebook website. It helps people discover topics that are both popular and meaningful to them and is separate from the News Feed, which Facebook describes as its central component and the primary information distribution channel with which users engage.

The algorithm behind Trending Topics detects unusual increases in the number of posts about a particular subject over time. Such topics are added to a review queue of potentially trending topics, where human reviewers, contractors from Accenture, weed out about half of the topics the algorithm recommends, Stretch wrote to Thune.
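
Stretch’s letter does not spell out how the spike detection works, but a minimal sketch of that kind of anomaly check, assuming a simple z-score over a topic’s recent per-interval post counts, might look like the following Python. The class name, threshold, and window size are illustrative assumptions, not Facebook’s actual parameters.

from collections import deque
from statistics import mean, stdev

# Illustrative parameters, not Facebook's actual values.
SPIKE_THRESHOLD = 3.0   # standard deviations above a topic's historical mean
HISTORY_WINDOW = 24     # number of past intervals kept per topic

class TrendCandidateDetector:
    """Hypothetical spike detector: flags topics whose recent post volume is
    unusually high relative to their own history and queues them for review."""

    def __init__(self):
        self.history = {}       # topic -> deque of per-interval post counts
        self.review_queue = []  # candidate topics awaiting human review

    def observe(self, topic: str, post_count: int) -> None:
        counts = self.history.setdefault(topic, deque(maxlen=HISTORY_WINDOW))
        if len(counts) >= 2:
            baseline, spread = mean(counts), stdev(counts)
            if spread > 0 and (post_count - baseline) / spread > SPIKE_THRESHOLD:
                self.review_queue.append(topic)
        counts.append(post_count)

# Example: steady volume followed by a sudden jump puts the topic in the queue.
detector = TrendCandidateDetector()
for count in [10, 12, 11, 9, 13, 10, 80]:
    detector.observe("election", count)
print(detector.review_queue)  # ['election']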

“We currently use people to bridge the gap between what an algorithm can do today and what we hope it will be able to do in the future—to sort the meaningful trends from gibberish and duplicates, and to write headlines and descriptions in clear, natural-sounding language,” Stretch wrote.

The corrective procedures available to reviewers include boosting the prominence of a topic and topic injection, in which a reviewer may use a tool to fix a topic that the algorithm has incorrectly identified.

“Facebook’s description of the methodology it uses for determining the trending content it highlights for users is far different from and more detailed than what it offered prior to our questions. We now know the system relied on human judgment, and not just an automated process, more than previously acknowledged,” Thune said in a statement.

Thune thanked Facebook for its efforts to acknowledge the relevant facts and for recognizing the continuing need to address users’ questions transparently.

Facebook has promised, among other measures, refresher training for reviewers that will emphasize that content decisions should not be made on the basis of politics or ideology. It will also revise the terminology in its guidelines to make them clearer.

John Ribeiro
