Facebook ‘Trending’ List Skewed by Individual Judgment, Not Institutional Bias

SAN FRANCISCO — Last July, in a seventh-floor conference room at Facebook’s Lower Manhattan offices, a small group met to discuss the future of news media on the social network.

Facebook leaders were bullish on a relatively new section of the site that surfaced the most popular news stories, like news of the terrorist attacks on the Charlie Hebdo newspaper in Paris or stories about Chris Hemsworth’s genitalia. They decided the effort, called Trending Topics and until then a pilot operation run by a dozen or so staff members, should more than double, to over 30 people. One goal for the team: use human judgment to make algorithms better at finding news on Facebook.

“We asked, do we consider ourselves Facebook journalists?” said Benjamin Fearnow, a former news curator at Facebook who worked on Trending Topics for close to a year, until April, and who attended the meeting. “We straddled that very thin line between social media and news. None of us really knew how it was going to play out.”

Nearly a year later, what began as a tiny experiment for Facebook has swollen into a national headache. The Silicon Valley company faces allegations of intentionally suppressing conservative news from appearing on Trending Topics. In a rough-and-tumble presidential election year in which social media is playing an increasingly large role, some Republican leaders say they have lost trust in Facebook’s ability to maintain impartiality as a communication and news platform.

An examination of Trending Topics, based on interviews with current and former Facebook employees and a demonstration of the curator tools, found that Facebook’s employees were not directed to squelch conservative news on the site, nor would that be easily accomplished by a staff member who wished to do so.

Instead, these people said, Trending Topics was a fledgling, ill-managed group — made up largely of recent college graduates with little work experience — where individual judgment of news was encouraged. That encouragement led to inconsistencies in how the most popular stories were presented and to departures from the team, and it eventually landed the group in a controversy that has spotlighted Facebook’s huge role in the type of information people see every day.

“We’ve reflected a lot,” said Justin Osofsky, Facebook’s vice president for global operations and media partnerships, which includes Trending Topics. He said Facebook had “identified ways in which we should have improved.”

Trending Topics, introduced in a handful of countries in January 2014 with a small staff based in New York, was Facebook’s first major attempt to comb through the avalanche of information being posted on the social network and to make it easier for people to find current events — such as the pope’s visit to the United States and anything involving the Kardashians — and to read and talk about them on Facebook.

It was a shot at competitors like Google and Twitter, according to two former news curators who spoke on condition of anonymity because they had signed nondisclosure agreements. Facebook wanted people to search for more content — like news — on its own site instead of on Google, the search king, or Twitter, which was widely regarded as better for real-time news, they said.

There was one big problem: Facebook’s trending algorithms, which identify the most-talked-about terms, were not very good at discerning what was and was not news. Left to their own devices, roughly 40 percent of what Facebook’s algorithms dug up would be junk or “noise,” a result of many people using the same word at the same time across the network. The algorithm might pick up a sharp rise in the word “Skittles” and deem it a trending topic — not exactly the events Facebook had in mind.
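To make the idea concrete, here is a minimal sketch of that kind of spike detection, assuming a simple ratio test of current mention counts against a recent baseline; the function name, threshold and minimum-volume cutoff are illustrative assumptions, not Facebook’s actual parameters:

```python
from collections import Counter

def find_spiking_terms(current, baseline, ratio=5.0, min_mentions=1000):
    """Flag terms whose mention count in the current window far exceeds
    their historical baseline. Many hits will be noise, which is why
    human curators reviewed the output."""
    spiking = []
    for term, count in current.items():
        if count < min_mentions:
            continue  # ignore low-volume chatter
        lift = count / max(baseline.get(term, 0), 1)  # avoid division by zero
        if lift >= ratio:
            spiking.append((term, lift))
    return sorted(spiking, key=lambda pair: pair[1], reverse=True)

# "skittles" spikes because many people happen to use the word at once,
# even though no news event is behind it.
current = Counter({"charlie hebdo": 90000, "skittles": 12000, "monday": 50000})
baseline = Counter({"charlie hebdo": 500, "skittles": 800, "monday": 45000})
print(find_spiking_terms(current, baseline))
# [('charlie hebdo', 180.0), ('skittles', 15.0)]
```

A term like “skittles” clears a ratio test like this whenever many people use the word at the same time, which is why so much of what surfaced was noise and why human review followed.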

That is where humans came in. Facebook enlisted a set of 20-somethings as curators, copy editors and team leads, charged with sifting through the material the algorithms unearthed. They were crucial, they were told, to improving Facebook’s ability to discern, over time, what constitutes news.

“Even if you want to have computers do everything, for technical reasons, resource limitations and product positioning, you may want humans to oversee the algorithms,” Jonathan Koren, a former Facebook employee who worked on algorithmic ranking for Trending Topics, wrote in a LinkedIn post this week.

Mr. Fearnow, who was terminated from Facebook in April for breaking his nondisclosure agreement, said his job as a curator was to “massage the algorithm.” Managers were wary of letting staff members identify themselves as curators or editors on their LinkedIn profiles, he said, given concerns that outsiders would notice the element of human judgment and ask questions about it.

Facebook declined to comment, citing employee confidentiality.

During each eight-hour shift, curators were presented with a continuously refreshing list of trending terms to sort through, identifying each as junk or relevant and drafting descriptions for the genuine topics. After labeling a topic and checking whether it had been independently reported by a number of major news outlets, curators assigned it a value that made it more or less likely to show up on individual users’ pages. Each user saw a different, personalized list of topics based on their past actions on Facebook.
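As a rough illustration of that workflow, the sketch below models one reviewed topic as a data record; the field names, the outlet list and the corroboration check are assumptions for illustration, not Facebook’s internal schema:

```python
from dataclasses import dataclass, field

# Illustrative stand-in for "a number of major news outlets."
MAJOR_OUTLETS = {"nytimes.com", "bbc.com", "washingtonpost.com", "apnews.com"}

@dataclass
class ReviewedTopic:
    term: str                 # the trending term the algorithm surfaced
    description: str          # short summary drafted by the curator
    is_junk: bool             # the curator's junk-vs-relevant call
    sources: set = field(default_factory=set)
    importance: float = 0.0   # value nudging the topic up or down in users' lists

    def independently_reported(self, minimum=2):
        """True if enough major outlets covered the story independently."""
        return len(self.sources & MAJOR_OUTLETS) >= minimum

topic = ReviewedTopic(
    term="Pope Visit",
    description="Pope Francis arrives in the United States.",
    is_junk=False,
    sources={"nytimes.com", "apnews.com"},
)
if not topic.is_junk and topic.independently_reported():
    topic.importance = 1.0  # more likely to surface in personalized lists
```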

In Facebook’s editorial guidelines, curators were also told to “blacklist,” or push aside, junk topics, removing them from the queue for eight to 24 hours before they could potentially appear again, according to current and former employees. When duplicate or confusing topics arose, curators were told to “inject” a more accurate topic term. Copy editors and team leads also oversaw and approved the choices being made.
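Here is a minimal sketch of those two operations, based only on the guidelines as described above; the class, method names and timing logic are illustrative assumptions:

```python
import time

class CuratorQueue:
    """Toy model of the 'blacklist' and 'inject' operations described
    in the editorial guidelines."""

    def __init__(self):
        self.blacklisted_until = {}  # term -> unix time when it may reappear

    def blacklist(self, term, hours=8):
        """Push a junk term aside for eight to 24 hours."""
        assert 8 <= hours <= 24, "guidelines specified an 8-to-24-hour window"
        self.blacklisted_until[term] = time.time() + hours * 3600

    def is_suppressed(self, term):
        return time.time() < self.blacklisted_until.get(term, 0)

    def inject(self, confusing_term, corrected_term, queue):
        """Replace a duplicate or confusing term with a more accurate one."""
        return [corrected_term if t == confusing_term else t for t in queue]

q = CuratorQueue()
q.blacklist("skittles", hours=12)           # noise, not news
print(q.is_suppressed("skittles"))          # True for the next 12 hours
print(q.inject("Pope", "Pope Visit", ["Pope", "NBA Finals"]))
```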

The work was monotonous — and not entirely gratifying. Workers were incentivized to compete against one another to clear the most trends from their queue, former employees said. Top performers were given “points” that could be spent on Facebook paraphernalia like T-shirts.

Staff turnover was high, they said. Most on the team were contractors, though some were full-time employees. That led to a sense of inequality between the two groups, some said. Starting annual salaries for contractors were in the mid-to-high $50,000s, according to former staff members.

The disgruntlement may have led to leaks to news media. On May 9, the news site Gizmodo quoted anonymous sources who claimed to be former Facebook journalists, alleging that curators “routinely suppressed news stories of interest to conservative readers” in Trending Topics.

Facebook denied the allegations. In a demonstration of its internal tools this week at its headquarters in Menlo Park, Calif., the company showed multiple layers of staff scrutiny that would make it extremely difficult for a worker to manipulate the system intentionally without others noticing, and it said such manipulation was unlikely to be technically possible in any case.

Former employees agreed. “It’s hard to see any kind of intentional, outright bias,” Mr. Fearnow said. “There were an amazing amount of steps taken to avoid that.” He said he regularly wrote headlines and descriptions for topics involving far-right conservative sites and personalities that surfaced on Trending Topics.

Still, current and former Facebook employees said the personal judgment of curators was integral. Unconscious bias, they said, is tough to prevent. Facebook’s 28-page editorial guidelines were also constantly being revised, former curators said, and engineers would tweak the product with little or no notice, making it difficult to keep up.

The reaction to the Gizmodo report has been fiery. Senator John Thune, Republican of South Dakota, has demanded answers about how Trending Topics works. A group of 16 conservatives, including Glenn Beck and Brent Bozell, traveled to Facebook for a fence-mending meeting with top executives, including the chief executive, Mark Zuckerberg. Some conservative sites, like Breitbart, are calling for Mr. Zuckerberg to further explain himself.

Facebook said it was improving employee training to make its guidelines for news curators clearer, including renaming some of the more nefarious-sounding terms in those guidelines, changing “blacklist” to “revisit” and “injection” to “topic correction.” The company continues to investigate whether any curators intentionally tried to suppress conservative news.

Executives said that for an experimental product, they got a lot wrong.

“This is a relatively early product,” Mr. Osofsky said. “There are aspects of it that clearly we’re continuing to work on.”


Source: The New York Times