Facebook Outreach Tool Ignores Black Lives Matter

A targeting system to find “preferred audiences” for posts, news stories, and videos won’t help publishers and groups connect to the well-known protest movement.

Penn State student Zaniya Joe wears a piece of tape over her mouth that says "Black Lives Matter" during a Ferguson protest organized by a group of Penn State University students on Tuesday, Dec. 2, 2014, in University Park, Pa. Photo: Nabil K. Mark/Centre Daily Times/TNS via Getty Images

When Facebook launched a new system in January to help news outlets and other groups target posts to particular audiences, a representative of the New York Times said it had the potential to kindle “vibrant discussions” within “niche Facebook communities” that might otherwise get lost amid the social network’s 1.6 billion users. And indeed, in the system’s first several months, software algorithms have generated hundreds of thousands of special tags for connecting to even the most obscure groups, including 7,800 Facebook users who are interested in “Water motorsports at the 1908 Summer Olympics.”

But there’s one set of people who can’t be reached via Facebook’s system: those interested in Black Lives Matter, the nationwide grassroots movement protesting police violence against black people.

Facebook’s targeting mechanism, designed to route articles and videos from Facebook Pages into users’ news feeds, will gladly help reach other protest communities. For example, publishers can target people interested in the conservative Tea Party or in its largely forgotten liberal response, the “Coffee Party,” as well as those enthusiastic about the “Fight for $15” labor movement and the “Boycott, Divestment and Sanctions (BDS) Movement” around Israeli mistreatment of Palestinians, all by attaching selectors known as Preferred Audience interest tags to their posts.
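For readers unfamiliar with the mechanics, the sketch below shows roughly how a publisher’s social media tooling might attach one of these interest tags when posting a story from a Facebook Page through the Graph API. It is illustrative only: the Page ID, access token, and interest-tag ID are placeholders, and the field names reflect the API’s documented Page-post targeting at the time, not anything confirmed in this reporting.

    import json
    import requests

    PAGE_ID = "1234567890"           # hypothetical Page ID
    PAGE_TOKEN = "EAAB..."           # hypothetical Page access token
    INTEREST_ID = 6003123456789      # hypothetical interest-tag ID

    response = requests.post(
        "https://graph.facebook.com/v2.5/%s/feed" % PAGE_ID,
        data={
            "message": "New story on the Fight for $15 movement",
            "link": "https://example.com/story",
            # feed_targeting narrows who sees the post in News Feed
            # without restricting who can view it on the Page itself.
            "feed_targeting": json.dumps({"interests": [INTEREST_ID]}),
            "access_token": PAGE_TOKEN,
        },
    )
    response.raise_for_status()
    print(response.json())  # e.g. {"id": "<page_id>_<post_id>"}

The key point is the last mile: a tag that doesn’t exist in Facebook’s generated list simply can’t be supplied here, no matter how large the audience it describes.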

That there’s no such tag for Black Lives Matter is particularly baffling given that BLM began on Facebook in 2013 and has dominated headlines ever since.

Facebook told The Intercept that the omission does not reflect its own stance toward the movement and blamed the lack of a Black Lives Matter targeting tag on the software that automatically generates the tags.

But BLM’s absence from Facebook’s targeting program looks all the more stark in the wake of high-profile revelations from a Facebook news curator who told Gizmodo that he and his colleagues had to “inject” Black Lives Matter into Facebook’s “Trending News” section because it was having a difficult time gaining traction. It also hearkens back to a controversial moment in 2014, when protests in Ferguson, Missouri, over the police killing of Michael Brown blanketed Twitter feeds. Facebook feeds were instead saturated with posts about the “ice bucket challenge,” a boisterous viral campaign to raise awareness of the neurodegenerative disease ALS. The success of that effort, which according to the ALS Association raised $115 million (nearly six times the group’s annual budget), shows what a favorable algorithm ranking can do for a campaign.

On the other hand, algorithms can also have a potentially crushing effect on social and political movements, which increasingly rely on social media and journalism to grow and sustain their support bases. By providing a tag to target a particular group, Facebook encourages the production of content for that group. And good reporting strengthens political movements and shapes public discourse.

Social media teams, including ours at The Intercept, use interest tags to promote their journalism and expand their reach. If that journalism isn’t reaching its intended audience, and a publication’s traffic reflects that, outlets are disincentivized from investing limited resources in covering a movement and the issues its followers care about.

According to Christian Sandvig, an associate professor at the University of Michigan who studies the cultural consequences of algorithmic systems, tags for groups like Black Lives Matter may be missing from the system because Facebook’s programmers wrote a “machine-learning” algorithm, based on artificial intelligence, that produces results even Facebook does not understand.

“Machine learning writes its own software. [It] writes its own rules,” he said. “The reason that a particular item or content is selected or not selected may not be recoverable.”
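To make Sandvig’s point concrete, consider a deliberately toy example, emphatically not Facebook’s system, built on invented data: a small classifier learns which topic descriptions should become interest tags, and the only “rules” it produces are numeric weights that no engineer wrote down and that don’t explain any individual inclusion or omission.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Invented training data: short topic descriptions and made-up
    # labels for whether each one became an interest tag.
    topics = [
        "water motorsports 1908 summer olympics",
        "fight for 15 labor movement",
        "cooking pasta recipe videos",
        "coffee party political movement",
    ]
    became_a_tag = [1, 1, 0, 1]

    vectorizer = CountVectorizer()
    features = vectorizer.fit_transform(topics)
    model = LogisticRegression().fit(features, became_a_tag)

    # The model's "rules" are just per-word weights. Nothing here
    # records why a given topic ends up included or excluded.
    for word, weight in zip(vectorizer.get_feature_names_out(),
                            model.coef_[0]):
        print("%s: %+.3f" % (word, weight))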

Lending credence to this theory, Facebook said there is no way to know with any certainty why any specific interest tag is included or missing from its list. “We’re committed to and working on improving our system to generate a more comprehensive list of interests that are relevant for people and useful for publishers,” said a Facebook spokesperson.

Companies that write software like the Preferred Audience system often act like “there’s no human intervention — as though writing software wasn’t human,” said Sandvig. “If you ran a business that did something like that you wouldn’t necessarily have this defense.”

Part of the problem may actually be that Facebook is eager to create pleasant user experiences. Its News Feed algorithm brings people content they’re expected to like and hides content that might make them unhappy. So if a lot of users block or hide posts related to Black Lives Matter because they find the violent or controversial nature of the issue objectionable, that can affect how visible other Black Lives Matter content is.
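As a rough illustration of that feedback loop, imagine a visibility score in which hides count against a topic. The weights below are invented for the sake of the example; Facebook has never published the actual signals or coefficients its News Feed ranking uses.

    from dataclasses import dataclass

    @dataclass
    class TopicStats:
        likes: int
        comments: int
        hides: int

    def visibility_score(t):
        # Positive engagement lifts a topic; hides push it down. These
        # weights are invented; Facebook's real signals are not public.
        return 1.0 * t.likes + 2.0 * t.comments - 5.0 * t.hides

    divisive = TopicStats(likes=900, comments=400, hides=350)
    feelgood = TopicStats(likes=700, comments=150, hides=5)

    print(visibility_score(divisive))   # -50.0: widely hidden, suppressed
    print(visibility_score(feelgood))   # 975.0: surfaces broadly

Under any scheme like this, a topic that provokes strong engagement but also frequent hides can end up less visible than bland content, which is one plausible mechanism behind the pattern described above.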

“Because we learn about the world through the social media algorithm, in the future we might be learning about a new kind of world,” said Sandvig. “One that reflects certain decisions — made by internet platforms — about what kind of mood they want us to be in or what feeling they want us to have while using them.”

“The computer and the user coproduce relevance,” said Sandvig. “You’re training the algorithm and it’s training you.”
