Apparently fueled by anti-Semitism and the bogus narrative that outside forces are scheming to exterminate the white race, Robert Bowers murdered 11 Jewish congregants as they gathered inside their Pittsburgh synagogue, federal prosecutors allege. But despite long-running international efforts to debunk the idea of a “white genocide,” Facebook was still selling advertisers the ability to market to those with an interest in that myth just days after the bloodshed.
Earlier this week, The Intercept was able to select “white genocide conspiracy theory” as a pre-defined “detailed targeting” criterion on the social network to promote two articles to an interest group that Facebook pegged at 168,000 users and defined as “people who have expressed an interest or like pages related to White genocide conspiracy theory.” The paid promotion was approved by Facebook’s advertising wing. After we contacted the company for comment, Facebook promptly deleted the targeting category, apologized, and said it should have never existed in the first place.
Our reporting technique was the same as one used by the investigative news outlet ProPublica to report, just over one year ago, that in addition to soccer dads and Ariana Grande fans, “the world’s largest social network enabled advertisers to direct their pitches to the news feeds of almost 2,300 people who expressed interest in the topics of ‘Jew hater,’ ‘How to burn jews,’ or, ‘History of “why jews ruin the world.”’” The report exposed how little Facebook was doing to vet marketers, who pay the company to leverage personal information and inclinations in order to gain users’ attention — and who provide the foundation for its entire business model. At the time, ProPublica noted that Facebook “said it would explore ways to fix the problem, such as limiting the number of categories available or scrutinizing them before they are displayed to buyers.” Rob Leathern, a Facebook product manager, assured the public, “We know we have more work to do, so we’re also building new guardrails in our product and review processes to prevent other issues like this from happening in the future.”
Leathern’s “new guardrails” don’t seem to have prevented Facebook from manually approving our ad buy the same day it was submitted, despite its explicit labeling as “White Supremacy – Test.”
From the outside, it’s impossible to tell exactly how Facebook decides who among its 2 billion users might fit into the “white genocide” interest group or any other cohort available for “detailed targeting.” The company’s own documentation is very light on details, saying only that these groups are based on indicators like “Pages [users] engage with” or “Activities people engage in on and off Facebook related to things like their device usage, purchase behaviors or intents and travel preferences.” It remains entirely possible that some people lumped into the “white genocide conspiracy theory” fandom are not, in fact, true believers, but may have interacted with content critical of this myth, such as a news report, a fact check, or academic research on the topic.
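For readers unfamiliar with how ad buys like this are assembled, the “detailed targeting” described above boils down to a structured targeting specification that an advertiser attaches to an ad set. The sketch below is a minimal, hypothetical illustration of that shape only — the interest ID, name, and helper function are placeholders invented for this example, not Facebook’s actual values or code, and a real buy through the Marketing API would additionally require an access token, an ad account, and budget details:

```python
import json

# Hypothetical sketch of a "detailed targeting" spec of the kind an
# advertiser attaches to an ad set. The interest ID and name here are
# placeholders, not real values from Facebook's system.
def build_targeting_spec(interest_id: str, interest_name: str,
                         countries: list) -> dict:
    """Return a targeting spec scoped to one interest and a country list."""
    return {
        # Restrict delivery to users Facebook locates in these countries.
        "geo_locations": {"countries": countries},
        # The pre-defined interest category the ad should target.
        "flexible_spec": [
            {"interests": [{"id": interest_id, "name": interest_name}]},
        ],
    }

spec = build_targeting_spec("6003000000000", "Example interest", ["US"])
print(json.dumps(spec, indent=2))
```

The point of the sketch is how little the advertiser supplies: a category identifier and a geography. Everything about who actually falls into the category — the inferences from pages liked, off-platform activity, and purchase signals described above — happens inside Facebook’s systems and is invisible to the buyer.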
But there are some clues as to who exactly is counted among the 168,000. After selecting “white genocide conspiracy theory” as an ad target, Facebook provided “suggestions” of other, similar criteria, including interest in the far-right news outlets RedState and the Daily Caller — the latter of which, co-founded by right-wing commentator Tucker Carlson, has repeatedly been criticized for cozy connections to white nationalists and those sympathetic to them. Other suggested ad targets included mentions of South Africa; a common trope among advocates of the “white genocide” myth is the so-called plight of white South African farmers, who they falsely claim are being systematically murdered and pushed off their land. The South African hoax is often used as a cautionary tale for American racists — like, by all evidence, Robert Bowers, the Pittsburgh shooter — who fear a similar fate is in store for them, whether from an imagined global Jewish conspiracy or a migrant “caravan.” But the “white genocide” myth appears to have a global appeal, as well: About 157,000 of the accounts with the interest are outside of the U.S., concentrated in Africa and Asia, although it’s not clear how many of these might be bots.
A simple search of Facebook pages also makes plain that there are tens of thousands of users with a very earnest interest in “white genocide,” shown through the long list of groups with names like “Stop White South African Genocide,” “White Genocide Watch,” and “The last days of the white man.” Images with captions like “Don’t Be A Race Traitor” and “STOP WHITE GENOCIDE IN SOUTH AFRICA” are freely shared in such groups, providing a natural target for anyone who might want to pay to promote deliberately divisive and incendiary hate-based content.
A day after Facebook confirmed The Intercept’s “white genocide” ad buy, the company deleted the category and canceled the ads. Facebook spokesperson Joe Osborne provided The Intercept with the following statement, similar to the one he gave ProPublica over a year ago: “This targeting option has been removed, and we’ve taken down these ads. It’s against our advertising principles and never should have been in our system to begin with. We deeply apologize for this error.” Osborne added that the “white genocide conspiracy theory” category had been “generated through a mix of automated and human reviews, but any newly added interests are ultimately approved by people. We are ultimately responsible for the segments we make available in our systems.” Osborne also confirmed that the ad category had been used by marketers, but cited only “reasonable” ad buys targeting “white genocide” enthusiasts, such as news coverage.
Facebook draws a distinction between the hate-based categories ProPublica discovered, which were based on terms users entered into their own profiles, and the “white genocide conspiracy theory” category, which Facebook itself created via algorithm. The company says that it’s taken steps to make sure the former is no longer possible, although this clearly did nothing to deter the latter. Interestingly, Facebook said that technically the white genocide ad buy didn’t violate its ad policies, because it was based on a category Facebook itself created. However, this doesn’t square with the automated email The Intercept received a day after the ad buy was approved, informing us that “We have reviewed some of your ads more closely and have determined they don’t comply with our Advertising Policies.”
Still, the company conceded that such ad buys should have never been possible in the first place. Vice News and Business Insider also bought Facebook ads this week to make a different point about a related problem: that Facebook does not properly verify the identities of people who take out political ads. It’s unclear whether the “guardrails” Leathern spoke of a year ago will simply take more time to construct, or whether Facebook’s heavy reliance on algorithmic judgment careened right through them.