Facebook Brushed Off the U.N. Five Separate Times Over Calls for Murder of Human Rights Worker

A U.N. report said a post about the “Muslim ... national traitor” was shared over 1,000 times and that comments called for the person to be killed.

Facebook's corporate headquarters in Menlo Park, Calif., on March 21, 2018. Photo: Josh Edelson/AFP/Getty Images

Facebook’s total inability to keep itself from being a convenient tool for genocidal incitement in Myanmar has been well-covered, now a case study in how a company with such immense global power can so completely fail to use it for good. But a new report released this week by the United Nations fact-finding mission in Myanmar, where calls for the slaughter of Muslims have enjoyed all the convenience of a modern Facebook signal boost, makes clear just how unprepared the company was for its role in an ethnic massacre.

In a recent New Yorker profile, Facebook founder and CEO Mark Zuckerberg responded to his company’s role in the crisis, which the U.N. has described as “determining,” with all the urgency and guilt of a botched restaurant order: “I think, fundamentally, we’ve been slow at the same thing in a number of areas, because it’s actually the same problem. But, yeah, I think the situation in Myanmar is terrible.” Zuckerberg added that the company needs to “move from what is fundamentally a reactive model” when it comes to blocking content that’s fueled what the U.N. described last year as a “textbook example of ethnic cleansing.”

The new report reveals just how broken this “reactive model” truly is.

According to the 479-page document, and as flagged in a broader Guardian story this week, “the Mission itself experienced a slow and ineffective response from Facebook when it used the standard reporting mechanism to alert the company to a post targeting a human rights defender for his alleged cooperation with the Mission.” What follows is as clear-cut a violation of Facebook’s rules as one could imagine, met with the most abject failure to enforce them when it mattered most:

The post described the individual as a “national traitor”, consistently adding the adjective “Muslim”. It was shared and re-posted over 1,000 times. Numerous comments to the post explicitly called for the person to be killed, in unequivocal terms: “Beggar-dog species. As long as we are feeling sorry for them, our country is not at peace. These dogs need to be completely removed.” “If this animal is still around, find him and kill him. There needs to be government officials in NGOs.” “Wherever they are, Muslim animals don’t know to be faithful to the country.” “He is a Muslim. Muslims are dogs and need to be shot.” “Don’t leave him alive. Remove his whole race. Time is ticking.”

The Mission reported this post to Facebook on four occasions; in each instance the response received was that the post was examined but “doesn’t go against one of [Facebook’s] specific Community Standards”. The Mission subsequently sent a message to an official Facebook email account about the matter but did not receive a response. The post was finally removed several weeks later but only through the support of a contact at Facebook, not through the official channel. Several months later, however, the Mission found at least 16 re-posts of the original post still circulating on Facebook.

In the weeks and months after the post went online, the human rights defender received multiple death threats from Facebook users, warnings from neighbours, friends, taxi drivers and other contacts that they had seen his photo and the posts on Facebook, and strong suggestions that the post was an early warning. His family members were also threatened. The Mission has seen many similar cases where individuals, usually human rights defenders or journalists, become the target of an online hate campaign that incites or threatens violence.

This is a portrait of a system of rules — from a company that oversees the online life of roughly 2 billion people — that is completely broken, not merely flawed. Had someone at the U.N. mission not had a “contact at Facebook” who could help, it’s easy to imagine that the post in question would never have been taken down — not that it mattered, given that it was soon re-posted and shared with impunity. Facebook’s typical mea culpa asserts that the company regrets being “too slow” to curb these posts, when in fact it has done something worse by creating the illusion of meaningful rules in the first place.

It says everything about Facebook’s priorities that it would work so hard to penetrate poorer, “emerging” markets while creating conditions under which an “unequivocal” call to murder “Muslim animals” would be considered in compliance with its rules. The company, which reportedly had fewer than five Burmese-speaking moderators in 2015, now says it’s hiring a fleet of new contractors with language skills sufficient to field such reports, but Zuckerberg et al. have done little to convince the world that Facebook has learned anything from Myanmar. As usual, Facebook will slowly clean up this mess only after it’s been sufficiently yelled at.

