Facebook users, by and large, are not very good at differentiating between what’s fact and what’s false. Many users will eagerly share both reliable news and the fake stuff without any hesitation. It happens because users either want the falsehoods to be received as true or simply can’t tell the difference. Rampant media illiteracy is the root cause of the fake news handwringing we’ve been dealing with since before the election and will be fretting over until the end of time (or the end of Facebook, whichever comes first). Today, Facebook honcho Mark Zuckerberg said he is setting out to fix this fundamental problem of digital media illiteracy — by putting more power in the hands of the illiterate.
In a new Facebook post today, Zuckerberg said he “asked our product teams to make sure we prioritize news that is trustworthy, informative, and local.” Why this has only become a priority in the company’s 14th year of existence is left unsaid. Zuckerberg admitted that “there’s too much sensationalism, misinformation and polarization in the world today,” and that his website “enables people to spread information faster than ever before.” As with the rest of Silicon Valley, Facebook is obsessed with the appearance of machine-like objectivity, and so Zuckerberg said figuring out which outlets deliberately package viral-ready falsehoods and which do not is a head-scratcher (spoiler: It isn’t):
The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking.
So, rather than relying on the subjectivity and biases of a team of outside experts, Facebook will rely on the subjectivity and biases of 2 billion people around the world. Specifically, Facebook said it will decide which media outlets are prioritized at least in part by just asking people which outlets they like:
As part of our ongoing quality surveys, we will now ask people whether they’re familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly. (We eliminate from the sample those who aren’t familiar with a source, so the output is a ratio of those who trust the source to those who are familiar with it.)
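Mechanically, the score Zuckerberg describes is simple arithmetic: drop the respondents who have never heard of an outlet, then divide the number who trust it by the number who are familiar with it. Here's a minimal sketch of that ratio (the function name and data are hypothetical; Facebook has not published its actual methodology):

```python
# Hypothetical sketch of the trust ratio described in the announcement:
# discard respondents unfamiliar with a source, then compute
# (number who trust it) / (number familiar with it).

def trust_score(responses):
    """responses: list of (familiar, trusts) boolean pairs from a survey."""
    familiar = [r for r in responses if r[0]]
    if not familiar:
        return None  # nobody has heard of the outlet; no score possible
    trusters = sum(1 for _, trusts in familiar if trusts)
    return trusters / len(familiar)

# Example: 10 respondents; 4 have never heard of the outlet,
# and 4 of the 6 who have heard of it say they trust it.
survey = [(True, True)] * 4 + [(True, False)] * 2 + [(False, False)] * 4
print(trust_score(survey))  # 4/6, roughly 0.667
```

Note that under this scheme an obscure partisan site trusted only by its own devoted readership can still score highly, since everyone unfamiliar with it is excluded from the denominator.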
Facebook is either unaware of, or (more likely) unwilling to deal with, the fact that people have rabid, tribalistic loyalties to certain outlets. Someone who enjoys sharing, say, Daily Caller or InfoWars articles is, of course, going to say that these are trustworthy outlets. Otherwise, they’re admitting that they voluntarily consume and spread information that isn’t trustworthy, and we all think too highly of ourselves for that. According to a Facebook spokesperson, the surveys are meant to make sure “people can have more from their favorite sources and more from trusted sources.” Isn’t part of the Facebook information disaster that so many people count things like RedStateEagleMilitiaZoneDeepStateNews (or what have you) among their “favorite sources”? Should we be asking these people what’s trustworthy and what isn’t? Should they be deciding what will appear on your feed — or even their own — as reliable news?
Similarly, no one who posts five MSNBC articles every day is going to even consider giving Fox News a vote of trustworthiness. In fact, partisan news consumers will relish an opportunity to boost their side and downvote the bad guys, a cherished internet pastime. Rather than fix the enormous, world-spanning information morass they’ve created, Facebook is punting responsibility to its users (and, of course, the almighty Algorithm).
Details on how exactly these surveys will function are scant for now, though the Facebook spokesperson told me the new changes “are not intended to directly impact any specific groups of publishers based on their size or ideological leanings.” The spokesperson added, “We do not plan to release individual publishers’ trust scores because they represent an incomplete picture of how each story’s position in each person’s feed is determined.”
It’s also unclear how this would affect small or new outlets that have little to no name recognition precisely because they’re small or new. What is clear is that Facebook, in the new year, remains as reluctant as ever to do anything that will cause it institutional discomfort or provoke backlash from its right- and left-aligned users. So long as Facebook remains a corporate monolith with immense control over the entire worldwide media industry, this problem won’t go away.
At the outset of the year, Zuckerberg declared it his personal challenge to fix what’s broken at his company. Today, he said to everyone: Here, you deal with it.