Nov 14 2017
Fact-checking on Facebook
Last year Facebook announced that it was partnering with several outside news organizations (the Associated Press, Snopes, ABC News, PolitiFact, and FactCheck.org) to fact-check popular news articles and then attach a warning label to disputed articles on Facebook. How is that effort working out?
According to a recent study, not so well. Yale researchers Rand and Pennycook found only tiny effects overall, and it’s possible the warning labels have a net negative effect. Some people just ignore the labels. Perhaps more significant, however, is that fake news articles the fact-checkers missed were more likely to be believed to be real because they lacked a warning label. The fact-checkers could not possibly keep up with all the fake news, so they were overwhelmed, and most of the dubious content not only made it through the filter but benefited from a false implication of legitimacy.
Further, the Guardian reports that this arrangement between Facebook and these news outlets compromises the ability of those outlets to act as a proper watchdog on Facebook itself. If their journalists are being paid by Facebook to fact-check, then they have a conflict of interest when reporting on how Facebook is doing. This conflict is exacerbated by the fact that news organizations are hard up for revenue, and could really use the extra income from Facebook.
So it seems that Facebook’s fact-checking efforts were insufficient to provide any real benefit, and may have even backfired. Warning labels on dubious news articles may be the wrong approach. It’s simply too easy to foil this protection by overwhelming the system. You could even deliberately flood Facebook with outrageously fake news stories to serve as chaff and provide cover for the propaganda you really want to get through. In the end the propaganda will be even more effective.
The inherent problem seems to stem from the difference between a pre-publication editorial filter and a post-publication filter. Traditional journalism has editors and standards, at least in theory, that require vetting and fact-checking prior to a story being published. Outlets had an incentive to provide quality control in order to protect their reputation.
Of course, tabloids also have a long history. They take a different strategy – abandoning any pretense of journalistic integrity and simply spreading outrageous rumors or fabricated “infotainment.” At least it was relatively easy to tell the difference between a mainstream news outlet and a tabloid rag, although there is more of a spectrum now, with the lines blurred in the middle.
No one would seriously claim that this system was perfect. News organizations had their editorial biases, and they had a lot of control over what became news. Those biases tended to average out over many outlets, however. The big concern was over consolidation in the media industry, giving too much power to too few corporations.
Social media has now upended this system. There is now, effectively, no pre-publication editorial filter. The infrastructure necessary to own and operate a news outlet is negligible, and social media creates a fairly level playing field. It is an interesting giant social experiment, and I don’t think we fully know the results.
What this means is that ideas spread through social media mostly according to their appeal, rather than due to executive decisions made by gatekeepers. There are still power brokers – people who have managed to build a popular site and have the ability to dramatically increase the spread of a particular news item. That, now, is the name of the game – clicks, followers, and likes. These translate into the power to spread the kinds of memes and news items that will generate still more clicks, followers, and likes.
The free-market incentive, therefore, is for click-bait, not necessarily vetted, quality news. Quality is still a factor, and will earn an article a certain number of clicks. My perception is that there are multiple layers of information on social media. There are subcultures that will promote and spread items that appeal to them. Items may appeal because they are high quality, or because they are genuinely entertaining. Or they may appeal because they cater to a particular echo chamber or ideology.
So, if you love science, you can find quality outlets for science news and analysis. Within these subcommunities, quality may actually be a benefit and the cream does rise to the top.
But sitting on top of these relatively small subcommunities is the massive general populace, which rewards memes and clickbait. That is the realm of fake news and cat videos – entertaining fluff and outrageous tabloid nonsense. This realm is also easily exploited by those with an agenda wishing to spread propaganda – click-bait with a purpose.
Facebook, as the major medium for this layer of fake news, now faces a dilemma. How can and should they deal with it? The outsourced fact-checking strategy is, if the recent study is accurate, a relative failure. So now what?
I feel we can do better than to just throw up our hands and let this new system play itself out. Sometimes market forces lead to short-term advantages but long-term doom. Can our democracy function without a well-informed electorate? Can our electorate be well-informed in an age of fake news? The entire situation is made worse by the fact that the very concept of fake news is used to further spread propaganda, to delegitimize actual journalism, and to dismiss any inconvenient facts.
Can we properly function as a society if we don’t at least have a shared understanding of reality, at least to the point that there are some basic facts we can agree on? Recent history does not fill me with confidence.
I don’t have the solution, but I do think that the large social media outlets should take the problem seriously and continue to experiment. Overall I think we need to find the proper balance between democratization of information, transparency, and quality control. Right now the balance has shifted all the way toward democratization, with a massive sacrifice of transparency and quality control. I don’t think this is sustainable.
There are, of course, things we can do as individuals – such as supporting serious journalism, and not spreading click-bait online. Everyone needs to be more skeptical, and to vet news items more carefully, especially before spreading them to others. But this is a band-aid. This is like addressing the obesity crisis by telling everyone to eat less and exercise.
We need systemic change. It’s an interesting problem, but there are certainly ways to at least improve the situation.